Archive for the ‘Processing’ Category

My Arduino workings

Saturday, July 17th, 2010

I’ve been working away on a couple of projects recently – the Auduino synthesiser and DMX control of a Colour Kinetics iColor Flex SLX 50 node light string – and though I’m not quite ready to post about these yet I thought some of my interim research and workings worth noting…

[Image: DMX-mixer-2010-07-17-09-55.jpg]

Arduino to DMX

I found a recent Fritzing project – ARDUINO TO DMX CONVERTER – which uses the same RS485 chip as Daniel Hirschmann’s Super DMX Shield (I’d bought all the components for that from Farnell bar the 8 Pin DIP IC Socket – doh!), so although the Fritzing project is all PC based I thought it might be worth looking at in the interim.
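
In the meantime the PC side of such a converter is straightforward enough from Processing – something along these lines, assuming the Arduino sketch expects simple channel/value byte pairs over serial (the actual Fritzing project may well use a different protocol and baud rate):

import processing.serial.*;

Serial dmxPort;

void setup() {
  size(400, 100);
  println(Serial.list());                              // list the available serial ports
  dmxPort = new Serial(this, Serial.list()[0], 57600); // pick whichever port the Arduino is on
}

void draw() {
  // quick test: fade DMX channel 1 up and down with the mouse
  int value = int(map(mouseX, 0, width, 0, 255));
  sendChannel(1, value);
}

// assumed protocol: one byte for the channel, one byte for the value
void sendChannel(int channel, int value) {
  dmxPort.write(channel);
  dmxPort.write(value);
}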

(more…)

Processing – HUD and tweens with ijeomamotionlib

Sunday, March 21st, 2010

[Image: lightwavevirtual_hud_500px.jpg]

In recent posts I flagged up some possible next stage developments for my Light Wave Virtual sketch with OCD and ControlP5…

  1. “…control over the Perspective Changes & Linear Movements… could be useful to set up some oscillators to do this job…”
  2. “…[develop a] persistent HUD display showing a Monomatic ‘Live’ console with all the real-time input from our performance instruments and controllers displayed graphically etc…”
  3. “…Combinations [could be useful] – which could actually be an XY pad – as per the ‘ControlP5diyController.pde’ example…”
  4. “…make these a set of generic controls too – so that their outputs are switchable between cameras – perhaps by radio button?”

and I’ve generally had success – or stopped trying – with each of the above…
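
The persistent HUD (2) mostly comes down to resetting the camera and depth testing before drawing the 2D console over the 3D scene – something along these lines in plain Processing (a minimal sketch; in the real thing the ControlP5 console and performance-input graphics take the place of the text):

void setup() {
  size(640, 480, P3D);
  textFont(createFont("Arial", 14));
}

void draw() {
  background(0);

  // ...the 3D scene, through whichever camera is currently active...
  pushMatrix();
  translate(width/2, height/2);
  box(200);
  popMatrix();

  // then switch to a flat overlay for the HUD
  hint(DISABLE_DEPTH_TEST);  // so the console always draws on top of the scene
  camera();                  // reset to the default Processing camera
  noLights();
  fill(255);
  text("Monomatic 'Live' console - fps " + nf(frameRate, 0, 1), 10, 20);
  hint(ENABLE_DEPTH_TEST);   // restore depth testing before the next frame
}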

(more…)

Processing – More PeasyCam, OCD & ControlP5

Sunday, March 14th, 2010

I’ve now managed to copy camera and centre of interest (c.o.i.) positional values from PeasyCam into OCD (bar a slight perspective shift I’m not quite able to resolve yet)… and I’ve also managed to update the slider positions for each of the controlP5 controllers (the controlP5 secondary window has to be in focus to see the changes) and thus save these values to XML and recall them later… yay!

[Image: lightwavevirtual_controlp5_500px.jpg]

// copy PeasyCam's position across to the OCD camera values and the matching slider
// (CamP is the PeasyCam instance, controlP5 the ControlP5 instance)
float[] CamPposition = CamP.getPosition();
Cam1_cameraX = CamPposition[0];
controlP5.controller("Cam1_cameraX").setValue(Cam1_cameraX);

I’m now exploring the full range of OCD methods to see how the library allows you to – “manipulate individual cameras using standard camera movement commands” – and to see which might be useful additions to my ControlP5 live camera configuration.
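
For reference, this is the kind of thing I mean – a minimal OCD sketch, assuming the standard damkjer.ocd.Camera constructor and the movement methods as I remember them (the library documentation has the exact signatures):

import damkjer.ocd.*;

Camera cam1;

void setup() {
  size(640, 480, P3D);
  // camera position (x, y, z) followed by its centre of interest (x, y, z)
  cam1 = new Camera(this, 0, 0, 600, 0, 0, 0);
}

void draw() {
  background(0);
  cam1.circle(radians(0.5));  // orbit the c.o.i. a little each frame
  cam1.feed();                // hand this camera's view to the renderer
  box(200);
}

void keyPressed() {
  if (key == 'd') cam1.dolly(10);  // move towards/away from the c.o.i.
  if (key == 't') cam1.truck(10);  // slide the camera sideways
  if (key == 'b') cam1.boom(10);   // raise or lower the camera
}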

(more…)

A Personal Philosophy & Approach to Audiovisualisation

Sunday, March 7th, 2010

Wearing my Cybersonica hat I’ve started InProcess:ing – a regular monthly open forum for sharing creative coding and practice – in association with Madlab.

I gave a ‘show and tell’ presentation about one of my own Processing sketches at the launch event – though Kyle McDonald’s presentation on his 3D Scanning technique was the main attraction – and since it seems to have outlined my general philosophy and approach to audiovisualisation reasonably well I thought I’d include an edited transcript here.

There’s some fairly poor-quality video documentation on the Cybersonica Vimeo site… with useful links in the session output posting.

[Image: tome-wilkinson_light-wave_large.jpg]
[Image: lightwavevirtual_500px.jpg]

Spot the Difference: Tom Wilkinson’s Light Wave and Prodical’s Light Wave Virtual

(more…)

Processing – Viewport Cameras – PeasyCam & OCD conflicts?

Sunday, February 14th, 2010

I’d been having some issues mixing the PeasyCam and Obsessive Camera Display (OCD) libraries in the same Processing sketch – which I’ve now sort of resolved and learned a thing or two in the process…

[Image: PeasyCam_home_500px.jpg]

I found that switching between a PeasyCam and a non-configured OCD viewport camera feed made PeasyCam behave idiosyncratically – loss of mouse control, some offsetting of Z-axis distance (only on the first switch back from an OCD feed – it’s consistent from then on), and clipping of the scene – as if the farClip setting in OCD had likewise been applied to PeasyCam – even though clipping isn’t actually assignable within PeasyCam itself.

Turns out PeasyCam is now at v0.8.1 – I was running an older version – and the documentation now includes a number of methods I hadn’t noticed before.
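
The general shape of the workaround seems to be to disable PeasyCam’s mouse handling while an OCD camera is feeding the viewport, and to reset the perspective when switching back – a rough sketch of the idea only, using a PeasyCam instance CamP and an OCD Camera cam1 as in my other workings:

import peasy.*;
import damkjer.ocd.*;

PeasyCam CamP;
Camera cam1;
boolean usePeasy = true;

void setup() {
  size(640, 480, P3D);
  CamP = new PeasyCam(this, 600);               // PeasyCam orbiting the origin from 600 units away
  cam1 = new Camera(this, 0, 0, 600, 0, 0, 0);  // OCD camera in roughly the same place
}

void draw() {
  background(0);
  if (!usePeasy) cam1.feed();  // the OCD camera drives the viewport...
  box(200);                    // ...otherwise PeasyCam applies its own view automatically
}

void keyPressed() {
  if (key == 'c') {
    usePeasy = !usePeasy;
    CamP.setActive(usePeasy);  // stop PeasyCam fighting OCD for the mouse
    if (usePeasy) {
      // counter the clipping/offset carried over from the OCD feed
      perspective(PI / 3.0, float(width) / float(height), 1, 10000);
    }
  }
}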

(more…)

3D Scanning

Tuesday, January 5th, 2010

I’ve come across this really interesting 3D scanning technique before – but I’ve now spent a bit more time following the various links and exploring it in more detail…

[Image: ThreePhase-500px.jpg]

I’m keen to try it for myself and wonder if a ‘micro’ projector such as the Optoma Pico PK 101 DLP with an IPEVO Point 2 View USB webcam and/or a ‘tricked out’ PS3 Eye webcam could make a compact and affordable solution – and possibly one for live performance too?

There’s a good overview of the technique in the May 2008 post Turn a webcam and projector into a 3D scanner which also provides some Processing applets and code examples and lists useful references and links.
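
The core of the three-phase step itself is surprisingly compact: from three photos of the scene, each with the projected sine pattern shifted by 120°, a wrapped phase value can be recovered per pixel. A bare-bones sketch of just that step (the published applets also handle phase unwrapping and the conversion to depth):

// three captures of the same scene, the projected pattern shifted by 120 degrees each time
PImage phase1, phase2, phase3;
float[][] wrappedPhase;

void computeWrappedPhase() {
  wrappedPhase = new float[phase1.height][phase1.width];
  phase1.loadPixels();
  phase2.loadPixels();
  phase3.loadPixels();
  for (int y = 0; y < phase1.height; y++) {
    for (int x = 0; x < phase1.width; x++) {
      int i = y * phase1.width + x;
      float i1 = brightness(phase1.pixels[i]);
      float i2 = brightness(phase2.pixels[i]);
      float i3 = brightness(phase3.pixels[i]);
      // standard three-step phase-shifting formula, wrapped to the range -PI..PI
      wrappedPhase[y][x] = atan2(sqrt(3) * (i1 - i3), 2 * i2 - i1 - i3);
    }
  }
}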

(more…)

Problems with MIDI and Java after update to Java for Mac OS X 10.6 Update 1 on OS X 10.6.2?

Saturday, December 12th, 2009

Since the Apple software update to Java for Mac OS X 10.6 Update 1 I’ve been having problems running some of my Processing sketches – particularly those using MIDI – and Java applets such as SevenUpLive 1.4.

[Image: JavaPreferencesOSX10.6.x500px.jpg]

I’ve also been experiencing some memory access errors (e.g. “Invalid memory access of location 0x0 eip=0x917108d0”) with monoControl and some of my own sketches – which I think points to an issue between OS X, Java and RAM?

A googled forums.sun.com post and a referenced Processing forum post seem to indicate I’m not alone.
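
As a first diagnostic it’s worth checking whether Java can still see any MIDI devices at all – a quick sketch using the javax.sound.midi classes that ship with Java, nothing Processing-library specific:

import javax.sound.midi.*;

void setup() {
  MidiDevice.Info[] infos = MidiSystem.getMidiDeviceInfo();
  println(infos.length + " MIDI device(s) visible to Java:");
  for (int i = 0; i < infos.length; i++) {
    println("  " + infos[i].getName() + " - " + infos[i].getDescription());
  }
}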

(more…)

Importing 3D models into Processing

Sunday, November 22nd, 2009

This was a fairly convoluted journey… but I think I’ve managed to uncover a relatively simple method for importing 3D models into Processing via the “rapid prototyping” STL format – with help from Marius Watz of the excellent Code & Form: Computational Aesthetics blog.

[Image: unlekkerSTLReadWrite_lewis.jpg]

Marius has provided a really useful selection of libraries, code snippets and workshop example sketches at his website and at the Code and Form code repository and has now also released his own unlekkerLib library – “a collection of utility classes that I use frequently, and which I’ve attempted to clean up enough for other people to use. Currently, the most significant features are the STL class for exporting 3D geometry for rapid prototyping and the TileSaver class for outputting high resolution stills from OpenGL applications.”
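
The library takes care of the heavy lifting, but the ASCII flavour of STL is simple enough that a bare-bones importer only takes a few lines if you just want to see what the format contains (binary STL needs a real parser – that’s what the library is for):

ArrayList vertices = new ArrayList();  // one PVector per triangle vertex, in file order

void loadAsciiSTL(String filename) {
  String[] lines = loadStrings(filename);
  for (int i = 0; i < lines.length; i++) {
    String[] parts = splitTokens(trim(lines[i]));
    if (parts.length == 4 && parts[0].equals("vertex")) {
      vertices.add(new PVector(float(parts[1]), float(parts[2]), float(parts[3])));
    }
  }
}

void drawSTL() {
  beginShape(TRIANGLES);
  for (int i = 0; i < vertices.size(); i++) {
    PVector v = (PVector) vertices.get(i);
    vertex(v.x, v.y, v.z);
  }
  endShape();
}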

(more…)

A long overdue monome.org update

Sunday, November 1st, 2009

It’s been a while since I’ve spent a bit of time browsing and catching up with developments at the monome.org community site – though I’m always interested to see what’s been going on – and typically a few things caught my eye:

[Image: MonoControl-0.5.2-500px.jpg]

MonoControl – “an adaptable and flexible MIDI control application written in Processing for Ableton Live (or any other DAW) which, if configured correctly, should report all controller changes to MonoControl, which then adjusts the LEDs on the monome. The app uses the buttons of the far right-hand column of the monome to navigate through 8 different pages – and in each you can create combinations of faders, a crossfader, x-y faders, push, toggle and note buttons, and button matrices – all stored in an XML file.” (my somewhat edited version of the app description)…

This is exactly what I’ve been hoping to develop myself using Nick Rothwell’s shado… so I’ve started to integrate it into my Mirror2 real-time video processing Processing sketch and it’s working well…
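
Hooking the reported controller changes up at the sketch end is the easy part – here’s the shape of it using The MidiBus purely as an illustration (MonoControl itself may be built on a different MIDI library), mapping an incoming CC to a made-up Mirror2 parameter:

import themidibus.*;

MidiBus midi;
float mirrorBlend = 0;  // a made-up Mirror2 parameter, just for illustration

void setup() {
  size(640, 480);
  MidiBus.list();                   // print the available MIDI inputs and outputs
  midi = new MidiBus(this, 0, -1);  // listen on the first input, no output needed
}

// called by The MidiBus whenever a controller change arrives
void controllerChange(int channel, int number, int value) {
  if (number == 1) {  // e.g. the first fader on the current monome page
    mirrorBlend = map(value, 0, 127, 0, 1);
  }
}

void draw() {
  background(mirrorBlend * 255);  // stand-in for the real video processing
}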

(more…)

Processing – what next I wonder?

Sunday, June 7th, 2009

I was having a bit of an issue focussing this morning and couldn’t quite decide what to work on next. Hmmm… “So many ideas, so little time”…

[Image: motionlibpendel_500px]

An example from the motion library “Motion simplified” by Yonas Sandbaek

I spent some time in Processing looking at my Virtual Light Wave sketch – tidying up the code, e.g. switching from keyCode to key for keyboard presses (as advised in the Processing Reference), and inadvertently sorting out the issue of unwanted lines splaying out from the ‘vanishing point’ when zooming in too far by setting the farClip and nearClip values of the Camera class in the Obsessive Camera Direction (OCD) library – which actually does what it says on the tin and “allows intuitive control and creation of Processing viewport Cameras”.
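
For the record, the two changes look something like this – key for ordinary character presses (keyCode only for CODED keys such as the arrows), and explicit near/far clip planes so lines stop splaying out at extreme zooms. The clipping is shown here with the core perspective() call; in the sketch itself the equivalent values go into the OCD camera:

void setup() {
  size(640, 480, P3D);
  // push the far clip plane out so distant geometry isn't cut off, and keep the
  // near plane sensible so close-up lines don't splay out from the vanishing point
  perspective(PI / 3.0, float(width) / float(height), 1, 10000);
}

void draw() {
  background(0);
  translate(width/2, height/2);
  box(200);  // just something to look at while testing the keys and clipping
}

void keyPressed() {
  if (key == 'w') println("wireframe toggle");       // character keys: test key
  if (key == CODED && keyCode == UP) println("up");  // arrows etc.: key == CODED, then keyCode
}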

(more…)