Processing – what next I wonder?

I was having a bit of an issue focussing this morning and couldn’t quite decide what to work on next. Hmmm… “So many ideas, so little time”…

[Image: a pendulum example from the motion library “Motion simplified” by Yonas Sandbaek]

I spent some time in Processing looking at my Virtual Light Wave sketch, tidying up the code – e.g. switching from keyCode to key for keyboard presses, as advised in the Processing Reference. In the process I inadvertently sorted out the issue of unwanted lines splaying out from the ‘vanishing point’ when zooming in too far, by setting the nearClip and farClip values on the camera via the Obsessive Camera Direction (OCD) library – which actually does what it says on the tin and “allows intuitive control and creation of Processing viewport Cameras”.
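For reference, the gist of both changes looks something like this – a minimal sketch using only Processing’s built-in key variable and perspective()/camera() calls rather than OCD’s Camera class, with illustrative clip distances and zoom values:

    // Minimal sketch: key-based input plus explicit near/far clip planes.
    // Uses the built-in perspective() rather than the OCD Camera;
    // the clip distances and zoom step are illustrative values only.
    float zoom = 400;

    void setup() {
      size(800, 600, P3D);
    }

    void draw() {
      background(0);
      // Pushing the near clip plane out stops geometry splaying
      // from the vanishing point on extreme zooms.
      perspective(PI / 3.0, float(width) / float(height), 10, 5000);
      camera(0, 0, zoom, 0, 0, 0, 0, 1, 0);
      stroke(255);
      line(-100, 0, 0, 100, 0, 0);
    }

    void keyPressed() {
      // key (the typed character) rather than keyCode, per the Processing Reference
      if (key == '+') zoom -= 20;
      if (key == '-') zoom += 20;
    }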

I looked into building more solid ‘arms’ by using rect() instead of line() – but it’s a different shape function with different parameters – likewise quad() and box(). And trying to make the amount of movement responsive to input – such as audio through the Minim library – just doesn’t seem possible with my current implementation of Jack Perkin’s frame-rate-independent oscillator sketch (there’s a rough sketch of the idea below). I think this is ultimately required, but I’m going to have to go back to first principles to make it happen. So I spent a little time looking at openprocessing.org for inspiration and a starting point. I found some interesting sketches, as you’d expect, but nothing quite right for this – so I installed and started to play with the motion library “Motion simplified” by Yonas Sandbaek and can see the potential already.
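As a note to self, this is roughly what I mean by a frame-rate-independent oscillator driven by audio input – a minimal sketch of my own, not Jack Perkin’s code, assuming Minim’s getLineIn() and mix.level() for the amplitude:

    import ddf.minim.*;

    Minim minim;
    AudioInput in;

    void setup() {
      size(400, 400);
      minim = new Minim(this);
      in = minim.getLineIn(Minim.STEREO, 512);
    }

    void draw() {
      background(0);
      // Phase comes from millis(), so the oscillation speed stays the same
      // whatever the frame rate happens to be.
      float phase = TWO_PI * 0.5 * millis() / 1000.0;  // 0.5 Hz
      // Scale the swing by the current input level from Minim.
      float amplitude = 150 * in.mix.level();
      float x = width / 2 + amplitude * sin(phase);
      stroke(255);
      line(width / 2, 0, x, height);
    }

    void stop() {
      in.close();
      minim.stop();
      super.stop();
    }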

I did add a variable strokeWeight – which gives anything from a wireframe to a more solid look – and used OCD to set up five different camera feeds switchable on key press. I think this will be a good way to use the sketch for music visualisation: flipping between extreme close-up viewpoints and then slowly pulling back to reveal more detail and, ultimately, the actual virtual kinetic sculpture.
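The switching itself is simple enough – something like the following, sketched here with the built-in camera() call and an array of placeholder eye positions (with OCD you would keep an array of Camera objects and feed the current one each frame instead):

    // Five viewpoints, switched with the 1-5 keys; 'w' cycles the stroke weight.
    float[][] eyes = {
      {0, 0, 100}, {0, 0, 400}, {200, -100, 300}, {-300, 0, 600}, {0, -400, 800}
    };
    int current = 0;
    float strokeW = 1;

    void setup() {
      size(800, 600, P3D);
    }

    void draw() {
      background(0);
      float[] e = eyes[current];
      camera(e[0], e[1], e[2], 0, 0, 0, 0, 1, 0);
      stroke(255);
      strokeWeight(strokeW);
      noFill();
      box(100);  // stand-in for the actual sculpture geometry
    }

    void keyPressed() {
      if (key >= '1' && key <= '5') current = key - '1';
      if (key == 'w') strokeW = (strokeW % 5) + 1;  // cycle 1..5, wireframe to solid
    }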

I can actually see OCD providing a really useful way to control and cut between camera views in Processing. So I extrapolated Nick Rothwell’s implementation of ControlP5 for the lighting in our MonoScape visualisation sketch and successfully set up control of one of the cameras from a second sketch – implementing both a setupControls() function (Nick’s code to automatically build and lay out the sliders) and my own loadControls() function to load the configuration saved to XML. I tried to ‘tidy up’ the code in the sketch by moving the ControlP5 code and Nick’s Java AWT second-screen code into separate tabs… and then repeated this work in the default lighting sketch I stripped out of the MonoScape visualisation sketch.
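The shape of that controller sketch is roughly as follows – a rough outline rather than Nick’s actual code, with made-up parameter names, assuming the chained ControlP5 slider API and Processing’s loadXML() in place of ProXML:

    import controlP5.*;

    ControlP5 cp5;
    // Placeholder parameter names - the real sketch controls lights and a camera
    String[] params = { "lightX", "lightY", "lightZ", "camDistance" };

    void setup() {
      size(400, 300);
      cp5 = new ControlP5(this);
      setupControls();
      loadControls("controls.xml");
    }

    void draw() {
      background(40);
    }

    // Build and lay out one slider per parameter
    void setupControls() {
      for (int i = 0; i < params.length; i++) {
        cp5.addSlider(params[i])
           .setPosition(20, 20 + i * 30)
           .setRange(0, 500);
      }
    }

    // Restore saved values from an XML file of <control name="..." value="..."/> entries
    void loadControls(String path) {
      XML xml = loadXML(path);
      for (XML entry : xml.getChildren("control")) {
        cp5.getController(entry.getString("name"))
           .setValue(entry.getFloat("value"));
      }
    }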

I generally think it would be a good idea to build a default framework for our Processing sketches that has second-screen, lighting and monome control functionality implemented from the outset (and, with OCD, cameras too!).

I tidied up the MonoScape visualisation sketch – there was a lot of commented-out code, the development having been a bit of a last-minute rush – the main job being to extract the ProXML library code we’d used to save controller state, since ControlP5 actually has that built in. I also updated the SVG of the graphic monome buttons – making them a fraction smaller and adding a green (not sure about the colour) outline for a button-press state. I coded this into the sketch – creating a PShape itsPressed, defining it as a child of the SVG with itsPressed = itsShape.getChild(PRESS_PATH); and setting it to invisible with itsPressed.setVisible(false); – but I’ll wait for Nick to actually implement the button presses. I couldn’t get the code Nick wrote to display a graphic number card for each sound pressed running without slowing the sketch to a snail’s-pace frame rate – there must be something amiss – but I did convert the monomatic font to bitmap and loaded and displayed a number series as a design test in the sketch. Apart from it looking a bit rough round the edges (we may end up using PNGs), I think aesthetically it’ll work.
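For the record, the press-state mechanics look something like this – a minimal, self-contained sketch with a made-up SVG filename and child path ID, toggling the child’s visibility on mouse press as a stand-in for an actual monome message:

    // Toggle a "pressed" child of a button SVG on and off.
    // "button.svg" and "press-outline" are placeholder names for this example.
    PShape itsShape;
    PShape itsPressed;
    final String PRESS_PATH = "press-outline";

    void setup() {
      size(400, 400);
      itsShape = loadShape("button.svg");
      itsPressed = itsShape.getChild(PRESS_PATH);
      itsPressed.setVisible(false);  // hidden until a press comes in
    }

    void draw() {
      background(255);
      shape(itsShape, 100, 100);
    }

    void mousePressed() {
      itsPressed.setVisible(true);   // stand-in for a monome button-down message
    }

    void mouseReleased() {
      itsPressed.setVisible(false);
    }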

I also ran my old liquid-spill-damaged white MacBook all day with its replacement fan, logic board and inverter board, and thankfully it seems to be fine… touch wood veneer (I was genuinely worried I’d spent a lot of time and money trying to resurrect it – and had failed). All being well I’m intending for this to be my main production machine for music and video – since it actually has a FireWire port 😉
