The Sancho Plan creative development

Thinking about future TSP creative development and collecting my thoughts…

In general I’d like:

  • an overall plan for future The Sancho Plan development that was more proactive and less reactive
  • or at least a collective and considered response to forthcoming opportunities – though we need to be clear about what these are and what might be required of us accordingly. I’d also like to clarify and agree what being a ‘collective’ – if we actually are one – means for us in terms of roles, delegation of responsibility and decision making processes…
  • or a plan for my own personal contribution to future TSP development that was more proactive and less reactive
  • some resources to fund my creative explorations

There are three creative ideas I’d particularly like to explore – my main concern being how they fit within the current TSP landscape…

cymatics and building a tonoscope – both physical and virtual

  • I’ve briefly discussed the idea with Anthony Rowe from squidsoup who is I think an ideal partner for this project and he’s very interested. I’ll outline our initial discussions here shortly.
  • Ultimately the aim would be to create a beautiful and musically responsive visualisation device and/or piece of programming that could be used in TSP live gigs and/or exhibitions – I’ve roughed out what a virtual version might look like below
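
To get my own head around the virtual side of this, here’s a very rough Python sketch of the kind of thing I mean: it takes a driving frequency and draws the nodal lines of a Chladni-style standing wave on an idealised square plate – the lines where sand would gather on a real tonoscope. The frequency-to-mode mapping is just a placeholder I’ve made up (a real plate’s modes depend on its material and dimensions), and this is exactly the sort of thing I’d want to work through properly with Anthony.

```python
# A very rough sketch of a virtual tonoscope: map a driving frequency to a
# Chladni-style standing-wave pattern on an idealised square plate and draw
# the nodal lines (where sand would gather on a physical plate).
# The frequency-to-mode mapping is a placeholder, not real plate physics.
import numpy as np
import matplotlib.pyplot as plt

GRID = 400
x = np.linspace(0.0, 1.0, GRID)
X, Y = np.meshgrid(x, x)

def chladni_pattern(m, n):
    """Standing-wave pattern for mode (m, n) on an idealised square plate."""
    return (np.cos(n * np.pi * X) * np.cos(m * np.pi * Y)
            - np.cos(m * np.pi * X) * np.cos(n * np.pi * Y))

def modes_for_frequency(freq_hz):
    """Placeholder mapping from frequency to mode numbers - a real plate's
    modes depend on its material, thickness and dimensions."""
    k = max(1, int(freq_hz // 110))   # crude bucketing, purely illustrative
    return k, k + 1

def draw(freq_hz):
    m, n = modes_for_frequency(freq_hz)
    z = chladni_pattern(m, n)
    nodes = np.abs(z) < 0.02          # sand collects where displacement is ~0
    plt.imshow(nodes, cmap="gray", origin="lower")
    plt.title(f"{freq_hz:.0f} Hz -> mode ({m}, {n})")
    plt.axis("off")
    plt.show()

if __name__ == "__main__":
    draw(440.0)   # e.g. a steady A4 sine tone driving the plate
```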

monome

  • I really want to get hold of one of these interfaces and play with it – and I’d be perfectly happy with a second-hand 40h. I think the potential for creative and expressive control of sound and visuals through this minimalist piece of hardware – itself a beautifully designed object – is fantastic. If only I had ~£250…
  • For live work the unit could be filmed from above, capturing the live performance of interaction with the interface – the 8×8 grid of backlit buttons being visually interesting in themselves – but more so since they’re also programmable.
  • What I think TSP could add is an augmented reality layer – overlaying animations triggered by the on/off states, patterns and sequences of the LEDs (there’s a rough sketch of how the triggering could be wired up at the end of this list)
  • I’ve suggested the idea of a monome R&D project to Gini Simpson at SPACE Studios and she’s supportive but has no resources to offer – so it will mean writing an ACE application myself… ho hum…
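
For the augmented reality overlay, here’s a rough Python sketch of how the triggering could be wired up, assuming the monome’s serial-to-OSC bridge (monomeserial or serialosc) is sending button presses as OSC messages. The /40h/press address and port 8000 are just what I’ve assumed here and would need matching to the actual configuration, and the animation call is a stand-in for whatever the visuals layer ends up being.

```python
# A rough sketch of the overlay triggering: listen for monome button presses
# arriving as OSC and hand them to whatever draws the animation layer.
# Assumes the serial-to-OSC bridge (monomeserial/serialosc) sends presses as
# "/40h/press x y state" on port 8000 - both are placeholders for the real config.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def trigger_animation(x, y, state):
    """Stand-in for the real visuals layer: start an animation cell when a
    button goes down, release it when the button comes back up."""
    action = "start" if state == 1 else "stop"
    print(f"{action} animation for pad ({x}, {y})")

def on_press(address, *args):
    x, y, state = args                # x, y, state (1 = pressed, 0 = released)
    trigger_animation(x, y, state)

dispatcher = Dispatcher()
dispatcher.map("/40h/press", on_press)   # adjust the prefix to match the setup

server = BlockingOSCUDPServer(("127.0.0.1", 8000), dispatcher)
print("listening for monome presses...")
server.serve_forever()
```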

a live MIDIfied band

  • I’ve been saying for ages that I’d like to get back to playing bass guitar… and MIDIfying it… and I reckon that to actually implement the idea so that it’s technically trouble-free and sounds good… as well as practising enough to get back up to speed after 18 months of not even owning a bass… let alone actually rehearsing enough together to sound tight as a live rhythm section… is a fairly major undertaking that will take a good 3-6 months to get right.
  • I also reckon we should be adding MIDI guitar (Edd?) and a slung keyboard and/or MIDIKat – and possibly even more unusual MIDI instruments such as woodwind (Edd again?) and even some of the more bizarre hybrid instruments we’ve come across. I also think all of these should be wireless – using these wireless MIDI devices
  • The potential to be able to play dub to punk to electroclash (or any style we choose) literally at the flip of a switch is very exciting… but will take time and practice to get right… (see the sketch after this list for a first pass at the switching)
  • I discussed this idea of a MIDIfied band with Christopher Lindinger, Research & Innovation Director at Ars Electronica Futurelab and he seemed genuinely enthused. And though he made it clear that Futurelab has limited R&D funds, he thought that a relatively modest project with tangible outcomes that was also of interest to the festival could be possible. We need to think about the idea more and draw up a proposal…
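
As a first pass at the ‘flip of a switch’ idea, here’s a rough Python sketch (using mido): one incoming footswitch message re-programs every sound source in the rig at once, so the whole band jumps from one style’s patches to another’s. The port names, controller number and program maps are all placeholders for whatever hardware and software we actually settle on.

```python
# A rough sketch of the 'flip of a switch' idea: one footswitch press
# re-programs every sound source in the rig, so the whole band jumps from
# one style's patches to another's. Port names, the controller number and
# the program maps are all placeholders.
import mido

STYLES = [
    {"name": "dub",          "programs": {"bass": 33, "guitar": 27, "keys": 17}},
    {"name": "punk",         "programs": {"bass": 34, "guitar": 30, "keys": 5}},
    {"name": "electroclash", "programs": {"bass": 38, "guitar": 81, "keys": 51}},
]
CHANNELS = {"bass": 0, "guitar": 1, "keys": 2}   # one MIDI channel per player
STYLE_SWITCH_CC = 80                             # the footswitch's controller number

def switch_style(out_port, style):
    """Send a program change to every instrument's sound source."""
    print(f"switching rig to: {style['name']}")
    for instrument, program in style["programs"].items():
        out_port.send(mido.Message("program_change",
                                   channel=CHANNELS[instrument],
                                   program=program))

with mido.open_input() as footswitch, mido.open_output() as rig:
    current = 0
    switch_style(rig, STYLES[current])           # start the set in the first style
    for msg in footswitch:
        if (msg.type == "control_change"
                and msg.control == STYLE_SWITCH_CC
                and msg.value > 0):              # switch pressed (not released)
            current = (current + 1) % len(STYLES)
            switch_style(rig, STYLES[current])
```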

General areas I think we need to move forward on…

interactive lighting

  • I chatted with Jason Bruges this week who is as ever happy to meet up to discuss ideas – he also mentioned a lighting engineer within his team who happens to be a drummer! This guy seems keen to meet us and discuss ideas…
  • I think we should be looking to develop lighting solutions for: drums, costumes, stage – and in that order…
  • Arduino and Bluetooth may well be an option – but I’d definitely bow to Jason’s advice here… and while I’m sort of interested in the electrical engineering side of things I’m not really that good with a soldering iron (if I were to attempt some electrical engineering of my own I’d prefer trying to extend the functionality of the monome 40h). I’ve sketched how the drum-to-light triggering might work below.
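
Before we get Jason’s advice, here’s a rough Python sketch of the drum-lighting end of things: it listens for MIDI notes from the drum triggers and forwards a tiny command over serial to an Arduino driving the lights (a Bluetooth serial link would look the same from this side). The serial port, note numbers and two-byte command format are all assumptions.

```python
# A rough sketch of the drum-lighting end: listen for MIDI notes from the
# drum triggers and forward a tiny two-byte command over serial to an Arduino
# that drives the lights (a Bluetooth serial link looks the same from here).
# The serial port, note numbers and command format are all placeholders.
import mido
import serial

ARDUINO = serial.Serial("/dev/ttyUSB0", 115200)

# Map drum pads (roughly General MIDI note numbers) to light channels.
PAD_TO_LIGHT = {36: 0, 38: 1, 42: 2, 49: 3}   # kick, snare, hi-hat, crash

with mido.open_input() as drums:
    for msg in drums:
        if msg.type == "note_on" and msg.velocity > 0 and msg.note in PAD_TO_LIGHT:
            light = PAD_TO_LIGHT[msg.note]
            brightness = min(255, msg.velocity * 2)   # scale velocity to 0-255
            ARDUINO.write(bytes([light, brightness]))
```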

narrative and character development

  • What is going on here? Has anything more come from Ranko or Maurice?
  • Personally I don’t think we can claim to create work that “explores the real-time interaction between music and video AND its potential for narrative and storytelling” as things are. We talk the talk… but don’t walk the walk…

As far as my focus on audio work goes I’d like to…

  • get back to audio production – select from and work up existing demos into more arranged and produced versions – Monkey Men being a prime example here – as a track it’s quite well developed and is ready for more audio production but it’s languishing for want of the animation work in Maya…
  • work up new ideas from my recent k610i recordings into Ableton
  • involve Olly V in track development – get him up and running in Ableton and feed him some of these new demo tracks to contribute to.

Other things to think about…

  • Ars Electronica installation
  • Arts Council England application
  • Martyn Ware/FOS
  • Submissions to festivals

Things I forgot to add initially:

trying to create a closing fourth section to Spacequatica with circles of synchronised swimming creatures

  • need latest Nuendo file from audio machine
  • MIDIRemote update from Adam
  • latest Flash & Director files and some info and instruction from Ed on using it

Ross Phillips’ The Replenishing Body interactive video installation for SHOWStudio, which he presented at the iDesign lunchtime talk

  • I can see real potential in this as a tool for performance – playing distorted versions of ourselves – I’ve met with Ross and have made the request
