4th–17th November 2008
Arrived at STEIM with a basic JunXion/LiSa system controlled via two Nintendo Wiimotes and a single Nunchuck, set up for live sampling from two stereo sources. One Wiimote is strapped to the headstock of my guitar, with accelerometer data mapped to playback volume, pitch, and fuzz parameters; the second dangles from the peg of my guitar strap, along with the connected Nunchuck, for ‘pick up and play’ activities. Scrub parameters are mapped to the Nunchuck joystick, with separate pitchbend and fuzz controlled by accelerometers, whilst buttons are used to trigger record and ‘region linked’ playback zones, as well as ‘state changes’ in JunXion. A couple of external pedals connected to my Boss GT-8, when combined with the ‘ctl’ switch on that unit, allow me to MIDI-trigger one extra record zone and two region-linked playback zones with my feet.
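The guitar-mounted mappings above amount to simple range scaling from raw accelerometer values to musical parameters. A minimal Python sketch of the idea (the actual mappings live in JunXion; the function names, parameter ranges and the 0–255 raw range are all illustrative assumptions, not JunXion internals):

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    value = max(in_lo, min(in_hi, value))
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def map_headstock_accel(ax, ay, az):
    """Map three raw accelerometer axes (hypothetical 0-255 range)
    to the three parameters named in the text: volume, pitch, fuzz."""
    return {
        "volume": scale(ax, 0, 255, 0.0, 1.0),    # playback level
        "pitch": scale(ay, 0, 255, -12.0, 12.0),  # semitone offset (assumed range)
        "fuzz": scale(az, 0, 255, 0.0, 1.0),      # distortion mix
    }
```

A tilt of the headstock then continuously reshapes all three parameters at once, which is what makes the accelerometer layer feel so immediate.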
Once I had reunited the neck of my guitar with its body (it was disassembled for travel), my residency began in earnest in an upstairs ‘atelier’, which, complete with natural daylight, offered an abundance of typically Dutch exercise (lots of stairs!) and proved to be a very productive working environment. Initially I developed a Max/MSP patch to run alongside LiSa/JunXion: the LiSa patch outputs three stereo audio channels via an ADAT cable into Max, and I began experimenting with granular software developed by Nathan Wolek to further process the audio. It quickly became difficult to negotiate multiple streams of audio, so I introduced a mixing-desk style architecture to blend signals more easily and choose what to granulate via auxiliary outputs. At this point I also started working with a demo of Max version five and rebuilt the patch much more neatly, encapsulating various portions to better display audio levels, routing status, etc.
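The mixing-desk architecture boils down to each source channel having a main fader plus a pre-fade auxiliary send that feeds the granulator. A rough Python sketch of that routing logic (the real version is a Max/MSP patch; the class and bus names here are invented for illustration):

```python
class Channel:
    """One mixer channel: a main fader and a pre-fade aux send."""

    def __init__(self, name, fader=1.0, aux_send=0.0):
        self.name = name
        self.fader = fader
        self.aux_send = aux_send

    def route(self, sample):
        """Return (main_out, aux_out) for one input sample.
        The aux send taps the signal BEFORE the fader ('prefade'),
        so the granulator feed is independent of the main level."""
        return sample * self.fader, sample * self.aux_send

def mix(channels, samples):
    """Sum channel outputs into a main bus and a granulator aux bus."""
    main = sum(c.route(s)[0] for c, s in zip(channels, samples))
    aux = sum(c.route(s)[1] for c, s in zip(channels, samples))
    return main, aux
```

The pre-fade choice matters: a channel can be pulled out of the main mix entirely while still feeding the granulator in the background.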
Bumped into Atau Tanaka and a very intricate Max patch; his new biosensor setup using the wireless Bluetooth I-CubeX looks very cool, and may yet prove to be an obvious way forward in building a more personal system permanently housed within the body of my guitar. But what inspired me most was Jorgen’s intricate use of velcro to hold his instrument together, so after a brief visit to the Albert Cuyp market to buy velcro I set about experimenting with appropriate placements for mounting the Wiimotes on my guitar.
The position of the Nunchuck joystick now allows me to scrub through playback buffers whilst still manipulating the guitar strings, and the horn-mounted Wiimote is conveniently accessible to both hands; this is particularly useful for recalling various preset patch setups, switching modes, etc. Previously the Wiimote/Nunchuck felt very separate from my guitar/footpedal combinations; everything is now more accessible and guitar-centric (especially the accelerometers), offering multiple layers of interaction, all simultaneously available. I can obviously still grab the Wiimote apparatus and perform with it entirely separate from the guitar, but when it is attached my attention is focused back much more into the guitar AS the ‘instrument’. Even if I want to modify a parameter unrelated to it (like a prerecorded sample in LiSa), manipulating this by adjusting the angle at which the guitar hangs from my body feels very intuitive. But how to make music with this? For the second half of my residency I moved into studio one (a much bigger soundproofed space), which allowed me to make a lot of noise 24 hours a day and investigate the recurring question of ‘how’ to perform and ‘play’ electronically mediated musics.
I now have plenty of options with which to work:
Live guitar/pedal sound
LiSa sample playback (both immediate/recent and/or from a prerecorded sample library)
Granularised output (fed ‘prefade’ in varying amounts by either guitar/pedals or LiSa)
These three strands can be output and recalled discretely or in various combinations; alternatively they can be selected via a random JunXion table mapped to accelerometer data, which means I never know for sure which component of the system will be exposed next, although big gestures do have a tendency to result in silence. This, as well as helping Bennett Hogg solder and drill holes in his violin, kept me busy right up to the ‘open studio’ presentation, where I played a short duo with Robert van Heumen and discussed our collaborative project Whistle Pig Saloon. It was good to test the system and talk with an engaged audience about some emerging themes: resistivity, confusion, sonic immersion, etc., with very inspiring presentations from all concerned. Sabine Vogel played a really beautiful flute/LiSa/Wiimote set, delicate and totally captivating. Talking after the performance, she said something along the lines of ‘what you did is exactly what I don’t want to do’!!! I think this was meant nicely, and she liked my ‘dancing’, so I took it as a compliment; very interesting to hear such different uses of our similar technologies. I also learned that connecting JunXion/LiSa/Max via the IAC MIDI bus, instead of directly to each other, makes the system much more stable (having previously had problems with Max assigning MIDI ports random numbers).
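The random-table behaviour can be caricatured as a weighted choice in which gesture energy shifts the odds toward silence. A hypothetical Python sketch, not JunXion’s actual table format (the strand names and weight values are made up for illustration):

```python
import random

# The three strands listed above, plus silence as a fourth outcome.
STRANDS = ["guitar/pedals", "lisa_playback", "granular", "silence"]

def choose_strand(accel_magnitude):
    """Pick which strand is exposed next.

    accel_magnitude is assumed normalised to [0, 1]; the weights are
    invented so that bigger gestures tilt the table toward silence,
    matching the behaviour described in the text.
    """
    m = max(0.0, min(1.0, accel_magnitude))
    weights = [1.0 - 0.5 * m,  # guitar/pedals
               1.0 - 0.5 * m,  # lisa_playback
               1.0 - 0.5 * m,  # granular
               2.0 * m]        # silence: weight grows with gesture size
    return random.choices(STRANDS, weights=weights, k=1)[0]
```

At rest (magnitude 0) silence has zero weight, while a full-sized gesture makes silence the single most likely outcome, which is exactly the kind of productive unpredictability described above.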
After this session, whilst learning and developing the system further, I added some high- and low-pass filtering to the granular section of the Max patch; when mapped to accelerometers this seems to make the system less messy, or at least easier to ‘learn’. I also made some recordings with ‘Tron Lennon’ which I hope will produce enough material for an album. In the final days of my residency I used the ‘pan 4s’ object to make a four-channel version of the patch, which uses Wiimote data for multichannel audio spatialisation between four loudspeakers. Then, inspired by Alex Nowitz’s vocal performance at ‘open studio’, I decided to try sampling my voice, and made some solo live recordings with electric guitar, pedals, voice, LiSa and Max/MSP (four-channel, not multi-track). A few edits later, ‘Sunday at STEIM’ is the result.
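For the four-channel spatialisation, one common approach (which may or may not be what ‘pan 4s’ does internally) is equal-power panning across a 2x2 speaker square. A speculative Python sketch, assuming the Wiimote tilt has already been normalised to x/y values in [-1, 1]:

```python
import math

def quad_gains(x, y):
    """Equal-power pan across four speakers FL, FR, RL, RR.

    x = left/right position, y = rear/front position, both in [-1, 1].
    Each axis is mapped to a quarter-turn angle so that sin/cos give
    constant total power wherever the source sits.
    """
    ax = (max(-1.0, min(1.0, x)) + 1.0) / 2.0 * math.pi / 2.0
    ay = (max(-1.0, min(1.0, y)) + 1.0) / 2.0 * math.pi / 2.0
    left, right = math.cos(ax), math.sin(ax)
    rear, front = math.cos(ay), math.sin(ay)  # y = +1 places the sound at the front
    return {
        "FL": front * left, "FR": front * right,
        "RL": rear * left, "RR": rear * right,
    }
```

At the centre position every speaker receives a gain of 0.5, and the sum of squared gains stays at 1 across the whole square, so the perceived loudness does not jump as the guitar tilts.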
mp3 can be streamed here:
Thanks to everyone at STEIM for this amazing opportunity.