For the last year, on and off, I’ve been collaborating with the visual artist Sharon Kelly. Sharon has a keen interest in running and she draws on this in her art. The collaboration fell out of a chance conversation we had about the possibility of fitting her up with some accelerometers while she was running and doing something visual with the data they generated afterwards.

The process we settled on involved me giving Sharon a Wiimote and a netbook. When Sharon went out for a run she would set the netbook up, put it in her backpack and carry the Wiimote like a baton in her hand. The netbook was running GlovePIE, which took in the Wiimote’s accelerometer data via Bluetooth and output it to a Processing sketch via OSC. The sketch just timestamped and recorded the raw data generated by Sharon’s hands while running. Here is what part of the raw data looks like in Excel.

Periodic data plotted
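The recording boiled down to timestamped rows of raw accelerometer values. As a minimal sketch of handling that kind of log (the column layout and value ranges here are assumptions, not the actual format the Processing sketch wrote):

```python
# Parse timestamped accelerometer rows of the assumed form
# "<millis>,<x>,<y>,<z>" into tuples ready for plotting or animating.
def parse_row(row):
    t, x, y, z = row.strip().split(",")
    return int(t), float(x), float(y), float(z)

rows = ["1200,0.12,-0.98,0.05", "1216,0.15,-0.95,0.02"]
samples = [parse_row(r) for r in rows]
print(samples[0])  # (1200, 0.12, -0.98, 0.05)
```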

I took this raw data and wrote a second Processing sketch that attempted to animate it in a style that complemented Sharon’s pencil drawing. It takes the accelerometer data, scales it and then animates it in 3D; as the pen width doesn’t vary with depth, it gives the impression of a 2D drawing. Here’s a video of it running.
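The scaling step is the same linear remapping Processing’s `map()` provides. A one-line version, with illustrative input/output ranges that are assumptions rather than the sketch’s actual values:

```python
# Linearly remap a raw accelerometer reading into screen space,
# equivalent to Processing's map() function. Ranges are illustrative.
def scale(value, in_min, in_max, out_min, out_max):
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

# e.g. a reading of 0.0 in [-1, 1] lands mid-way across an 800px canvas
print(scale(0.0, -1.0, 1.0, 0, 800))  # 400.0
```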

I showed this sketch running on Sharon’s iMac at her February show in the Crescent Arts Centre.

Sharon spent some time in the gallery with the sketch running and, as well as finding it quite hypnotic, became interested in using it as a source of inspiration for sketching out the forms she saw in it. Exploring this example of pareidolia became the focus of the next stage of the collaboration. We arrived at the idea of combing through the data for visually ‘interesting’ sections, which could then be used as inspiration for more abstract works.

This led me to rewrite the early Processing sketch in MaxMSP/Jitter and create a standalone visualiser I could hand over to her, so she could move through the recorded data, explore it and experiment with different scalings and projections. I tried to give it a stop-motion, pencil-drawn effect inspired by Sharon’s work. Here is an example of her accelerometer data being visualised by it.

The next stage of the project, which will be taking place in PS2 next week as part of the rehearsal rooms project, will involve projecting these data visualisations and making stop-motion videos with Sharon, inspired by the shapes and motion inherent in the original data. I met Sharon last week in her studio to work on the set-up; here are some early photos.

Projected visualisation and response sketch

Stop motion set up

Webcam taped to a charcoal and bamboo contraption.

The last photo is of an interesting contraption that came out of working together in the same room for the first time: a webcam taped to a piece of charcoal, which led to some interesting animations.

Charcoal cam animation one

Charcoal cam animation two

Charcoal cam animation three

Charcoal cam animation four

Charcoal cam animation five

What I’ve found interesting about the process is the theme of imitation: my visualisation attempting to imitate her work and, in the next stage, her drawings imitating mine, each working iteratively towards some middle ground between us. It’s been really great working together, an open-ended exploration with lots of back and forth. I’ll post materials from next week’s gallery time as they arise.

After meeting Peter, who runs Belfast’s PS2 gallery, to discuss showing some of my PhD pieces there, he asked me to do ‘something’ with their ping pong table for Belfast Culture Night. Naturally I thought to myself ‘I’ll piezo mic it and do bonk detection’, and use the players’ actions to drive some interactive sound-and-light mood-altering music machine. This boiled down to ‘I’ll stick an Arduino in it’ (this seems to be a pattern in my projects): driving some pretty LEDs that react to the ball hitting the table, and using a Max patch to control the whole shebang as well as putting out some pleasant reactive sounds.

All this led to me getting messy with some contact/piezo/transducer mics, the first time I’d used them, though I’ve seen them in numerous interactive projects as they’re pretty handy for simple bonk detection. I spent a week fooling about with them gaffa-taped to the underside of a plastic garden table, which was all I had to hand at the time. The outcome of this was a ground-breaking equation governing the relative loudness of ping pong balls on plastic as a function of distance, which proved rather less useful when I moved the contact mics onto Peter’s table.

Prototype garden table; at that point I’d given up and filled the table with synthesisers instead.

After I’d got reliable bonk data coming into Max I started on the lights. I had a limited budget, which I decided to blow almost entirely on ultrabright RGB LEDs, sourced quite reasonably from Rapid. I used a TLC5940 16-channel PWM current sink to control 5 groups of 3 RGB LEDs in series; I picked the TLC5940 because of Alex Leone’s well-written Arduino library for the chip. The LEDs were powered from a spare DC multi-adaptor supply I had lying around, and 12V sufficed (I find it best to avoid electrocuting the public wherever possible).
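Five RGB groups use 15 of the chip’s 16 channels, three per group. A sketch of the channel bookkeeping, assuming a simple sequential wiring order (the actual wiring may have differed):

```python
# Map an RGB group index (0-4) to its three PWM channels on a
# 16-channel driver. Sequential R, G, B wiring is an assumption.
def group_channels(group):
    base = group * 3
    return {"r": base, "g": base + 1, "b": base + 2}

print(group_channels(4))  # {'r': 12, 'g': 13, 'b': 14} -- channel 15 spare
```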

A word to the wise: if you get your own TLC5940 and you’re not careful to set the dot correction low (and you have no way of knowing what it gets scrambled to when you power up), you can easily sink enough current to trigger the built-in thermal protection, which turns the chip off until it cools down. I found a cheap heatsink that was wider than the chip, which I stuck on the back of it using the same thermally conductive double-sided sticky tape I later used to stick the LEDs to their heatsinks.

The only Arduino code I had to write was a simple serial library to convert commands from Max into commands for the TLC. I always like to write these kinds of things from scratch because I enjoy the challenge, and they never normally take very long. I generally base them on my understanding of MIDI (i.e. a command space for values above a certain number and a data space for values below, then using extra packets and bitshifts to send larger values if necessary).
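That MIDI-style split can be sketched as follows. The command byte, packet layout and 12-bit value size are assumptions standing in for whatever the actual library used:

```python
# A MIDI-style framing sketch: bytes >= 0x80 are commands, bytes
# below 0x80 are data, and a 12-bit PWM value is split across two
# 7-bit data bytes. SET_CHANNEL is a hypothetical command code.
SET_CHANNEL = 0x80

def encode(channel, value):
    # channel fits in one data byte; value (0-4095) sent MSB-first
    return bytes([SET_CHANNEL, channel & 0x7F, (value >> 7) & 0x7F, value & 0x7F])

def decode(packet):
    cmd, channel, hi, lo = packet
    return cmd, channel, (hi << 7) | lo

pkt = encode(3, 4095)
print(decode(pkt))  # (128, 3, 4095)
```

Because only command bytes have the top bit set, the receiver can resynchronise mid-stream by waiting for the next byte ≥ 0x80, which is the same property that makes raw MIDI robust.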

At that stage I had an array of bright lights whose intensity and colour I could control, and a method for getting data in. As Culture Night loomed I needed a way of sticking them safely to the underside of a ping pong table; luckily I’d been put in touch with the excellent guys at Farset Labs, who lent me the use of their glue gun and some old aluminium strips. I spent a very happy and very late night using some heat-conductive double-sided sticky tape (it was from Maplin and designed for sticking heatsinks on GPUs) to glue the LEDs to the bars and wire the whole thing up.

A couple of days before Culture Night I set up the table in PS2 with sensors and lights for some serious play testing and mapping design. In the end I tacked the aluminium strips to the underside of the table and used the ubiquitous gaffa tape to hold the cables in place. This is a video I took while getting the lights set up for the first time.

I’d initially had all sorts of ideas about how to use the sensor data to control Ableton Live, and even created a drum machine that kept tempo with the tapping of the ball back and forth. In the end I went for a more literal approach and decided to use the acoustic signal from the piezo mics for more than just bonk detection: to actually generate the sound itself. I achieved this by feeding it into banks of tuned resonators and a custom Reaktor patch I made years ago that does interesting things with interpolated delay lines. Each of the sensors fed its own effects chain, and I also mixed in some live signal from a microphone I hung above the table to pick up the natural acoustic sound of the ball and the audience; this was just fed through some EQ and delays. I think it ended up sounding like a mix between Basic Channel and Autechre, which is no bad thing in my opinion. This is a recording of a game that I made.
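A tuned resonator in this sense is just a two-pole filter that rings at a chosen pitch when excited by a transient. A minimal sketch of the general technique, not the actual Reaktor/Max patch; the sample rate, frequency and decay values are illustrative:

```python
import math

# A single tuned resonator as a two-pole IIR filter: the pole angle
# sets the pitch, the pole radius r sets how long it rings.
def resonator(signal, freq, fs=44100.0, r=0.999):
    w = 2 * math.pi * freq / fs
    a1, a2 = -2 * r * math.cos(w), r * r
    y1 = y2 = 0.0
    out = []
    for x in signal:
        y = x - a1 * y1 - a2 * y2
        out.append(y)
        y1, y2 = y, y1
    return out

# Excite with an impulse (a "bonk") and the filter rings at ~440 Hz
ringing = resonator([1.0] + [0.0] * 99, 440.0)
```

Running several of these in parallel, tuned to the notes of a chord, turns each ball strike into a pitched, slowly decaying tone coloured by the strike itself.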

The mapping worked using the location system I’d established earlier. The position of the ball strike along the length of the table controlled both the chords the resonator was programmed to play and the colour of the LEDs, such that the table had a red end and a blue end with the spectrum in between. Where the ball landed across the width of the table affected the panning of Live’s master output. I also used the peak amplitude of Live’s output to control the intensity of the lights as they flashed and faded after each ball strike; this was a really nice effect that tied the sound and light together. Here’s a video of the final installation on Culture Night.
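The colour mapping is a straightforward blend between the two end colours by normalised strike position. A sketch, with RGB values standing in for the actual LED colours:

```python
# Blend the table's end colours by normalised strike position:
# 0.0 = red end, 1.0 = blue end. Pure red/blue endpoints are an
# assumption standing in for the installation's actual palette.
def strike_colour(pos):
    pos = max(0.0, min(1.0, pos))  # clamp in case of noisy position data
    return (round(255 * (1 - pos)), 0, round(255 * pos))

print(strike_colour(0.0))  # (255, 0, 0)
print(strike_colour(1.0))  # (0, 0, 255)
```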

Looking at the piece technically, the whole thing ran on a combination of Live hosting all the Reaktor VSTs and processing the acoustic signal, while Max used the same signal to do bonk detection and control Live and the Arduino. If I’d had more time I’d have tried to squeeze the whole thing into a single M4L patch, but to be honest I find communication between instances of M4L patches to be pretty unpredictable timing-wise, so it might have to stay as two separate applications with OSC and MIDI doing the communicating.

Putting it in its artistic context, there are a whole host of interesting ping pong projects. Some of my favourites are Kings of Ping, Ping Tron and the spookily contemporary Noisy Table.

Pretty pleased with this one as it feels like an improvement both in terms of tune and mixdown. It’s based on the skit I posted here: Hangover acid medley. This is the first track I did on my Atari using Notator; normally I’m a Cubase 3.1 kind of guy, but I thought I’d see how the other half live (or should that be lived, given development stopped in ’93). Notator’s really good for track layout and development; the only problem is that the piano-roll note editor doesn’t display long notes very well, which makes programming long chordal stuff a bit tricky at times. I can see why people flicked between both.