The interface consisted of one Launchpad grid controller and two iPads, all sending data to Ableton Live (music) and from there on to CoGe (video). In Ableton Live, that data triggered sounds and adjusted parameters and effects. Some of it was then forwarded to CoGe to adjust visual effects, and some was used to generate new data that went on to CoGe in turn. At the same time, Ableton Live was choosing which video to trigger based on probability equations. I know, it sounds complicated, but it worked.
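For the curious, the probability-based triggering idea can be sketched roughly like this. This is not the actual patch (that logic lived inside Ableton Live), and the clip names and weights here are made up for illustration:

```python
import random

# Hypothetical clip names and probabilities -- the real setup lived
# inside Ableton Live, so this only sketches the idea of weighted
# video triggering.
clips = ["nebula", "starfield", "hyperspace"]
weights = [0.5, 0.3, 0.2]  # weights sum to 1

def choose_clip(rng=random):
    """Pick which video clip to trigger, weighted by probability."""
    return rng.choices(clips, weights=weights, k=1)[0]

# Each incoming trigger (a pad press, say) picks a clip this way,
# so the visuals stay varied without anyone hand-picking every cue.
print(choose_clip())
```

The point of the weighting is that frequently-triggered clips dominate without the rare ones ever disappearing entirely.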

Mara Leader created all of the video content; Tim Hackett contributed to the music, designed and built the physical part of the interface, and made it look all pretty and Star Wars-y; and I did the music and programming. We designed the interface so that the crowd passing by could step in and run the show by intuitively playing the grid and touch devices.

This is a recording of a test of the system. The real thing looked kind of like this, only better.