SoundAffects: generative music on the streets

Last spring I was approached by the always-awesome Tellart to consult on a rather unusual generative music street installation, to be deployed in lower Manhattan for Parsons The New School for Design. There was going to be a long wall on 5th Ave rigged with cameras and sensors of all sorts, data visualization on the website, and a 24/7 streaming soundtrack. The only question was: what would it sound like?

The result was SoundAffects, and we’re all quite proud. I was able to bring in my fledgling Ruby-based generative music system for Ableton Live, Loom (still in pre-alpha development at the time), which grew into a larger, more elaborate Live-based system for dynamic, input-driven composition. And after brainstorming for hours about how weather changes and circadian rhythms affect our musical desires, my friend / Tellart employee Jasper Speicher and I had some pretty excellent times fleshing out those ideas on drums and keyboards in the studio.
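To give a flavor of what "input-driven composition" means in practice, here's a toy Ruby sketch. To be clear, this is not Loom's actual API or the installation's real mapping logic; the function name, inputs, and formulas are all hypothetical, just illustrating the general idea of turning ambient data into musical parameters.

```ruby
# Toy sketch (NOT Loom's real API): map ambient inputs to musical parameters.
# All names and formulas here are hypothetical illustrations.
def musical_params(temp_c:, precipitation:, hour:)
  # Warmer weather nudges the tempo up; rain slows everything down a bit.
  base_tempo = 70 + temp_c
  tempo = (base_tempo * (1.0 - 0.2 * precipitation)).round

  # Heavy precipitation pulls the tonality toward minor.
  mode = precipitation > 0.5 ? :minor : :major

  # Circadian curve: note density peaks mid-afternoon, thins out overnight.
  density = Math.sin((hour / 24.0) * Math::PI).round(2)

  { tempo: tempo, mode: mode, density: density }
end

# A rainy mid-afternoon in May:
p musical_params(temp_c: 18, precipitation: 0.8, hour: 15)
# => {:tempo=>74, :mode=>:minor, :density=>0.92}
```

A real system layers many more of these mappings, but the core move is the same: sensor and weather data in, compositional decisions out.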

In fact, you can see some of that process in the behind-the-scenes video, along with how we explored urban space through amateur surveillance, using cheap USB cameras and San Francisco’s Kearny St as a substitute for 5th Ave.

For more samples of the generated music, the SoundAffects site is still up, and has a really slick archive so you can listen back to highlights from the experiment's run in May. All I’ll say about NY last May is, well, I’m glad we factored in precipitation data!