SoundAffects [MaxMSP, Sound, Arduino]

On Fifth Avenue, a listening wall has jacks built in, so passersby can plug in earphones and listen to street activity translated into music. Sensors built into the wall detect movement, proximity, temperature, weather, cell phone activity, noise, color and light. Online and at night, a data visualizer translates the street activity and plays along with the music. The music, and the corresponding visuals and video, are available in near real-time through the browser or on your phone. The composition is navigable using the timeline, which is marked with the “experiments” scheduled throughout the week.

What would we learn if we changed the way we look at our cities? What if, instead of just looking at them, we could listen to them? SoundAffects was an experiential project by Parsons The New School for Design that turned everyday street phenomena, from weather and traffic to color and motion, into musical sounds of their own.

The team has also made the data available to the public: go here to download the CSV file if you would like to create your own compositions. The project was built with HTML5, JavaScript, Max/MSP/Jitter and Ableton Live. On the hardware side, Tellart rigged industrial sensors to an Arduino microcontroller and captured live video throughout the installation's ten-day run.
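As a rough sketch of how a composition from that CSV might start, the snippet below maps one column of sensor readings onto MIDI pitches in a pentatonic scale. The column name `noise_level` and the 0–1023 reading range (a 10-bit Arduino analog read) are assumptions for illustration, not details taken from the project's published data.

```python
import csv
import io

# C-major pentatonic pitches spanning two octaves from middle C (MIDI 60).
PENTATONIC = [60, 62, 64, 67, 69, 72, 74, 76, 79, 81]

def reading_to_note(value, lo=0, hi=1023):
    """Map a raw sensor reading onto the pentatonic scale above.

    The 0-1023 range matches a 10-bit Arduino analogRead, which is an
    assumption about the data, not a documented fact about the project.
    """
    value = max(lo, min(hi, value))  # clamp out-of-range readings
    index = int((value - lo) / (hi - lo) * (len(PENTATONIC) - 1))
    return PENTATONIC[index]

def notes_from_csv(text, column="noise_level"):
    """Turn one (hypothetical) column of the CSV into a note sequence."""
    rows = csv.DictReader(io.StringIO(text))
    return [reading_to_note(float(row[column])) for row in rows]

# Tiny stand-in for the real download:
sample = "noise_level\n0\n512\n1023\n"
print(notes_from_csv(sample))  # → [60, 69, 81]
```

From here the note list could be sent to a synth as MIDI, or loaded into Max/MSP or Ableton Live, which the team used for the installation itself.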