Hive #2

Hive #2 is a research and development project exploring sonic ecologies and sound environments that respond to the viewer’s movements around a space and that grow and develop over time – either over a set time frame, such as a day, or over the duration of an exhibition. The intention is to base these ecologies on systems found in nature – ideally the behaviour of insect swarms, although that information has so far proved harder to come by than the already-modelled behaviour of herding animals and flocks of birds (Craig Reynolds’ Boids distributed behavioural model).
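For context, Reynolds steers each ‘boid’ with three local rules: separation (avoid crowding close neighbours), alignment (match the flock’s average heading) and cohesion (move towards the flock’s centre). A minimal sketch of those three rules in Python – the weights and radius here are illustrative values of mine, not anything from Hive #2:

```python
import math

def steer(boids, i, sep_radius=1.0, sep_w=1.5, ali_w=1.0, coh_w=1.0):
    """Steering force on boid i from Reynolds' three rules.
    boids is a list of (pos, vel) pairs; pos and vel are (x, y) tuples."""
    (px, py), (vx, vy) = boids[i]
    others = [b for j, b in enumerate(boids) if j != i]
    n = len(others)
    # Cohesion: steer towards the centre of mass of the other boids.
    cx = sum(p[0] for p, _ in others) / n - px
    cy = sum(p[1] for p, _ in others) / n - py
    # Alignment: steer towards the average velocity of the other boids.
    ax = sum(v[0] for _, v in others) / n - vx
    ay = sum(v[1] for _, v in others) / n - vy
    # Separation: steer away from boids closer than sep_radius.
    sx = sy = 0.0
    for (ox, oy), _ in others:
        d = math.hypot(px - ox, py - oy)
        if 0 < d < sep_radius:
            sx += (px - ox) / d
            sy += (py - oy) / d
    return (coh_w * cx + ali_w * ax + sep_w * sx,
            coh_w * cy + ali_w * ay + sep_w * sy)
```

Applied to every boid each frame, these three forces are enough to produce convincing flocking with no central choreography – which is what makes the model attractive as a basis for a self-organising sound ecology.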

The intention is for the work to be responsive to visitor behaviour – not in a way which foregrounds ‘interactivity’, but as a more subtle, biological response. Early plans were to use a series of sensors to record the movement of visitors within the space, but this has developed into using a ceiling-mounted video camera to track visitors’ movements and using that information to trigger behaviours within the sound environment.
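To give a sense of the kind of measurement a camera feed makes possible, here is a minimal frame-differencing sketch in plain Python (not the project’s actual Processing code): pixels whose brightness changes by more than a threshold between consecutive frames count as movement, and the overall fraction of changed pixels gives a simple activity level.

```python
def motion_amount(prev_frame, frame, threshold=30):
    """Fraction of pixels whose brightness changed by more than threshold.
    Frames are flat lists of grayscale values (0-255); in Processing these
    would come from the camera's pixel array."""
    changed = sum(1 for a, b in zip(prev_frame, frame) if abs(a - b) > threshold)
    return changed / len(frame)
```

A value like this, sampled every frame, is already enough to drive a sound environment’s behaviour – a still room reads near 0.0, a busy one nearer 1.0.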

The project utilises Cycling ’74’s Max/MSP (http://www.cycling74.com) – a graphical programming environment for audio, video and multimedia – as the central controller for the interactive installation. Processing is also used within the work, to monitor data gathered by the video camera and send information about movement within the space to Max/MSP. Processing is an open-source programming language and environment for people who want to program images, animation and interactions.

A key component of the project has been to expand my programming skills by working with mentors to develop the work. Early research and development was conducted with the help of North Wales-based programmer Ben Campbell, but I’ve also been working closely with Matt Jackson from the Creative Sound and Music course at the University of Wales, Newport.


In case it's of use to anyone, here's the Processing code I've been using to detect motion in a space using a webcam and to send that information to Max/MSP. The data is sent via Open Sound Control (OSC), so you'll need to have installed the OSC objects in Max before you start (they're available here: http://cnmat.berkeley.edu/downloads) – although Matt Jackson tells me you can just use Max's udpreceive object.
The code is largely based on code supplied by Matt Jackson - http://jacksonmatt.wordpress.com/
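For anyone curious what actually travels over the wire, an OSC message is just a padded address string, a type-tag string and big-endian arguments, sent over UDP. A minimal encoder in plain Python (rather than Processing’s oscP5), purely to illustrate the format – the `/motion` address below is a made-up example, not the address Hive #2 uses:

```python
import struct

def osc_message(address, *floats):
    """Encode an OSC message with float32 arguments, per the OSC 1.0 spec:
    a null-terminated address string padded to a multiple of 4 bytes, a
    type-tag string (',' plus one 'f' per argument, padded the same way),
    then the arguments as big-endian 32-bit floats."""
    def pad(s):
        b = s.encode('ascii') + b'\x00'
        return b + b'\x00' * (-len(b) % 4)
    data = pad(address) + pad(',' + 'f' * len(floats))
    for x in floats:
        data += struct.pack('>f', x)
    return data
```

Sending `osc_message('/motion', 0.5)` as a UDP packet is all an OSC client does; on the Max side, the CNMAT OSC objects (or udpreceive plus unpacking) turn it back into an address and a float.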
I've since moved on to use a 'blob' detection solution based on the OpenCV computer vision framework for Processing (http://ubaa.net/shared/processing/opencv/) instead of this code, the data still being sent to Max/MSP.
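For readers unfamiliar with the term, ‘blob’ detection groups adjacent changed pixels into connected regions, so you get discrete moving bodies (with positions and sizes) rather than one overall motion total. A rough Python illustration using flood fill on a binary motion mask – the OpenCV library does this, plus contour tracing, far more efficiently; this just shows the idea:

```python
def find_blobs(mask):
    """mask is a list of rows of 0/1 motion pixels. Returns one bounding
    box (min_row, min_col, max_row, max_col) per 4-connected blob."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Flood-fill one blob with an explicit stack,
                # growing its bounding box as we go.
                stack, box = [(r, c)], [r, c, r, c]
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    box = [min(box[0], y), min(box[1], x),
                           max(box[2], y), max(box[3], x)]
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append(tuple(box))
    return blobs
```

Each blob can then be reported to Max/MSP as its own OSC message, which is what makes this approach richer than a single motion value: individual visitors become individually trackable.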