
Researchers in Canada have designed a family of prosthetic musical instruments, including an external spine and a touch-sensitive rib cage, that create music in response to body gestures.

The three instruments are a bending spine extension, a curved rib cage that fits around the waist, and a visor headset with touch and motion sensors.

Spine - attached to the back

Each instrument can be played in a traditional hand-held way, but can also be attached to the body, freeing a dancer to twist, spin and move to create sound. All three are lit from within using LEDs.

"The goal of the project was to develop instruments that are visually striking, utilise advanced sensing technologies, and are rugged enough for extensive use in performance," explained Malloch and Hattwick.

The researchers said that they wanted to create objects that are beautiful, functional and believable as instruments. "We wanted to move away from something that looked made by a person, because then it becomes less believable as a mysterious extension to the body," Hattwick told Dezeen.

"The interesting thing would be either that it looks organic or that it was made by some sort of imaginary futuristic machine. Or somewhere in between," he added.

Visor - worn on the head

The Rib and Visor are constructed from layers of laser-cut transparent acrylic and polycarbonate. "One of the layers uses a transparent conductive plastic film, patterned with the laser cutter to form touch-sensitive pads," said Hattwick.

The pads are connected to electronics via a thin wire that runs through the acrylic. Touch and motion sensors pick up body movements and radio transmitters are used to transmit the data to a computer that translates it into sound.

Rib - fitted around the waist

The Spine is made from laser-cut transparent acrylic vertebrae, threaded onto a transparent PVC hose in a truss-like structure. A thin and flexible length of PETG plastic slides through the vertebrae, allowing the entire structure to bend and twist. The rod is fixed at both ends of the instrument using custom-made 3D-printed components.

"We used 3D printing for a variety of purposes," Hattwick told Dezeen. "One of the primary uses was for solving mechanical problems. All of the instruments use a custom-designed 3D-printed mounting system, allowing the dancers to smoothly slot the instruments into their costumes."

Speaking about the future of wearable technology, Hattwick told Dezeen: "Technological devices should be made to accommodate the human body, not the other way around."

"Just as we've seen an explosion of DIY musical instruments and interactive art based on open-source electronics, perhaps we will see an explosion of DIY mechanical devices which create new ideas of how we use our body to interact with technology."

Here's a 15-minute documentary about the Instrumented Bodies project that features the instruments in action:

The team are now working to develop entirely 3D printed instruments and to radically re-imagine the forms that instruments can take.

Here's the full interview with PhD researchers Joseph Malloch and Ian Hattwick:

Kate Andrews: Why did you embark on this project? What was the motivation?

Ian Hattwick: This project began as a collaboration between members of our group in the IDMIL (specifically Joseph Malloch, Ian Hattwick, and Marlon Schumacher, supervised by Marcelo Wanderley), a composer (Sean Ferguson, also at McGill), and a choreographer (Isabelle Van Grimde).

In 2008 we worked with the same collaborators on a short piece for cello and dancer, which made use of a digital musical instrument we had already developed called the T-Stick. We decided to apply for a grant to support a longer collaboration for which we would develop instruments specifically for dancers but based loosely on the T-Stick.

Instrumented Bodies - digital prosthetics sketches

During the planning stages we decided to explore ideas of instrument as prosthesis, and to design instruments that could be played both as objects and as part of the body. We started by sketching and building rough prototypes out of foam and corrugated plastic, and attaching them to the dancers to see what sort of movement would be possible - and natural - while wearing the prostheses.

After settling on three basic types of object (Spine, Rib, and Visor) we started working on developing the sensing, exploring different materials and refining the design.

Kate Andrews: What materials are the spine, rib and visor made from?

Ian Hattwick: Each of the Ribs and the Visors is constructed from a solvent-welded sandwich of laser-cut transparent acrylic and polycarbonate. One of the layers uses a transparent conductive plastic film, patterned with the laser cutter to form touch-sensitive pads.

The pads are connected to the electronics in the base of the object using very thin wire, run through laser-etched grooves in the acrylic. The electronics in the base include a 3-axis accelerometer, a ZigBee radio transceiver, circuitry for capacitive touch sensing, and drivers for the embedded LEDs. Li-Ion batteries are used for power.

Each of the Spines is constructed from laser-cut transparent acrylic vertebrae threaded onto transparent PVC hose in a truss-like structure. One of the rails in the truss is a thin, very flexible length of PETG plastic that can slide through the holes in the vertebrae, allowing the entire structure to bend and twist. The PETG rod is fixed at both ends of the instrument using custom 3D-printed attachments.

For sensing, the Spines use inertial measurement units (IMUs) located at each end of the instrument - each a circuit-board including a 3-axis accelerometer, a 3-axis rate gyroscope, a 3-axis magnetometer, and a micro-controller running custom firmware to fuse the sensor data into a stable estimate of orientation using a complementary filter.
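The fusion step described here can be sketched in one dimension. This is a minimal illustration of the general complementary-filter technique, not the IDMIL firmware: it blends a gyroscope integration (smooth but prone to drift) with an accelerometer angle (noisy but drift-free) into one stable estimate.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend gyro integration (smooth, but drifts over time) with an
    accelerometer angle (noisy, but drift-free). alpha close to 1 trusts
    the gyro on short timescales and the accelerometer in the long run."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# A stationary sensor whose gyro wrongly reports a constant 0.01 rad/s:
angle = 0.0
for _ in range(1000):
    angle = complementary_filter(angle, gyro_rate=0.01,
                                 accel_angle=0.0, dt=0.01)
# Pure integration would have drifted to 0.1 rad over these 10 seconds;
# the filter holds the estimate near the accelerometer's reading instead.
```

The real Spines fuse three-axis accelerometer, gyroscope, and magnetometer data into full quaternion orientations, but the blending principle is the same.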

In this way we know the orientation of each end of the instrument (represented as quaternions), and we can interpolate between them to track or visualise the shape of the entire instrument (a video explaining the sensing can be watched on YouTube). Like the Ribs and Visors, the Spine uses a ZigBee radio transceiver for data communications and LiPoly batteries for power.
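Interpolating between two end orientations is typically done with quaternion slerp (spherical linear interpolation). A minimal sketch of the general technique, representing quaternions as (w, x, y, z) tuples; this illustrates the idea rather than reproducing the project's actual code:

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1,
    giving a constant-speed rotation from q0 (t=0) to q1 (t=1)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                       # flip to take the shorter arc
        q1 = tuple(-c for c in q1)
        dot = -dot
    dot = min(dot, 1.0)
    theta = math.acos(dot)              # angle between the quaternions
    if theta < 1e-6:                    # nearly identical: blend linearly
        return tuple((1 - t) * a + t * b for a, b in zip(q0, q1))
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

# Interpolate between the identity and a 90-degree rotation about z,
# sampling intermediate "vertebrae" along the way:
base = (1.0, 0.0, 0.0, 0.0)
tip = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
midpoints = [slerp(base, tip, t) for t in (0.25, 0.5, 0.75)]
```

Sampling slerp at many values of t between the two measured end orientations yields a plausible curve for the whole Spine, which is what makes visualising its shape from only two IMUs possible.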

All of the instruments use a custom-designed 3D-printed mounting system allowing the dancers to smoothly slot the instruments into their costumes.

A computer equipped with another ZigBee radio transceiver communicates with all of the active instruments and collects their sensor data. This data is processed further and then made available on the network for use in controlling media synthesis. We use an open-source, cross-platform software library called libmapper (a long term project of the IDMIL's - more info at www.libmapper.org) to make all of the sensor data discoverable by other applications and to support the task of "mapping" the sensor, instrument and gesture data to the parameters of media synthesisers.

The use of digital fabrication technologies allowed us to quickly iterate through variations of the prototypes. To start out, we used laser-cutters at the McGill University School of Architecture and a 3D printer located at the Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT). As we moved to production we outsourced some of the laser-cutting to a commercial company.

Kate Andrews: How did collaboration across disciplines of design, music and technology change and shape the project?

Ian Hattwick: From the very beginning of the project, the three artistic teams worked together to shape the final creations. In the first workshop, we brought non-functional prototypes of the instruments, and the dancers worked with them to find compelling gestures, while we tried a variety of shapes and forms and the composers thought about the kind of music the interaction of dancers and instruments suggested.

Later in the project, as we tried a variety of materials in the construction of the instruments, each new iteration would suggest new movements to the dancers and choreographer. Particularly, as we moved to clear acrylic for the basic material of the ribs, the instruments grew larger in order to have a greater visual impact, which suggested to the dancers the possibility of working with gestures both within and without the curve of the ribs.

These new gestures in turn required the ribs to have a specific size and curvature. Over time, the dancers gained an intimate knowledge of the forms of the instruments, which gave them the confidence to perform as if the instruments were actual extensions of their bodies.

Component tests

Kate Andrews: How was 3D printing used during the project - and why?

Ian Hattwick: We used 3D printing for a variety of purposes in this project. One of the primary uses was for solving mechanical problems - such as designing the mounting system for the instruments.

We tried to find prefabricated solutions for attaching the instruments to the costumes, but were unable to find anything that suited our purposes, so we designed and prototyped a series of clips and mounts to find the shapes that would be easy for the dancers to use, that would be durable, and that would fit our space constraints.

In addition, 3D printing quickly became a tool we used any time we needed a custom-shaped mechanical part. Examples include a threaded, removable collar for fixing the PETG rod to the Spine, and mounting collars and caps for the lighting in the Spine.

[A document detailing the use of 3D printing in the project can be downloaded here].

Instrumented Bodies - digital prosthetics sketches

Kate Andrews: Where do you see this technology being used now?

Ian Hattwick: 3D printing, or additive manufacturing as it is known in industry, is increasingly commonplace. In the research community, we've seen applications everywhere from microfluidic devices to creating variable acoustic spaces. One of my favourite applications is the creation of new homes for hermit crabs.

Kate Andrews: Can we expect to see other live performances using the instruments?

Ian Hattwick: We are currently working with the instruments ourselves to create new mappings and synthesis techniques, and in October we will be bringing them to Greece to take part in a 10-day experimental artist residency focusing on improvisation. We've also been talking with a variety of other collaborators in both dance and music, so we expect to have quite a few different performances in the next year.

Kate Andrews: What do you think is the future for interactive and wearable technology?

Ian Hattwick: I'm really excited about the coming generations of constantly worn health monitors, which represent the first widespread adoption of the ideas of the "quantified self" movement. I expect in a relatively short time it will be normal for people to maintain logs not just of their activity, heart rate, or sleep patterns, but also of the effect of their mood and environment on their body. I'm also excited about e-textiles, clothing which can change its shape or visual appearance.

One of the ways in which I see the prosthetic instruments making a real contribution is the idea that technological devices should be made to accommodate the human body, and not the other way around. In particular, musical instruments tend to be designed to be easy to mass-manufacture, rather than to identify and support natural physical expression during musical performance. At the same time, by creating technologies which are invisible to the performer we take away the physical interaction with an instrument which is so much a part of how we think about performance, both individually and in ensembles.

Kate Andrews: Does this present a new future for music? For dance?

Joseph Malloch: There is no one future for music or dance, but we can always count on new technologies being adapted for art, no matter their intended purpose.

Ian Hattwick: In interactive dance, the paradigm has always been capturing the unencumbered motion of the dancer; in music, there tends to be a fetishisation of the instrument. So in a sense, the idea of prosthetic instruments challenges the existing norms of those art forms. Certainly, using the prosthetic instruments requires a different conceptualisation of how we can perform dance and music at the same time.

The challenges of working with prosthetic instruments can be strongly appealing, however, and the level of mechanical sophistication which is provided by new generations of digital manufacturing will create opportunities for artistic exploration.

Just as we’ve seen an explosion of DIY musical instruments and interactive art based on open-source electronics, perhaps we will see an explosion of DIY mechanical devices which create new ideas of how we use our body to interact with technology.

Kate Andrews: What are you working on now?

Ian Hattwick: Documentation: We work in academia, and publication of in-depth documentation of our motivations, design choices, and insights gained throughout the process of development is an important part of the work. We are part of a much larger community of researchers exploring artistic uses for new technologies, and it is important that we share our experiences and results.

Mapping: The programmable connections between the gestures sensed by the instruments and the resulting sound/media really define the experiences of the performers and the audience. We are busy finding new voices and modes of performance for the prostheses.

Improvements to hardware and software: In particular, sensing technology advances very quickly, with price, quality, and miniaturisation constantly improving. There are already some new tools available now that we couldn't use three months ago.

3D printing musical instruments: We are talking with a 3D printer manufacturer about developing acoustic instruments which are entirely 3D printed, and which take advantage of the ability to manipulate object’s internal structure as well as radically re-imagining the forms which musical instruments can take.