Student Researchers at Sonification Lab Bring New Sound to Data

November 11, 2013

Students engrossed in research at Georgia Tech’s Sonification Lab tackle some serious challenges as they explore ways to create auditory interfaces that assist visually impaired people or improve traffic safety. Occasionally, though, they get the opportunity to apply that research in more unusual ways.

Doctoral candidates Jonathan Schuett, Jared Batterman and Vincent Martin collaborated to create an auditory-enhanced version of fantasy football, which allows vision-impaired fans, including Martin, to join the competition.

Another unusual test of the new auditory technology came with an assignment originating with the reggae-rock group Echo Movement. The students made good on the band’s request to convert the fluctuating brightness of stars into sound, which the band then incorporated into a song.

"It's the first time I've heard of a research lab getting credit in a CD liner note," says Associate Professor Bruce Walker, with a laugh.

Walker, who holds appointments in both psychology and interactive computing, directs the Sonification Lab, a joint effort between Georgia Tech’s School of Psychology and School of Interactive Computing.

At the lab, mixed teams of undergraduate and graduate students develop auditory and multimodal user interfaces through sonification, which is the use of sound to display and analyze scientific data. Lab researchers also address the cognitive, psychophysical and practical aspects of the technology.

According to Walker: "We'll have technically proficient programmer-type students working with psychology students who know about memory, cognition and perception, along with students in digital media, architecture, engineering, music and other fields. Over the course of the project they're expected, first of all, to learn how to talk to each other, and then learn some of the skills that the other team members possess."

At any given time, the lab may have up to 40 students divided among 10 project teams, with some students serving on more than one team. Those numbers may increase under an emerging plan to offer lab participation as a design studio class.

The auditory interfaces produced by the Sonification Lab target numerous potential applications in education, vehicles, electronic devices and complex task environments. They are particularly well-suited as an assistive technology for people with vision impairments.

Typically, blind students use conventional computers equipped with text-to-voice software that reads aloud text displayed on a screen. But this approach is of little use for working with certain visual representations, such as graphs.

The Sonification Lab is developing interactive software that allows blind students to render and manipulate graphs in one of two ways, according to Walker. One approach is to simply describe aloud the entire contents of the graph, like an extended caption. Another approach involves speaking the graph's text elements — numbers, title, letters and so forth — while the data points are represented by musical notes. Walker illustrates with an example of a simple daily outdoor temperature comparison graph.

"Let's say my data goes from an average daily temperature of 50 degrees to 70 degrees," he explains. "That's a 20-degree span. I can use 20 notes — about two octaves — to represent the data, so that each note going up the scale represents one degree of temperature. If the temperature goes up five degrees, I'll move up five notes."

The challenge comes when using large numbers that frequently change, like the temperature of the sun's surface. "It becomes trickier mapping from the data onto the musical notes, but we can still do it," Walker says.
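The mapping Walker describes can be sketched in a few lines of Python. This is an illustrative sketch, not lab software: the function name, the use of MIDI note numbers (60 is middle C) and the `max_span` parameter are assumptions made for the example. When the data range fits within the allotted notes, each unit of data moves the pitch one semitone, as in Walker's temperature example; larger ranges, like solar surface temperatures, are compressed linearly to fit.

```python
def sonify(data, base_note=60, max_span=24):
    """Map a series of data values to MIDI note numbers.

    If the data range fits within max_span semitones, one data unit
    equals one semitone (a 5-degree rise moves up 5 notes). Larger
    ranges are compressed linearly into max_span semitones.
    """
    lo, hi = min(data), max(data)
    data_span = hi - lo
    if data_span == 0:
        # Flat data: repeat the base note
        return [base_note] * len(data)
    # One semitone per unit when the range fits; otherwise compress
    scale = 1.0 if data_span <= max_span else max_span / data_span
    return [round(base_note + (v - lo) * scale) for v in data]

# Walker's example: daily temperatures spanning 50-70 degrees
print(sonify([50, 55, 60, 70, 65]))   # [60, 65, 70, 80, 75]

# A much larger range (e.g. solar temperatures) gets compressed
print(sonify([5000, 5500, 6000]))     # [60, 72, 84]
```

A real auditory display would also render these notes as audio and handle timing, but the core of the technique is just this linear data-to-pitch mapping.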

In addition, the software must be interactive so that blind students can pause playback or move back and forth through the data with a mouse, keyboard or other input device.

Graph sonification is just one thread of the lab's work, which spans several broader areas:

• Education. In addition to its utility for people with visual impairments, innovative auditory displays could enhance a number of learning activities both inside and outside the classroom. Potential applications include science centers, museums, galleries and aquariums.

• Vehicles. Auditory user interfaces in vehicles would enhance traffic safety by enabling drivers to operate secondary equipment — music players, radios, navigational devices, communications — without taking their attention from the road. Multimodal displays may also help novice, tired or frustrated drivers perform better, and help drivers with special disabilities (e.g., traumatic brain injury) to safely operate an automobile.

• Electronic devices. Enhancing visual displays with sound — or in some cases relying on sound alone — could significantly boost the use of a range of electronic devices, from cell phones to home appliances. Stand-alone projects in this category include a wearable, audio-only navigation device.

• Human-Computer Interaction. The main interest in HCI work at the lab is the development of non-traditional multimodal interfaces for highly specific, unusual or difficult task domains. Applications include submarine controls and displays, space station tasks, medical procedures and military systems.

• New science. Because sophisticated auditory displays and sonification are relatively new fields, researchers are investigating how the design of these technologies must conform to the cognitive and psychological factors that influence the ways in which people listen. Lab scientists are also interested in learning how to train listeners to use and interpret auditory displays.

The Sonification Lab’s interdisciplinary approach is hard-wired into its operational structure because, as Walker notes, "no one discipline is sufficient" for solving these complex problems.

"Creativity comes from diversity," he adds. "Creative solutions come from understanding a broad landscape of possible solutions. The more diverse your team is in terms of culture, academic background, skills, knowledge and interest, the more likely it is that you're going to find a durable solution."