The Gestural and Audio Interactions for Mobile Environments (GAIME) project is looking at how subtle bodily gestures or actions could enable someone to interact with mobile devices while their hands are busy.

By the end of the three-year, EPSRC-funded project the team hopes to have developed a system that is truly ‘hands free’. The user won’t have to press buttons or look at a screen to access information, and the mobile device itself will relay information via 3D sound in a user’s headphones.

‘People can talk on the phone when walking in the street, dodging people — and it works well,’ explained Prof Stephen Brewster, principal investigator of GAIME. ‘But if they are texting they often walk slower, bump into people or just stop. We want to make interactions more like talking and less like texting.’

He explained that at the beginning of the project, gestures were tracked using accelerometers that are already found in products such as the iPhone and the Nokia N95. These accelerometers are currently used to detect the orientation of the phone, but Brewster believes more information could be extracted from them for other uses.
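In rough terms, the distinction Brewster describes — orientation versus deliberate movement — can be treated as a filtering problem. The sketch below is purely illustrative (the function names, smoothing factor and threshold are invented, not taken from the GAIME system): it low-pass filters raw accelerometer samples to estimate the gravity component, which tracks orientation, and treats the high-frequency residual as candidate gesture motion.

```python
# Illustrative only: separating deliberate gestures from device orientation
# in raw accelerometer samples. Names and thresholds are invented.

def split_gravity(samples, alpha=0.9):
    """Low-pass filter each axis to estimate gravity (orientation),
    leaving the high-frequency residual as candidate gesture motion."""
    gravity = list(samples[0])
    residuals = []
    for ax, ay, az in samples:
        gravity = [alpha * g + (1 - alpha) * a
                   for g, a in zip(gravity, (ax, ay, az))]
        residuals.append([a - g for a, g in zip((ax, ay, az), gravity)])
    return gravity, residuals

def is_gesture(samples, threshold=0.5):
    """Flag a window of samples as a gesture if the residual
    motion energy exceeds a threshold."""
    _, residuals = split_gravity(samples)
    energy = sum(x * x + y * y + z * z for x, y, z in residuals) / len(residuals)
    return energy > threshold
```

A phone lying still produces only the gravity vector, so the residual energy stays near zero; a shake or shrug produces a burst of residual motion that crosses the threshold.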

In addition to accelerometers, the team is also looking at adding more sensors to the user’s body, to expand the range of gestures that can be measured and used as input.

‘Eventually we want to have more sensors around the body,’ said Brewster. ‘With a headset you could accept or reject a call by nodding or shaking your head, or with other sensors you could even use a change in gait to control a device. We are also looking at putting something on the shoulder or in a shoe like the Nike Plus sensor used with the iPod.’

According to Brewster, the shoulder or wrist could have vast potential as sites on the body for controlling devices. By wearing a sensor on the shoulder it might be possible to change a track on a music player by simply shrugging.

The social aspect of gesture control is also being taken into account by studying how people react both to performing the gestures and to seeing them performed.

‘The key challenge is to make it socially acceptable,’ said Brewster. ‘We pretty much accept Bluetooth devices now and it’s common to see someone with a headset, so adding an extra sensor shouldn’t be too hard.

‘We need to be able to pick out the gestures from the “noise” of walking but we don’t want people to look too strange as they walk down the street. Obviously, being able to recognise big movements is easier but they are more embarrassing to do. We want a situation where nobody notices you are doing them. We don’t want a scene like Monty Python’s Ministry of Silly Walks.’
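One common way to pick a known gesture out of a periodic background such as walking — offered here as a general illustration, not as the project’s actual method — is template matching: correlate the incoming sensor signal against a stored recording of the gesture and fire only on a strong match.

```python
# Illustrative sketch: detecting a stored gesture template inside a noisy
# sensor stream via normalised correlation. Template and threshold are
# placeholders, not values from the GAIME project.
import math

def normalised_correlation(window, template):
    """Cosine similarity between a signal window and a gesture template."""
    dot = sum(w * t for w, t in zip(window, template))
    norm = (math.sqrt(sum(w * w for w in window))
            * math.sqrt(sum(t * t for t in template)))
    return dot / norm if norm else 0.0

def detect_gesture(signal, template, threshold=0.9):
    """Slide the template over the signal; report start indices of matches."""
    n = len(template)
    return [i for i in range(len(signal) - n + 1)
            if normalised_correlation(signal[i:i + n], template) > threshold]
```

Low-level walking “noise” correlates weakly with the template, so only the deliberate movement triggers a detection — which is why subtle but distinctive gestures are easier to separate than tiny ambiguous ones.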

The project is also looking to build on the team’s previous work on the Audioclouds project, which investigated the use of 3D sound as an output for devices. By creating an ‘audio space’ around the user, more auditory information can be displayed than is possible with standard mobile devices.

‘People are very good at listening to specific things by blocking other noises out,’ said Brewster. ‘It’s what we call the cocktail party effect. We are looking at having multiple devices playing at the same time that we can combine with displays or tactile output to give richer, more sophisticated interactions. At the same time we have to be careful we don’t overpower people with sound.’
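The ‘audio space’ idea can be illustrated with simple constant-power stereo panning, which places each sound source at its own azimuth around the listener. Real 3D audio systems use head-related transfer functions rather than plain panning, and the source names below are invented; the sketch only shows the spatial-layout principle.

```python
# Illustrative sketch of an "audio space": constant-power stereo panning
# spreads sources around the listener. Source names are invented; real
# 3D audio would use HRTFs rather than simple left/right gains.
import math

def pan_gains(azimuth_deg):
    """Left/right gains for a source at the given azimuth (-90 = hard left,
    +90 = hard right). Gains satisfy L^2 + R^2 = 1 (constant power)."""
    theta = math.radians((azimuth_deg + 90) / 2)  # map [-90, 90] to [0, 90]
    return math.cos(theta), math.sin(theta)

def layout(sources):
    """Spread named sources evenly across the frontal 180-degree arc."""
    n = len(sources)
    return {name: pan_gains(-90 + 180 * i / max(n - 1, 1))
            for i, name in enumerate(sources)}
```

With three sources, for instance, one sits hard left, one dead centre and one hard right, so a listener can attend to each stream by its position rather than having them pile up in the middle.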

To further increase safety, the researchers plan to use bone-conduction headphones, which deliver sound by vibrating the bones around the ear. This leaves the ears uncovered, allowing the user to stay alert to the surrounding environment.

In addition, Brewster is looking at how the device could identify the context in which it finds itself and rule out gestures or outputs that would be inappropriate.

If, for example, a person is laden down with bags they will not be able to gesture with their arms, and if they are running the sound output will be reduced to minimise distractions.
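That kind of context adaptation amounts to a small rule table. The contexts, gesture names and volume policy below are invented for illustration — the article does not describe the project’s actual rules — but they sketch the shape of the idea.

```python
# Illustrative sketch of context-aware adaptation: contexts, gesture names
# and the volume policy are invented, not taken from the GAIME project.

def adapt_interaction(context):
    """Return which input gestures stay enabled and how loud output should be,
    given a dict of boolean context flags."""
    gestures = {"nod", "shake", "shrug", "arm_swing"}
    volume = 1.0
    if context.get("hands_full"):   # e.g. laden down with shopping bags
        gestures -= {"arm_swing", "shrug"}
    if context.get("running"):      # reduce output to minimise distraction
        volume = 0.3
    return gestures, volume
```

A user carrying bags would keep head gestures such as nodding but lose arm and shoulder gestures, while a runner would hear quieter output.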