Beyond Kinect: Gestural computer spells keyboard death

THE advent of multi-touch screens and novel gaming interfaces means the days of the traditional mouse and keyboard are well and truly numbered. With Humantenna and SoundWave, you won't even have to touch a computer to control it: gesturing in its direction will be enough.

These two technologies are the latest offerings from Microsoft, which gave us the Kinect controller. But the Kinect hardware looks clunky next to the Humantenna and SoundWave setups, which their inventors say could be built into a watch or laptop.

As the name suggests, Humantenna uses the human body as an antenna to pick up the electromagnetic fields - generated by power lines and electrical appliances - that fill indoor and outdoor spaces. Users wear a device that measures the signals picked up by the body and transmits them wirelessly to a computer. "It's just an electrode that measures voltage, digitises it and sends the signal for processing," says Desney Tan of Microsoft Research in Redmond, Washington.

By studying how the signal changes as users move through the electromagnetic fields, the team was able to identify gestures, such as a punching motion or swipe of the hand. In all, the researchers found that the technology could detect 12 gestures with over 90 per cent accuracy.
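The paper does not spell out the recognition pipeline, but the approach described above can be sketched as template matching: summarise each voltage recording as a small feature vector, then label a new gesture by the closest stored example. Everything here, from the per-bin energy features to the nearest-template rule, is a hypothetical illustration, not the team's published method.

```python
import numpy as np

def extract_features(signal, n_bins=8):
    """Summarise a voltage trace as a short feature vector.

    Hypothetical features: mean energy in each of n_bins time slices,
    a crude proxy for how the body-coupled signal evolves during a gesture.
    """
    bins = np.array_split(np.asarray(signal, dtype=float), n_bins)
    return np.array([np.mean(b ** 2) for b in bins])

def classify(signal, templates):
    """Return the label of the stored gesture whose feature vector
    is closest (Euclidean distance) to the new recording."""
    feats = extract_features(signal)
    return min(templates, key=lambda label: np.linalg.norm(feats - templates[label]))
```

In this sketch, `templates` maps gesture names to feature vectors computed from training recordings, which is why the first version of the system needed a training phase.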

One version of the system, presented this week at the Conference on Human Factors in Computing Systems in Austin, Texas, runs off a sensor that sits in a small bag. With training, that sensor can learn to recognise specific gestures. Another paper, under review, describes a version that relies on a much smaller wristwatch-sized sensor. Thanks to advances in processing techniques, this newer system needs no training to recognise the same 12 gestures.

The team was able to do away with training after realising that low-frequency components of the signal are similar, no matter which electrical objects are producing them. By focusing on these common patterns, the system can detect the same gesture even when it is performed in a different location with different electromagnetic fields. "That's a pretty big step," says Tan.
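The idea of discarding room-specific detail and keeping only the shared low-frequency content can be illustrated with a simple spectral filter. The cutoff frequency and sample rate below are assumptions for the sketch; the article gives no numbers.

```python
import numpy as np

def low_frequency_profile(signal, sample_rate=1000.0, cutoff_hz=60.0):
    """Keep only the low-frequency content of a body-antenna recording.

    Sketch of the idea in the article: components below an assumed
    cutoff look similar whatever electrical objects produced the field,
    so discarding everything above it gives location-independent input
    for gesture recognition.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[freqs > cutoff_hz] = 0.0  # drop room-specific high-frequency detail
    return np.fft.irfft(spectrum, n=len(signal))
```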

All sorts of applications would open up if Humantenna can be commercialised. The body could become a kind of universal remote control, and basic gestures such as pointing or swiping might be used to control lights, appliances and computers in the home. Fitness monitoring is another possibility, says Tan. We already have devices that can infer how hard a person is exercising by tracking step patterns, but Humantenna could provide a more holistic measure by monitoring whole body movements.

"It's a very cool idea," says Joseph LaViola, who studies user interfaces at the University of Central Florida in Orlando.

But LaViola says he is not sure how robust the system will be. Changes to local electromagnetic fields as devices are switched on and off might confuse Humantenna. The system might also struggle to differentiate between closely related gestures - something that Tan agrees will be a challenge. Although the technology can detect movements that cover about 5 to 10 centimetres, it will not be able to pick up smaller gestures like the wriggling of a finger.

Simple as it is, Humantenna still requires users to wear a sensor. But Tan's team, working in collaboration with researchers at the University of Washington in Seattle, has developed another gesture-recognition device that will need no new hardware.

SoundWave, which is also being presented in Austin, relies on an inaudible tone generated by a laptop loudspeaker. When a hand moves in front of the laptop, it shifts the frequency of the reflected sound - the Doppler effect - which the machine's microphone picks up. By matching characteristic frequency changes with specific hand movements, SoundWave can detect a handful of gestures with an accuracy of 90 per cent or more, even in noisy environments such as a cafeteria.
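The core measurement can be sketched in a few lines: find the spectral peak near the emitted pilot tone and read off how far it has shifted. The pilot frequency, search band and framing below are illustrative assumptions, not values from the SoundWave paper.

```python
import numpy as np

def doppler_shift(mic_samples, sample_rate=44100, pilot_hz=18000):
    """Estimate the shift, in Hz, of the dominant tone near the pilot
    frequency. A positive shift suggests motion towards the microphone,
    a negative one motion away. A sketch of the idea, not the published
    pipeline.
    """
    window = np.hanning(len(mic_samples))
    spectrum = np.abs(np.fft.rfft(mic_samples * window))
    freqs = np.fft.rfftfreq(len(mic_samples), d=1.0 / sample_rate)
    # Search only a narrow band around the pilot tone, so ordinary
    # audible noise (speech, cafeteria clatter) is ignored.
    band = (freqs > pilot_hz - 500) & (freqs < pilot_hz + 500)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak - pilot_hz
```

A real system would track these shifts over successive audio frames and map their sign, size and duration onto gestures such as a swipe.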

Interference caused by the tone bouncing off nearby objects will limit the ability to detect fine-grained motion. But the technology will still be able to translate coarse movements, such as a swipe, into commands. "I'd love to lean back and swipe to get the next page," says Tan. "Or to push a window out of the way by moving my hands." His team has already used SoundWave to control scrolling and to wake up a laptop when a user approaches it.

Laptops tend to come with built-in speakers and a microphone, so SoundWave could be rolled out as soon as the software is fine-tuned. "If you don't need extra hardware that's a big jump in terms of getting it to the masses," says LaViola.

SoundWave and Humantenna are steps towards a future in which we interact with computers everywhere we go, says Tan, without typing on a keyboard or clicking a mouse. Right now, both technologies respond only to fairly vigorous gestures, but later iterations will be tuned to react to gestures that are closer to those we use in everyday communication. "We want a universal way of interacting with computers, and gestures will be a big part of that," says Tan.

Shhh... touch lips to turn off phone

Imagine switching a phone to silent mode just by pressing a finger to your lips. This is one possible application of Touché, a system that turns any object, including the human body, into a touch interface.

Researchers at Disney Research in Pittsburgh, Pennsylvania, built Touché by sending a small current through everyday objects. A person touching the object changes the flow of electricity in a way that depends on the type of touch, be it a single fingertip or a firm grasp.

The system can also send low levels of current through the human body. The current changes when users clasp their hands or touch their faces. The signal generated could one day be used to control a phone or other electronic device.
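Touché probes an object at many frequencies at once, and different kinds of touch change the response at different frequencies. A toy version of that idea: record a response profile across a sweep of probe frequencies, then label a new measurement by the closest stored profile. The sweep range, the simulated front end and the touch labels are all hypothetical.

```python
import numpy as np

# Hypothetical sweep of probe frequencies (Hz) injected into the object.
SWEEP = np.linspace(1_000, 3_500_000, 50)

def sweep_response(front_end):
    """Sample the measured amplitude at each probe frequency.
    front_end is a stand-in for the real analogue sensing circuit."""
    return np.array([front_end(f) for f in SWEEP])

def classify_touch(response, profiles):
    """Return the stored touch type whose sweep profile is closest."""
    return min(profiles, key=lambda k: float(np.linalg.norm(response - profiles[k])))
```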
