Whole Body Computing

Desney Tan wants to make cyborgs. The senior researcher for Microsoft's Computational User Experiences group quips: "People often look at me and say, 'Did the Microsoft guy really just say that?'"

Tan's research focuses on the melding of man and machine, specifically using the entire human body as an input device. "We believe the human body is very evolved to be a 'high-bandwidth' device," he says. "And computers currently just don't take advantage of most of that bandwidth." Most people still interact with computers through the mouse and keyboard, which have been with us for nearly as long as the personal computer itself. In Tan's world, however, any body part is fair game. Fingers tapping against different areas of the body might make keyboard-like inputs through bioacoustic sensors. Brain wave sensors could determine if a user is concentrating deeply, and block distractions like incoming e-mails. Armbands could be wired to detect muscle movements, interpreting them as commands. Tongues could be used to control wheelchairs through special dental retainers, a potential boon to paraplegics.

Tan recognizes the difficulties of turning this vision into reality. He says that scanning the brain's electrical signals for useful information is a little like trying to figure out what is going on inside a computer by pressing a thermometer against its plastic case. But he is optimistic that his custom-programmed algorithms can amplify and clean up those signals. And regardless of the exact method employed, increasing the bandwidth between humans and their machines will bring greater richness and depth to the experience.
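To make the signal-cleanup idea concrete, here is a minimal illustrative sketch, not Tan's actual algorithms: it smooths a noisy one-dimensional sensor stream with a moving-average filter, then flags "activation" events by thresholding. The function names, window size, and threshold are all hypothetical choices for the example.

```python
def moving_average(samples, window=5):
    """Smooth a noisy signal; each output is the mean of the last `window` samples."""
    smoothed = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1):i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

def detect_events(samples, threshold):
    """Return indices where the smoothed signal first rises above the threshold."""
    events = []
    above = False
    for i, value in enumerate(samples):
        if value >= threshold and not above:
            events.append(i)   # rising edge: a new activation begins here
            above = True
        elif value < threshold:
            above = False
    return events

# A hypothetical noisy reading: a burst of "muscle activity" buried in jitter.
raw = [0.1, 0.0, 0.2, 0.1, 0.9, 1.1, 1.0, 0.8, 0.2, 0.1, 0.0]
clean = moving_average(raw, window=3)
print(detect_events(clean, threshold=0.6))  # → [5]
```

Real body-signal processing would use band-pass filters and trained classifiers rather than a fixed threshold, but the pipeline shape is the same: denoise first, then map the cleaned signal to a discrete command.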