Emote control

In an imposing Robert Adam townhouse on London's Fitzroy Square,
I'm frantically jumping, kicking the air, waving my arms and
intermittently cursing as I gyrate awkwardly before a crowd of
strangers. No, this is not some misguided attempt to attract a
bartender's attention at a private members' club: I'm facing a
television monitor, above which a video camera and an infrared
sensor are tracking movement at 48 points around my body, then
instantly converting my every twist and stretch into impressively
responsive motion by an on-screen replica of me. I initially feel
self-conscious as I chuck imaginary balls into the screen to
destroy a wall in a satisfyingly simple video game called Ricochet.
But within minutes I've forgotten that I ever used a Wiimote or a
joystick to interact with a game. My body has become my remote
control.

Welcome to Microsoft's next billion-dollar play: a radical new
approach to Xbox 360 gaming in which your movements, your voice, even your
facial expressions are the means by which you relate to the
on-screen action. The Xbox 360 Kinect, due to hit shops in time for
Christmas, heralds what the company calls "a new paradigm on how we
interact, entertain and play at home". As the project's ballsy
creative director, Kudo Tsunoda, puts it: "We've got the best new
controller in the history of gaming. The controller is you."

Now, Microsoft is notorious for unlocking unsurpassed levels of
hyperbole when it comes to promoting its game launches. So I'll be
wearing my hype-deflecting reality cloak when attending Kinect's
gaspingly over-choreographed "world premiere" at the E3 tradeshow
in LA later this month. Yet I'm in no doubt that something profound
is happening here with social and behavioural implications that go
way beyond video games. We are, quite simply, entering a new era of
sentient devices, with intelligent technologies responding reliably
to our physical movements, our spoken commands, even the brain
patterns that fire before we start to move. And that rewrites all
the rules
about how man relates to machine.

A bunch of trends are colliding to make this happen.
Body-mapping software has now reached that futuristic Minority
Report stage whereby a mere wave of your arm lets you zoom
into a map or change a TV channel. Face-recognition technology will
not only estimate your age and ethnic origin, it will also seek to
predict your mood. EEG electrodes are starting to let disabled
people control wheelchairs by thought. Then
there's voice recognition, which even today lets my Google Nexus
One phone transcribe my speech pretty accurately as I'm talking -
before translating whole sentences instantly into Chinese or
German and reading them aloud (think Stephen Hawking in a
Mandarin accent). When I visited Google's headquarters
in Mountain View, California, last summer, Sergey Brin told me over
lunch (now that's a phrase I've waited my whole career to write)
that within 20 years Google will have mastered artificial
intelligence to the extent that you won't know if you're talking to
a person or a 'bot.

Eradicating the game handset is just a small part of that
revolution - but one which starts to suggest how we'll be
interacting with all sorts of devices. "All interfaces will become
much more naturalistic and intuitive, and it's not just gaming,"
says Ashley Highfield, Microsoft's UK MD. "It's the ability to
control TV through gestures such as hand-waving, with the system
knowing that because there are two of you on the sofa you'll want
different EPGs [electronic programme guides]. Once a device
recognises you, puts you into the event, then everything changes.
It's pretty profound."

You can get a sense of what's already possible by searching
YouTube for a video demonstration of Milo, an interactive
on-screen character being developed for Kinect by British game
creator Lionhead Studios. Milo talks back
intelligently to a real-life woman chatting to him through the TV
screen; he even reads her facial expressions to detect her mood.
And although there's a certain online scepticism about the veracity
of the demo's real-time action, Peter Molyneux, the Lionhead
Studios boss whose team is leading much of the Kinect work, sees
the potential as vast. "It's not just motion control, it's vision,
it's recognising you, it's hearing you," he says. "For me as a
designer, I'm having to go back to school. One thing that will be
created by Kinect, and I'll bet money on it, is whole new genres of
games [not] defined by controllers."

So why does this matter if you don't play games? Because all
this tech is bridging the gap between our digital lives and the
real world - and in a way that needs no handset. Remember the
Minority Report scene in which Tom Cruise waves his hands
through the air to interrogate a computer database? Two MIT
students, Tony Hyun Kim and Nevada Sanchez, have just developed
augmented-reality gloves that let you navigate and manipulate a map
simply by waving your hands - at a cost of just £65. Nearby, at
MIT's Media Lab, researcher Pranav Mistry has bolted together a
webcam, a battery-powered projector and a mirror to track gestures.
Turn your fingers into a picture frame and the camera takes a
photo. Hold an airline boarding pass and you'll be directed to the
departure gate. Draw an "@" in the air and up flashes e-mail.

Mistry calls his device "SixthSense". Sure, that's one more than
humans are meant to have. But if it helps our other five senses
interact more effectively with the outside world, who wouldn't want
an upgrade?

David Rowan

David Rowan is GQ's technology columnist and editor of Wired.
"What obsesses me is how new technologies change our behaviour -
not the devices themselves, but what they mean for us as social
creatures, or storytellers, or consumers," he says.