Gesture computing is here! (And it's out of control!)

As the dust settles over Las Vegas, it's becoming clear that this year's International CES ushered in a new era of in-the-air gesture control.

TVs, tablets, phones and cars that you can control by waving, pointing and generally moving your hands in the air were demonstrated at CES.

As Microsoft proved with Kinect for Xbox 360, gesture control is a wonderful interface that really works. But there's one problem, possibly a fatal one for the acceptance of gesture control: There are no standard gestures.

Every new gesture-control device makes you learn a whole new "vocabulary" of hand motions.

The new world of gesture control

The list of companies offering the technology is long and growing:

Samsung at CES announced new TV sets with built-in cameras and microphones, along with an in-air gesture and voice control system called Smart Interaction. PointGrab, which has been making gesture-control interfaces for Samsung TVs, announced a new version of its technology at the show.

There's Leap Motion, which has been getting a lot of attention for technology that lets you control the user interface of a PC, laptop or other device with hand motions. The company's technology is open-ended and extensible, making it well suited to further customization by software makers.

Elliptic Labs showed off a user interface for Windows 8 PCs. It provides gesture control like many other products, but uses ultrasound instead of cameras to determine the location of your hands.

Spicebox at CES showed a really interesting hardware-software product called Mauz. You plug a dongle into your iPhone and install the Mauz app. It turns your phone into a mouse for your Mac. More than that, however, it enables you to control the Mac with Wii-like motion control, where the motion of the phone determines the on-screen control. And it also works like Kinect. By using the phone's camera, you can do in-the-air gestures over the phone, which controls on-screen action.

Asus introduced a product called the Asus Qube, a TV box for streaming Amazon and Netflix content to a TV. The Qube will reportedly support both gesture and voice commands.

Marvell announced a new platform for voice and gesture control of smart TVs, gaming systems and home automation systems.

European telecom Orange and a company called Movea are working together on a set-top box controlled by in-the-air gestures. The difference with this product, called the SmartMotion Server, is that it's got a remote control unit that operates like a Wii remote -- it's the motion of the remote in your hand, not your hand itself, that controls the TV.

LG demonstrated a similar concept -- with an interesting twist. The LG Magic Remote lets you change to a specific channel by writing the number in the air while holding the remote. The remote also accepts voice commands.

Even car makers are looking at motion control. At CES and the upcoming Detroit Auto Show, companies like Hyundai are rolling out concept cars that enable drivers to control the audio system by waving their hands.

And Intel is already working on the next generation of gesture control with its Perceptual Computing initiative. One feature allows you to control an interface by simply looking at it. The system tracks your gaze, and gives you more of whatever you're looking at.

Time to throw our hands in the air and give up?

All of these in-the-air gesture solutions are great in isolation. But when everything in our lives -- our computers, phones, tablets, TVs, cars and plumbing -- all get gesture control, and when they all use different gestures to do similar things, it's going to be frustrating, annoying and off-putting.

It's not just that the same action requires different gestures on different platforms; the opposite is also true: The same gesture will do different things on different platforms. For example, closing your fist will change the channel on one system, grab a window on another, and mute the sound on yet another. One gesture, three different actions.
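The conflict described above can be sketched as a pair of gesture-to-action lookup tables. Everything here is hypothetical -- the platform names, gestures and actions are invented for illustration and don't correspond to any real product's API:

```python
# Hypothetical gesture bindings illustrating the fragmentation problem.
# Platform names, gestures and actions are invented for this sketch.
BINDINGS = {
    "tv_a":     {"closed_fist": "change_channel", "swipe_left": "volume_down"},
    "pc_b":     {"closed_fist": "grab_window",    "swipe_left": "prev_page"},
    "stereo_c": {"closed_fist": "mute",           "swipe_left": "prev_track"},
}

def perform(platform: str, gesture: str) -> str:
    """Return the action a given platform maps a gesture to."""
    return BINDINGS[platform].get(gesture, "ignored")

# The same gesture triggers three different actions:
for platform in BINDINGS:
    print(platform, "->", perform(platform, "closed_fist"))
```

A user's muscle memory is, in effect, trained on one of these tables; move them to a device using another table and every habitual gesture fires the wrong action.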

Anyone familiar with one system will be frustrated and confused when their muscle memory causes the device they're using to do something unexpected.

What we really need is for the big players in the industry -- say, Microsoft, Samsung, Google, Apple and other heavyweights -- to get together in a standards body, as they have for technologies such as USB and Bluetooth, and agree on standard hand gestures and how they work.

They probably won't do it -- each major player will try to patent its gestures and prevent others from using them. But getting organized and coming to an agreement for the sake of users and the technology sure would be a nice gesture.

Mike Elgan writes about technology and tech culture. Contact and learn more about Mike at http://Google.me/+MikeElgan.

Copyright 2015 IDG Communications. ABN 14 001 592 650. All rights reserved.
Reproduction in whole or in part in any form or medium without express written permission of IDG Communications is prohibited.