In a few short years, the technologies found in today's mobile
devices—touch screens, gyroscopes, and voice-control software, to name a
few—have radically transformed how we access computers. To glimpse what
new ideas might have a similar impact in the next few years, you needed
only to walk into the Marriott Hotel in Cambridge, Massachusetts,
this week. There, researchers from around the world demonstrated new
ideas for computer interaction at the ACM Symposium on User Interface Software and Technology.
Many were focused on taking mobile devices in directions that today
feel strange and new but could before long be as normal as swiping the
screen of an iPhone or Android device.

"We see new hardware, like devices activated by tongue movement or
muscle-flexing, or prototypes that build on technology we already have
in our hands, like Kinect, Wii, or the sensors built into existing
phones," said Rob Miller, a professor at MIT's Computer Science and Artificial Intelligence Lab (CSAIL) and the chair of the conference.

One of the most eye-catching, and potentially promising, ideas on
show makes it possible to perform complex tasks with a flick of the
wrist or a snap of the fingers.