Gesture interfaces seem inevitable, but who thought we could get there so quickly?

It’s so rare that technology feels like magic. That first time you sifted through thousands of songs in your pocket, or used a touch-screen phone? Forgotten now. In our daily use of technology, the impossible is proven, again and again, to be entirely possible, and we grow numb to the magic around us.


I only mention this because the Leap certainly looks like one of those magical moments in technology. About the size of a pack of gum, the device sits on your desk and reads your gestures with 1/100-millimeter accuracy, which the company claims is 200 times more sensitive than anything on the market today. Powered by IR (and some breakthrough algorithms), it’s essentially a Kinect that fits in your pocket, projecting an invisible 8-cubic-foot field in which it tracks all 10 of your fingers. And the company promises it’s every bit as fast as the above video makes it appear.

The company’s use case metaphor? While a mouse might allow you to draw, the Leap enables you to shape clay.

To be honest, every bit of the story sounds too good to be true: from CTO David Holz, who is just 23 (but has consulted for NASA), to the ridiculously low $70 asking price (on preorder now for release in early 2013). And it probably would be, if the Leap hadn’t already been subjected to live demos for the press and the scrutiny of VC firms. Leap isn’t just a Kickstarter dream; it’s a working device that’s raised over $14 million in funding. At least a chunk of that is being invested in external development: Leap will give away between 10,000 and 20,000 development kits to ensure programmers build an ecosystem for the device. It’s a good sign that the company is already looking beyond its intriguing novelty factor to how Leap can carve out a place in our mass digital consciousness.

There’s a lot unsaid about the ergonomics of holding our arms in the air and gesturing all day, and about the fact that we still can’t touch the objects we’re manipulating. But when I look at Leap, I don’t think it has to be my primary input device to be groundbreaking. It’s small and cheap enough to supplement the other things I do, and it could theoretically fit into a phone as easily as a laptop.

And already, I can’t help but wonder, what could happen if we aimed Leap at our faces rather than our hands?