Bacon. Impossible to eat while working on a touchscreen. It’s an awful, AWFUL problem that needs to be dealt with. Ya know, we’ve toyed with the idea of touchscreen interfaces for interacting with CAD data, but it’s always been a bit of a transitional solution to working in 3D. I don’t know why grabbing loose molecules of air seems a more convincing way to interact with 3D geometry, but it does, and 3Gear Systems is making it happen with Kinect-powered gestures and a new development kit to push it into 3D software.

3Gear adds gestures to your applications

3Gear has raised $350,000 in funding from K9 Ventures (Manu Kumar), Aditya Agarwal, Uj Ventures (Eric Chen), Safa Rashtchy, and Naval Ravikant. The three-man team, based in San Francisco, is made up of co-founders Robert Wang and Chris Twigg with founding engineer Kenrick Kin. They’re aiming the 3Gear Gesture Control SDK (available here) at developers in the CAD, medical, and gaming industries. In fact, they’re very much leading the charge by demonstrating how it applies to handling your 3D design data.

What’s the inspiration behind this new Gesture UI?
We want to capture the full expressiveness of your hand. The mouse is a 2D input device that treats your hand as if it’s one big pointing finger. The touchscreen lets you use two fingers to slide pictures around under glass. We’re creating technology that captures your entire hand and lets you grab things, turn things over, assemble things, animate things and more, all in a comfortable and ergonomic way.

What are the backgrounds of the 3Gear team?
All three of us come from computer science and human-computer interaction backgrounds. Rob did his PhD at MIT on tracking colorful things using computer vision. Chris got his PhD from Carnegie Mellon before working at Industrial Light & Magic R&D on visual effects. Kenrick got his PhD from Cal, but spent much of his time in grad school at Pixar Animation Studios, inventing ways to use multitouch screens to build rich 3D environments for computer-animated films.

What type of features and functions can people using 3D solid modeling software expect to see?
Our goal for something like SolidWorks is to let you “mate” parts by assembling them with your hands. Also, you should be able to spin a part or an assembly as if you’re holding it in your hands. We still have to get a lot of stuff right to get there, but releasing our technology via this software development kit (SDK) is our first major step.
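As a rough illustration of what “spinning a part as if you’re holding it” involves downstream of the hand tracker, here’s a minimal sketch. Everything in it is hypothetical: the function names, the frame format, and the pinch threshold are illustrative assumptions, not 3Gear’s actual SDK. The idea is simply that a pinch gesture “grabs” the part, and while the grab is held, the palm’s change in rotation is applied to the model.

```python
import math

# Hypothetical hand-tracking frames: thumb/index fingertip positions (metres)
# and a palm yaw angle (radians). These names are illustrative only and do
# not come from 3Gear's SDK.

PINCH_THRESHOLD = 0.03  # thumb-to-index distance that counts as a "grab"

def is_pinching(thumb_tip, index_tip):
    """Treat a small thumb-index fingertip distance as a pinch/grab."""
    return math.dist(thumb_tip, index_tip) < PINCH_THRESHOLD

def update_part_rotation(part_angle, prev_palm_yaw, palm_yaw, pinching):
    """While pinching, rotate the part by the palm's change in yaw."""
    if pinching:
        part_angle += palm_yaw - prev_palm_yaw
    return part_angle

# Simulated stream of frames: (thumb_tip, index_tip, palm_yaw)
frames = [
    ((0.0, 0.0, 0.0), (0.10, 0.0, 0.0), 0.0),  # open hand: no grab
    ((0.0, 0.0, 0.0), (0.02, 0.0, 0.0), 0.0),  # pinch starts
    ((0.0, 0.0, 0.0), (0.02, 0.0, 0.0), 0.5),  # pinched hand turns
    ((0.0, 0.0, 0.0), (0.12, 0.0, 0.0), 1.0),  # release: part stops turning
]

angle, prev_yaw = 0.0, 0.0
for thumb, index, yaw in frames:
    angle = update_part_rotation(angle, prev_yaw, yaw, is_pinching(thumb, index))
    prev_yaw = yaw
print(round(angle, 2))  # only the rotation made while pinched is picked up
```

In a real assembly-mating workflow there would be a full 3D hand pose rather than a single yaw angle, but the grab-then-track-relative-motion loop is the same shape.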

What modeling/rendering software will be supported?
Since we’re just three people right now, that’s going to depend on the developer community. We’d love to see this in solid modeling packages like SolidWorks and Inventor as well as surface modelers like SketchUp and Maya.

How do you see gesture UIs working with a mobile workforce?
Our current physical setup is a little clunky, but we expect our future hardware to be little more than a clip-on for your monitor or laptop. We want gestural user interfaces to be available on the go too.

The developer kit is available to download immediately on the 3Gear website. You can also find out more about the “finger-precise hand-tracking” technology and how they’re making it happen here.