I desperately want it to work, don't you? Just like Minority Report. You wave your hands and your computer interface moves effortlessly.

Frankly, let's forget all that. I'll lower my expectations WAY WAY WAY down. I'd just like to wave my hand left and right and have the system move a window from one of my three monitors to another. Seems reasonable?

Really? I find them both equally bad. 1/100th of a millimeter? That's lovely but it makes for an extremely hyperbolic and twitchy experience. I have no doubt it's super accurate. I have no doubt that it can see the baby hairs on my pinky finger - I get it, it's sensitive. However, it's apparently so sensitive that the software and applications that have been written for it don't know how to tell what's a gesture and what's a normal twitch.

My gut says that this is a software and SDK maturity thing and that the Leap Motion folks know this. In the two weeks I've had this device it's updated the software AND device firmware at LEAST three times. This is a good thing.

Perhaps we need to wear gloves with dots on them like Tom Cruise here. When you hold your fingers together and your thumb in, Leap Motion sees one giant finger. Digits appear and disappear, so you are told to keep your fingers spread out if you can. This becomes a problem if your palm is turned perpendicular to the device. Since Leap Motion only sees up from its position on your desk, it can't exactly tell the difference between a palm down with fingers in and a hand on its side. It tries, but it's right about 80% of the time by my reckoning. That may sound great, except that the other 20% of the time it's completely insane.

I also found that wearing my watch confused the device into thinking I had a third hand. I'm not sure if it's glints off the metal of the watch, but I had to take it off.

To be really clear, I totally respect the engineering here and I have no doubt these folks are smarter than all of us. Sure, it's super cool to wave your hand above a Leap Motion and go "whoa, that's my hand." But that's the most fun you'll have with a Leap Motion, today.

There is an excellent diagnostics system that will even warn you of fingerprints. You'll be impressed too, the first time you get a "smudge detected" warning.

The software is impressive and organized, but on the downside, the Leap Motion Service takes up as much as 6-7% of my CPU when it sees something near it. That's a lot of overhead, in my opinion.

The software that I WANT to work is called "Touchless for Windows." It's launched from the Airspace store, a Leap Motion-specific store that collects all the apps that use the Leap Motion.

Having a store was a particularly inspired move on their part. Rather than having to hunt around the web for Leap Motion-compatible apps, they're all just in their "store."

The TouchLess app bisects the space above the Leap Motion such that if you're in front of the device you're moving the mouse, and if you've moved through the invisible plane then you're touching the "screen." Pointing and clicking is a challenge, to say the least.
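For the curious, the hover/touch split can be sketched in a couple of lines. This is my guess at the idea, not Leap Motion's actual code; the plane position and units are invented:

```python
# Sketch of TouchLess-style "touch plane" logic (an assumption about the
# idea, not the real implementation). Assumes the tracker reports a
# fingertip z coordinate in millimeters, where lower z means the finger
# has pushed farther toward the screen; the plane position is arbitrary.

TOUCH_PLANE_Z = 0.0  # the invisible plane; hypothetical position

def classify(z_mm: float) -> str:
    """In front of the plane you hover (move the cursor);
    push through it and you are 'touching' the screen."""
    return "touch" if z_mm <= TOUCH_PLANE_Z else "hover"
```

The hard part, as the post describes, isn't this mapping; it's holding a finger steady enough in free air that it crosses the plane only when you mean it to.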

Scrolling on the other hand is pretty cool and it's amazing when it works. You move your hand in a kind of forward to backward circle, paging up through web sites.

It's not foolproof by any means. Sometimes the Leap Motion will go into what it calls "robust mode." I am not sure why the device wouldn't want to be "robust" all the time. It seems that what this really means is "degraded mode." There are threads on the Leap Motion forums about Robust Mode. Lighting seems to be a large factor.

Here's me attempting to use the Leap Motion with Touchless to do anything to this folder. Open it, move it, select it, anything.

Today, I look at the Leap Motion as an amazing $80 box of potential. Just like the Kinect, the initial outcropping of apps are mostly just technology demos. It remains to be seen if the Leap Motion will mature in the coming months. I still think it's an amazing gadget and if you have $80 to blow, go for it. Set your expectations low and you won't be disappointed.

About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.

Nice writeup, as usual! If you haven't checked it out yet, you might want to have a look at the SDK. The folks at Leap provide excellent C# bindings that are compatible with both .NET and Mono. I was really happy to find that it worked with Xamarin.Mac right out of the box. I put together a few interesting demos with Leap Motion and Quartz Composer that I'll probably publish soon. I agree that the technology is not quite where it needs to be yet, but it's really fun to hack on.

Ryan Paul

Wednesday, August 14, 2013 6:09:10 AM UTC

Nope - your post is spot on from a consumer point of view... if you expected this to work perfectly and be an alternative to buying a touch display, you'd be in for some disappointment...

Right now I view this as a hacker toy, part of a larger stack for something better (ie. mashing into Rift), but shouldn't be sold in a retail store for mums and dads yet!

mottey

Wednesday, August 14, 2013 6:31:58 AM UTC

They say if you build it, they will come. People never truly thought of working with computers in 3 dimensions because such a system didn't exist or wasn't practical one way or another. But with Leap Motion, people can finally START to imagine ways in which an extra dimension can be useful in computing and perhaps other applications, because there is now an outlet for their imagination. Something concrete for once.

The Oculus Rift is succeeding for the same reason: it's actually something real. Something you can hold.

Sans Sense

Wednesday, August 14, 2013 6:35:58 AM UTC

Mottey - Well said. Similar to Kinect on PC it has massive potential and it's well priced. It just isn't useful as a consumer device. I am still going to try to write the "flick my wrist and move a window" app. ;)
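For anyone else tempted to try the "flick my wrist and move a window" app, the core detection could be as simple as thresholding the distance and speed of the palm's horizontal travel. A minimal sketch, assuming the SDK hands you palm x positions in millimeters at a known frame rate (the thresholds here are invented, not tuned against a real device):

```python
# Hypothetical swipe detector for a "flick to move a window" gesture.
# Input: a run of palm x positions (mm) sampled at a fixed frame rate.
# A swipe is anything covering enough distance, fast enough, one way.

def detect_swipe(xs, fps=60, min_mm=150, min_mm_per_s=500):
    """Return 'left', 'right', or None for a run of palm x samples."""
    if len(xs) < 2:
        return None
    dx = xs[-1] - xs[0]                 # net horizontal travel
    dt = (len(xs) - 1) / fps            # elapsed time for the run
    speed = abs(dx) / dt
    if abs(dx) >= min_mm and speed >= min_mm_per_s:
        return "right" if dx > 0 else "left"
    return None
```

The thresholds are the whole game: too low and every twitch moves a window, too high and the gesture feels ignored.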

@mottey I was just about to return the Leap unopened due to reviews. Then I saw your post and it got me googling (OK, Binging, for Scott).

Keeping it now and waiting on my Oculus Dev kit. I thank you and I am sure my wife will curse you :)

Chip Burris

Wednesday, August 14, 2013 7:55:01 AM UTC

I had a beta device and now the actual release. For me, a lot of the apps won't even run, having tried them on different PCs both at home and at work. Touchless, sadly, is one of those. Others will run successfully on one machine but not on another. I reported the issue but unfortunately haven't had time to do any serious investigation myself.

Google Earth has worked everywhere and is a cool demo of what's possible, but definitely early days.

The whole Leap Motion and Oculus Rift thing reminds me of Jurassic Park. Remember when that kid, surrounded by raptors, spent the whole afternoon clicking one thing in a 3D UI? None of this stuff is actually anything new. In fact, some of it is worse than stuff made in the 80s. If you go back to the attic and find some old tech magazines, you'll find all this stuff presented as the future in the 80s, 90s, and early 2000s. It's not a technological problem, it's a UX design problem.

Kuukuna

Wednesday, August 14, 2013 8:09:40 AM UTC

Early days. It's painful to use the Leap Motion because it's essentially a 3D device currently being used for 2D work. It might really fly with 3D applications: say you are wearing NVIDIA 3D Vision glasses and rotating/manipulating a rendered object (say, an architectural building) floating in front of you using your hands.

I think hand waving alone has a long way to go yet. What I (like Scott) want to do is kinda drag stuff around. I wonder how a "hybrid mode" would work - e.g. holding Alt while using the Leap would allow dragging of the currently focused item. That would allow for nice casual web browsing.

Paul

Wednesday, August 14, 2013 8:42:25 AM UTC

@Kuukuna You're right in that the concepts are not new - but thankfully now the technical problems are being solved by the community and abstracted for people like me, the average UX designer.

UX/non-tech peeps with these atoms/SDK can now take on the challenge of solving, experimenting and playing with new experiences - which I've been dreaming of since I was a kid.

Sure, maybe the Leap is buggy and will only work 80% of the time, maybe it will fail - but I can play, code and touch it now. Amazing.

mottey

Wednesday, August 14, 2013 8:55:09 AM UTC

When I preordered the Leap Motion a couple of months ago, my hopes for it were really high. When I finally got it I was, like Scott, a bit disappointed. The wow factor wears off quickly, and if you don't write apps for it, it is basically unusable for now. But I urge you to check out this new app, GameWave. It is like the TouchLess app on steroids and highly configurable. As usual you have to get used to it, but once you do it's really cool. I'm sure in upcoming months there will be more interesting apps that will make the Leap Motion very usable on a daily basis.

Lucas Kaczanowski

Wednesday, August 14, 2013 9:16:27 AM UTC

Thanks for saving me $80!

Mark Allett

Wednesday, August 14, 2013 9:58:58 AM UTC

That's lovely but it makes for an extremely hyperbolic and spastic experience.

Hi Scott, I understand the word has different meaning in the US, but to UK readers, the term "spastic" is an offensive term for people with Cerebral Palsy and disabled people in general. I know you wouldn't want to offend anyone so just thought you should know.

David

Wednesday, August 14, 2013 10:04:13 AM UTC

David - Indeed, I meant it in the regular "involuntary jerky motions" sense of the word. It's a harmless word in the US. No offense was intended. I'll edit the post. Thanks for the heads up!

Hi Scott. I've had the Developer Unit for a while so I have had more time interacting with the device. I was also initially excited when I got my developer unit but after a while the excitement wore off. If those who got consumer units think they have it rough then they have no idea. I have to say that the Leap software has matured over time and the tracking has really improved.

My current view of the device, however, is that it is not the most ergonomic path to interacting with your PC/Mac. I find that any app that requires you to hold your hand(s) up for prolonged periods of time is nothing more than an energy drain. The cool factor is merely visual; in actual use it doesn't add value to your interactive experience. If there is one thing the Leap does well, it's that it makes you appreciate your mouse more. One of the most painful things I've seen when it comes to user interaction is when a friend of mine tried using it for a while. He was fatigued and overwhelmed.

I think the word 'potential' is being thrown around in an effort by folks to validate their purchase. The honest truth is that they possess a novelty item. They were drawn in by the marketing, which was just that: marketing. It's a psychological effect which keeps them grasping at the 'potential' of the device.

MK

Wednesday, August 14, 2013 11:24:53 AM UTC

This device is amazing and the SDK has so much potential; just google "Leap Video Database" for some examples. Not better than the Kinect? No bias coming from a Microsoft employee, huh? Or are you playing it down so that you can play up the new Kinect coming out later in the year :p

Ravi

Wednesday, August 14, 2013 11:54:02 AM UTC

I think I could give a better response if the bloody thing would even work for me. I've had it for about 5 days and it won't work on my Mac. The support has been very responsive but it just doesn't work for me. Somewhere along the line it's probably my fault but my boredom threshold is sufficiently low that I may not bother to get through it now.

Another shiny gadget for the dusty drawer of doomed electronics I think...

Gav

Wednesday, August 14, 2013 12:00:20 PM UTC

I guess the trick will be having 3D sensors on these kinds of devices; if it only tracks from a 2D plane to capture an "object" (a hand) that exists and moves in 3D, it cannot work very well. Still, a step forward.

I'm not really sure about the potential. The ergonomics are just awful. 15 minutes of usage and you realize what a great invention the mouse actually is. The first few days I had it at the office I assigned some different moves to interact with Chrome, OS X, etc. I got tired of it because it didn't really solve a problem for me.

Now though I do use it, but only to control the TV (turn off sound, pause/play etc.) So on the upside, no more greasy remotes :)

Well said, it has no functional value in its current state. Looking forward to see what happens over the next few months.

Frode Rosand

Wednesday, August 14, 2013 1:35:11 PM UTC

I started with a developer device, intent on creating a hand-controlled Trello board. Like some others commenting, I came to question the merit of controlling your PC this way. Even with really good hardware and software, is it easier to hold your hand up in the air and wave, or is it easier to rest on a mouse and move slightly? I could be totally wrong, and probably am totally lazy, but I found my arm getting tired after a bit.

That said, you should check out the SDK, if only to learn what awesome code it really is. I focused on their JavaScript API, which uses websockets to connect to the device driver running on the PC. Think about that: you can write JavaScript that runs anywhere and talks to this thing. You could Leap-enable your blog! ;) It's a totally rad use of websockets and fun to do. With just a few LOC I got together a little menu: hold up 1 finger for item 1, 2 for item 2, etc. To your comment about developers handling sensitivity, I did find that I had to 'debounce' a bit: I had to check that the user held 2 fingers up for a few milliseconds or else it could randomly pick a menu item.
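That debounce boils down to requiring the same finger count for N consecutive frames before acting on it. A rough illustration of the idea (in Python rather than the JavaScript API described above, with the frame data and threshold invented):

```python
# Sketch of a finger-count debounce: only accept a count once it has
# been stable for several consecutive frames, so a momentary tracking
# glitch doesn't pick a menu item. Threshold is an assumption, not a
# value from the Leap SDK.

def debounced_selection(finger_counts, stable_frames=10):
    """Return the first count held for `stable_frames` frames in a row."""
    run_value, run_length = None, 0
    for n in finger_counts:
        if n == run_value:
            run_length += 1
        else:
            run_value, run_length = n, 1
        if run_value and run_length >= stable_frames:
            return run_value  # treat as a deliberate menu choice
    return None
```

Feeding it a stream where the count flickers (2, 2, 1, 2, ...) shows why this matters: without the stability window, the single-frame flicker to 1 would fire the wrong item.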

I will also say, I feel they did a really good job managing the development process. They had regular releases and communicated well what was in each, and what was coming. I was left thinking "this is the way to run this type of project, I just hope it works out for them."

Off-topic: "Spastic" is the word to describe jerky, uncontrolled movements. Yes, some object because they suppose those with cerebral palsy or other diseases causing spasticity will be somehow demeaned by the use of that word.

I used to do some volunteer work for the Spastic Society before it became SCOPE, FWIW. Calling someone a spastic is not good IMO, categorising them by their infirmity. But objecting to describing movements as spastic is going too far.

I've been using the Leap to experiment with some WPF applications, and I find myself having the exact same problems that everyone else has... mainly gesture recognition and clicking. The click is horrible; I feel carpal tunnel coming on every time I spend 30 seconds trying to make a click.

The Kinect, on the other hand, fails at the opposite extreme from the Leap: because the current Kinect cannot detect fingers, it becomes somewhat equally useless. There needs to be a happy medium between finger detection, range, and accurate gesture recognition before the whole of this technology becomes useful.

Scott Lance

Wednesday, August 14, 2013 3:28:37 PM UTC

People don't give Kinect enough credit. The skeletal recognition is absolutely amazing - it has worked well since Day 1.

Also, Kinect 2 will recognize fingers. And I don't see why a Kinect Next couldn't be built into a laptop.

That being said, Leap provides an opportunity for a cool photo or a video, but a trackpad or a mouse provides hand-rest.

Boris Yankov

Wednesday, August 14, 2013 3:48:24 PM UTC

technology for technology's sake? The overprecise "twitchy" sensitivity of this device mimics my frustration with the Wii controller + netflix. Startup goes something like this: turn on the Wii, then wave the controller around in large but decreasing semicircles until I find just-the-right-spot to get the cursor onto the screen. Technology gives me thirty seconds of hand-eye coordination tests and requires precise sub-degree angle movements; all I really want is a physical "just play the damn movie" button.

jack

Wednesday, August 14, 2013 3:51:27 PM UTC

I think anyone that doesn't understand the potential here is being shortsighted. This is the future, you're just not comprehending it. The leap is a first step. Next, anything is possible. As far as uses, this will be great for tradeshow booths and will replace touch screens in that setting.

Kane

Wednesday, August 14, 2013 3:54:24 PM UTC

There's a lot to be done with the Leap Motion. I have a Core i5-2550K and the response time and accuracy leave something to be desired. The Windows Touch app sucks horribly... I paid $80 for something I'm not even using at the moment, which is disappointing, might I add. I may have to give it another whirl one of these weeks, but for now it's fine where it is.

Brian M.

Wednesday, August 14, 2013 4:25:35 PM UTC

Sadly, I bought one and came to the same conclusion. I returned it. The CPU utilization on Mac is horrendous. It requires a Core i chip (tested on both a Core 2 Duo and a Core i7, the 2 Duo was unusable) to even work.

I really really have high hopes for the next revision, especially with the influx of funding they received.

The killer app/game for this would be Surgeon Simulator, Oculus Rift, and the Leap Motion all working together.

Every newborn baby is useless, too. We don't value them because of their immediate utility, but because of their promise and potential, and this case is no different.

Alex

Wednesday, August 14, 2013 4:56:03 PM UTC

Hi Scott,

Same feeling. Great device, great potential, right price, good SDK, but... where's the deal? Mouse replacement? New gestures (my shoulder hurts :) )? Or some mix of Leap/Kinect/mouse that simply doesn't work yet, or that no one has imagined yet!

BTW, right now it's quite useless; it's the same experience as playing tennis with a Wiimote (you can swing your wrist without acting like Agassi and breaking the lamps), but without the fun of playing tennis on the Wii with some friends! :) Just to troll and bully some friends with this new toy...

Another good one Scott.

Gianluca

gianluca gravina

Wednesday, August 14, 2013 5:47:44 PM UTC

Camera based solutions are limited, the gestural and muscle tracking of something like MYO Armband look better suited for Minority Report style interaction.

mottey

Wednesday, August 14, 2013 6:11:55 PM UTC

Alex - LOL, but I don't pay $80 for a commercial newborn baby at Best Buy. Typically when something comes to market you expect it to work. The Leap Motion is amazing tech, clearly, but as a commercial product, it IS a baby.

Kane - Read the post again. I'm VERY aware of the potential. I just think it's not living up to it yet. The Kinect is the same way. Where's my Kinect for Windows app? At least the Leap Motion has Touchless...the Kinect doesn't even have that much. The thing is potential quickly becomes wasted potential.

All this Sci Fi gesturing looks very tiring if you're an office worker of the future; long live the mouse and minimal physical activity :)

Russ

Wednesday, August 14, 2013 7:22:00 PM UTC

Hi Scott, Intel is driving their Perceptual Computing initiative that give you your "Kinect for Windows" app to a certain level. The combination 3D+RGB+audio camera mounted on your monitor/laptop can detect gestures, poses, voice, eyes, head, and more. The time-of-flight depth technology is true 3D, not edge-detection like Leap, and works from 15cm to 1m. So, interactive apps that don't require you to hold your hands in the air can (and are) being developed. Check it out.

Tim Droz

Wednesday, August 14, 2013 7:59:27 PM UTC

I don't know about its future as a consumer PC input device, but I can tell you from first-hand experience that it makes a PHENOMENAL dj controller. I've incorporated it seamlessly into my routine and it's a fantastic (and FUN!) way of controlling the music, FX, and filters.

And that's after only playing with it for a few minutes.

So, I think "useless" is a bit much. This may be a niche area of course, and I get that you acknowledge its potential, but I still think your title is more than a bit unfair.

I would also echo Daniel's comment... after a few minutes of usage it is clear that this is in no way going to be an all-day device. Humans will not be able to hold their hands up all day without their arms tiring out quickly. They're conditioned to having their arms rest on a flat surface to type. Still trying it out, though.

Josh - Can you expand and give some more detail on how the DJ controller works? How gross are the motor movements? Does it just need to notice your arm moving from one side to the other, or is it tracking 1/100th of a millimeter of a single finger? Not a troll, but I want to understand if what you're doing could have been done with a webcam.

Bill Gates says tablets will be the future. He has his company create a tablet and pushes it to the market. It sucks and no one buys it. Ten years later, we all have tablets. Kinect and Leap Motion are the first tablets. Give it time.

Yep. That pretty much summed up my experience with it. It's definitely one of those 'if I could just think of something really cool to do with it that isn't just another gimmicky game' gadgets. I keep thinking 3D controller - but then I look at it in test mode and wow - it's going to be hard to handle if it's constantly losing finger tracks.

It's kind of like the Sphero - it's a really cool idea - then you play with it for a few hours (or minutes) and you put it away.

Sad really...

The Werewolf

Thursday, August 15, 2013 12:08:06 AM UTC

Imagine this as a controller for a manual for a mechanic with greasy/dirty hands...

Phil

Thursday, August 15, 2013 12:42:41 AM UTC

Hey man-

I know their investor well and was lucky enough to play around with one for a few hours several months ago so I am biased. That said, you need, need to check this out, I will be personally offended if you don't. :)

https://airspace.leapmotion.com/apps/bettertouchtool/osx

While I agree it will take some time (1-2 years?) to reach its potential, it's the only way to make people think and move forward with interfaces.

At best it's a cheap game controller prototype. At worst it's a gimmicky fraud relative to the hype. There is no killer app. Molding clay without having to use a keyboard?

OK... but how many people out there mold clay with a keyboard and experience this singular pain point? Oh, I've tried BetterTouchTool etc... a painful gorilla-arm experience. My Leap is collecting dust on the shelf.

Subfinity

Thursday, August 15, 2013 4:33:11 AM UTC

Any device that makes me less productive and is awkward to use is useless; it's just a fad, or geeky stuff. And I don't have to buy it to come to this conclusion. There's nothing out there that can beat the use of a keyboard and a mouse, which were invented decades ago. I don't see any input replacing them any time soon. I need the speed of text input and the pixel accuracy of a mouse. Good luck to anyone who thinks waving his hands around is productive.

Abdu

Thursday, August 15, 2013 5:36:37 AM UTC

For the most part I agree with your review, unfortunately. Touchless for Windows was hugely frustrating to use.

The (useless) app I enjoyed most is actually Fruit Ninja. Because it was designed to not need any "clicking", I find it much easier to control accurately than say "Cut the Rope". Compared to most other Airspace apps, I was pleasantly surprised.

I totally agree with your assessment. I too would love for the Touchless app to work better. Right now I feel it is too hard to accurately hit what you are trying to manipulate on screen. I was actually hoping that you would be able to calibrate it to your monitor and then use your monitor as a touch screen, which would make hitting things accurately much easier. Similar to this YouTube video: http://www.youtube.com/watch?v=Dyb_KD9MiLM. I really want this to be a great new way to use your computer, but right now it is just not there.

Davin Studer

Thursday, August 15, 2013 12:26:02 PM UTC

People are used to moving small muscles to achieve tasks. Using a keyboard/mouse/tablet screen uses small finger muscles, while the Tom Cruise demo uses shoulder, back, and arm muscles. Even if it looks super cool, I can't imagine people using their shoulders for a common task that can be achieved with their fingers. Sans Sense underlines that it is a matter of productivity, but it is also a matter of laziness.

So I am pretty pessimistic about Leap Motion replacing the keyboard/mouse/tablet screen. Taking this laziness trend into account, the next thing could be moving the cursor and writing with mouth, eyes, and mind. Concerning the mouth, with projects like Siri it becomes more and more a reality. Concerning the eyes and mind, the technology already exists for handicapped people, but it looks like it needs to mature.

I pre-ordered two because I wanted to encourage this sort of thing and I wanted to build something with it. One of them is still in its box, untouched. It's all because of the lack of precision. As you say, even getting that much precision is probably a great achievement, but it's not enough. I too have hands and digits flickering in and out of detection and that's not ready for prime time considering it's a device whose sole purpose is detecting hands and digits.

What they'll need to do is get the current detection reliable and then work on adding more gestures. A few months back, I figured out what I'd use mine for: snap your fingers to compile. (There's more in the same vein, but in essence, that sort of stuff.) That can't be done because the gesture can't be reliably detected. I can't even get the hand and digits to stay recognized for the duration of the gesture, not once. And yeah, I too get the "robust" mode, because the environment I'm using it in is a fourth as bright as in their own presentation video and apparently, the Leap is now suddenly a vampire.

All this said, I'm reasonably impressed with the way they did "web integration" without plugins: a part of the driver runs a WebSocket server on a given port to which anything may listen, and for which they have a JavaScript library.
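Once the WebSocket delivers a frame, consuming it is just JSON handling. To illustrate (with a simplified, made-up frame rather than the exact wire format, which has many more fields):

```python
import json

# Illustrative only: a simplified stand-in for the JSON frames the Leap
# service streams over its local WebSocket. The field names here are an
# approximation of the schema, not an exact reproduction of it.

sample_frame = json.loads("""
{"hands": [{"id": 4, "palmPosition": [12.5, 180.0, -3.2]}],
 "pointables": [{"id": 7, "handId": 4, "tipPosition": [30.1, 210.4, -8.0]},
                {"id": 9, "handId": 4, "tipPosition": [-5.2, 205.9, -1.3]}]}
""")

def fingers_per_hand(frame):
    """Count the pointables attached to each hand id in a frame."""
    counts = {h["id"]: 0 for h in frame["hands"]}
    for p in frame["pointables"]:
        if p.get("handId") in counts:
            counts[p["handId"]] += 1
    return counts
```

Because any process on the machine (or a page in a browser) can listen on that port, there's no plugin to install; the driver is the server.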

After a week or two of fiddling with this thing for OS control (GameWave looked promising, as does the open-source AirKey project), I've come to a stark realisation:

Forget all the hoohah about "OSes needing native gesture support" yadda yadda yadda - gesture interaction is as complex and personal as speech recognition. It's the lexicon that's missing. We need to teach a gesture lexicon to the system, like early speech recognition did, before the recognition engines mature.

Expecting human devs to nail down complex, intricate gesture control, and then for the world to learn these complexities, is folly. It's the planned economy all over again.

More than that, if I'm sitting in a slightly different position, the geometry of my arm-and-hand movements changes. The system needs to recognise the abstract gesture regardless of orientation (the Leap working from below makes that a little tricky) - coding that by hand is ridiculously complicated. Ask 100,000 users to train the system on a gesture they want to perform a common action, and you'll be heading in the same direction as speech recognition, which is getting pretty damned good.
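One well-known way to get part of that invariance is to normalize the captured path before matching it against templates, roughly what template recognizers in the "$1 recognizer" family do. A sketch of the translation/scale step only (rotation alignment needs an extra step not shown here):

```python
# Normalize a 2D gesture path so the same shape matches regardless of
# where on the desk it was drawn and how big it was: translate to the
# centroid, then scale into a unit box. Illustration of the idea, not
# any shipping recognizer.

def normalize(path):
    """path: list of (x, y) points -> translated/scaled copy."""
    xs = [p[0] for p in path]
    ys = [p[1] for p in path]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    span = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / span, (y - cy) / span) for x, y in path]
```

After normalization, a small swipe near the device and a big swipe far from it reduce to the same point sequence, which is exactly the property hand-coded thresholds struggle to provide.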

AFAIK, Leap has no capability or plans to crowd-source (from owners in the wild) the training of some AI (probably neural nets with heuristics for context/combo moves) for their gesture recognition. Do that, please!

To be honest, when it comes to crowd-sourced training of AI for useful functions, reCAPTCHA (used by Google), Google image identification, and a few other Google projects have really shown what can be done. Come on Microsoft, Leap, Apple - get on board!

I learned 'em good

Monday, August 19, 2013 12:53:37 PM UTC

I think the challenges that you mentioned are due more to the limited out-of-the-box software and the "gimmicky" approach many have taken to the device so far. There's an open source app called AirKey that allows you to use the device easily (it's not perfect, but it brings the device into real life). I made a short video of common actions I use every day:

As others have mentioned, what frequently gets forgotten in these UI discussions is the most important question: can I use the device and UI comfortably for long periods of time? So far we know this works with keyboards, mice, and similar devices and UIs, but it definitely doesn't work for any kind of touch-based device and UI that requires Minority Report-type movement. I mean, when I'm using my Kindle HD in bed, my hand and arm get tired just from swiping while browsing the web.

I am developing an app for the Leap Motion called AirControl which will allow you to control your Mac. It's still a prototype, but it's getting better every day. Here is a demo video on YouTube.

Joseph

Thursday, August 22, 2013 1:13:58 AM UTC

@Josh Thanks for the video, I checked it out. I am wondering, though, what it is that you're doing with the Leap Motion. Are you actually creating music on the go? To me it looks like you're raising/lowering the volume of a track. Maybe some notes within the video explaining what you're doing with the Leap Motion would be helpful.

Chris Lee

Saturday, August 24, 2013 12:49:25 AM UTC

We are a group of technology-enthusiast students based in Toronto, Canada.

We have developed a computer-vision product called the Flowton Controller - combining beautiful industrial design, gesture control, and voice recognition - aiming to bring the first NI (natural interaction) device to modern homes.

We have just launched our Kickstarter campaign with our working prototype and design vision; we'd appreciate anyone checking it out and letting us know what you think. Thanks for the support!

The reason you want it to work like Minority Report, and end up arm-twerking instead, is because Leap got it wrong RE gestures versus free-form sensing of hand movement. The Oblong system that Minority Report is based on understands the importance of gestures. As such, it works VERY well at controlling things in 3D space (this is from direct experience, not from watching a TED Talk). Imprecise "whatever you feel might work" control will only be properly interpreted by a machine when machines become truly intelligent and can probably read your thoughts as well. The lack of full-body engagement also doesn't help much with the device's usability. If I need to sit down and control the screen, I'll use a capacitive touch screen. If I need 3D control, I'll use a space mouse.

Because they did not pay adequate attention to the precision of mapping commands, and got overly concerned with the precision of the hardware itself without understanding a vocabulary they could map that precision to, the device is a toy or at best a performance-art piece, like a theremin... quirky and neat to see, but limited to being used by DEVO or whatever.

Even the simplest movements often fail with the Leap, as opposed to the stability afforded by g-speak. To some extent that is to be expected, because g-speak costs many $$$ versus $80, but in the end even $80 is a waste when the thing doesn't work.

Me Grog

Saturday, September 07, 2013 4:26:06 PM UTC

Hey guys, I just downloaded Flutter and I think it's super cool, though it needs a lot more updates. Anyway, do you guys know of any other gesture recognition software available for free download? It'd be great if you could give some suggestions.

gypsyDANGER

Monday, September 09, 2013 3:25:35 PM UTC

I've got a lightly used one - for sale CHEAP CHEAP CHEAP.

Heck, cancel that. I decided to throw it away. I'd feel guilty taking someone's hard-earned money for this useless device (at any price). What a disappointment after all that publicity.

Hollis Tibbetts

Tuesday, September 10, 2013 10:36:23 AM UTC

I guess in a few years you will be counted among the folks who said that computers would never fit into people's lives… The Leap is a great thing, and even if it's really hard to handle (which it is, I'm with you on this point)… it's v1.0, just give it some time. And to be honest: $80? What was (or still is) the price of a Kinect? I guess the price will make it spread, and this is the potential of the Leap, along with being open-source based from day one. Every programmer on this planet can - more or less - afford it. And this serious point goes to the Leap instead of the Kinect.

B

Wednesday, September 18, 2013 8:15:58 PM UTC

Lol, yep! I just did a comparison between the Leap Motion intro video and the real experience: http://bit.ly/18bcyaf. At the moment it's totally rubbish from a consumer's point of view, but it's a proof of concept, which is very exciting for the future, especially with the Oculus Rift, as mottey pointed out.

Hey Scott,

As you said, the product does have its rough edges. Maybe the problem is that they marketed a very high-precision artifact, which makes you think it's going to be an amazing experience. But there are a couple of problems with that: with high precision comes a lot of data, and therefore you get the problems you described, where the app now has to distinguish between your finger twitch and an actual action.

The solution is not here yet, but what we need are learning algorithms that allow certain things to be taught to the application, in a way that can distinguish easily (fast) what the user is doing and throw away the unimportant things. After we have that, gestures will be created by devs, and then certain gestures can be incorporated into the device and made common between apps.

We don't actually know what's common or not, or what all cultures think of that, so how do you create a device and put all those things inside it if you don't even know what they are? Unless you impose it. E.g., how do you move a window from one screen to the other? You flick it? You point and click? You grab and release? Even simple things like turning a page can be bad: let's say you are on a Windows 8 machine, reading a book, and let's assume the technology is fully supported in Windows (a driver, like a mouse). The first problem is how you distinguish a right-to-left swipe that opens the charms from an "in app" swipe that turns the page. Or what if my language reads the other way around (left to right), or it's a book vs. a notebook (bottom up)? Now you get the app bar instead.

Ideally people build things, early adopters buy them and set standards, and then much later it becomes something real and usable by all. But it has to go to market first.

Mateus

Tuesday, November 12, 2013 1:15:40 PM UTC

I have been playing with one for a couple of weeks and... I love the thought of the Leap, but the reality is it's pretty useless right now. They released it too early, before it was ready - most likely due to pressure from investors, who have now shot themselves in both feet.

What it actually needs is another detector working from above, so that you immerse your hands in a 3D "mesh", thereby avoiding the fingers disappearing when you turn your hand.

However, that is quite a level of complexity up from where it is now, and until it works it'll be a flop. Quite sad about that.

Bob McGrath

Comments are closed.

Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.