Engadget RSS Feed -- https://www.engadget.com/tag/leapmotion/rss.xml
Engadget is a web magazine with obsessive daily coverage of everything new in gadgets and consumer electronics. Copyright 2018 AOL Inc. The contents of this feed are available for non-commercial use only.

https://www.engadget.com/2016/12/05/leap-motion-will-bring-your-hands-into-mobile-vr/
Leap Motion has been working on making your interactions in VR as realistic as possible, but its tech has only been available on desktop and console systems. Now, the company has expanded its scope to mobile devices with its new Mobile Platform, designed for "untethered, battery-powered virtual and augmented reality devices." It has built a reference design that mounts its new sensor and platform on top of a Gear VR, which it says it is shipping to headset makers around the world. Leap Motion is also bringing demos of its Interaction Engine (for natural hand gestures) running on this portable setup to major VR events this month.

Virtual reality is more immersive when you can pick up objects with your bare hands rather than with a controller or a pair of wand-style remotes. Leap Motion is one of the frontrunners in this area, having pivoted its candy-bar-sized motion-tracking sensor from desktop accessory to VR headset companion. To drum up interest in the product -- which you still have to attach manually to an Oculus Rift or HTC Vive -- it has developed a new piece of software called the "Interaction Engine." Available as an add-on for Unity, it promises a more realistic experience when interacting with make-believe objects.

Razer unveiled the latest revision of its Open Source Virtual Reality (OSVR) developer's bundle (aka Hacker Development Kit v1.4) during GDC on Tuesday. The new HDK offers a number of improvements over its predecessor.

For several years now, Leap Motion has been working on bringing hand gestures to virtual reality. And it makes sense: using your hands to move digital objects is far more natural than fiddling with a controller. But to do this, you needed to strap one of the company's motion-sensing peripherals to the front of an existing VR headset, which is a little clunky to say the least. Plus, the sensor was still running the same software built for desktop PCs -- a holdover from the days when Leap Motion's main focus was the aforementioned PC accessory. Now, however, the company is ready to take the next leap forward. Today it's announcing Orion, a brand-new hardware and software solution built just for VR.

As immersive as virtual reality can be, you're still left holding awkwardly shaped controllers in your hands -- a reminder that you're very much in the real world. But what if, instead of hitting A to move a rock, you could just use your hands? That's exactly what Leap Motion, known for its hand-gesture control sensors, has been working on for the past few years. Most recently, it has been developing something called the Interaction Engine, which aims to take things a step further: making picking up objects in the digital world feel as natural as it does in the real one.

The holy grail of virtual reality is presence: that feeling that you're truly there in the virtual world. That's why it was great when Leap Motion announced a VR mount a few months ago -- manipulating objects with your hands is so much more immersive than using a game controller. But to use it, you'd need to already own a VR headset and buy the Leap Motion sensor separately, which isn't exactly ideal. Now, however, you don't need to: Leap Motion has just announced that it's collaborating with Razer's OSVR to build a VR headset with Leap Motion sensors built right in.

Mercedes-Benz is far from the first automaker to experiment with self-driving cars, but it's making up for that in style at CES. Meet the F015 Luxury in Motion, a previously hinted-at concept car designed from the ground up for robotic transportation. Passengers normally sit face to face so they can talk more, and anyone can control the car through remote units and gestures (courtesy of Leap Motion) -- this is really a classic, luxurious carriage remade for the modern era, according to Mercedes. The vehicle even has color-coded LED lights on the front and back to let you know what the car is thinking. It can tell you whether or not it's in autonomous mode, or give pedestrians a heads-up when it's safe to cross. You probably won't ever see the F015 go on sale, but it's a good clue as to what Mercedes hopes you'll drive (or rather, not drive) in the years ahead.

Picture the scene: you're watching a documentary with an Oculus Rift headset when suddenly you need a drink or receive a text. You have to stop the film, rip off the goggles and give your eyes a second to adjust, only to reverse the entire process when you're done a few moments later. Leap Motion, the hand-tracking accessory that can be used alongside Oculus VR hardware, has a solution. A new 'Quick Switch' demo lets you alternate between VR and a video passthrough simply by waving your hand in front of your face. It's a quick gesture, and the required proximity to the headset (between one and three inches) means you're unlikely to trigger it by mistake. The company says Quick Switch will soon be available as part of its 'Unity Core Assets: VR Edition' toolkit and could be added to any Unity app. Video passthrough is already available with Oculus VR and Samsung's new Gear VR headset, and we suspect it'll come as standard in the final Oculus Rift consumer model. In the meantime, Leap Motion's offering could prove useful, especially if you're exploring the Large Hadron Collider and accidentally knock over a mug on your coffee table.

We'll bet you didn't wake up this morning wanting to experience what a particle goes through as it zooms along the massive tunnels of the Large Hadron Collider. Oh, wait -- you did? Well, game development studio Funktronic Labs is waaay ahead of everyone. The firm has already created a trippy first-person virtual reality journey through the LHC, an homage to the Higgs boson-like particle the collider found back in 2012. You can interact with the kaleidoscopic world and control your virtual trek by waving your hands (and using pinch-and-pull motions) over a Leap Motion detector, which triggers visual changes on screen. But if you're lucky enough to have an Oculus Rift at this point, you can use it for a more immersive (and very, very trippy -- because this needs to be said more than once) experience. Don't have either? Don't worry: head after the break and watch a preview video to at least get a glimpse of the psychedelic experience.

The Leap Motion controller is currently available in three forms: a $74.99 standalone dongle, built into the special edition HP Envy 17 laptop and built into an HP keyboard. The dongle -- with almost half a million units sold since launch -- and the keyboard are the only ways to add this hand-motion sensor externally, but the keyboard was initially limited to select HP computers. Well, not anymore. At Computex, Leap Motion told Engadget that as of this month, you'll be able to purchase said keyboard for about $99, and it'll work on any Windows 7 or Windows 8 PC as long as you have the software installed -- be it the current version or the free V2 update with skeletal tracking coming this summer.

The Leap Motion controller is a curious little motion sensor, but it isn't always easy to use. The hand-sensing tech has a tendency to lose sight of where your fingers are, and almost every application that uses it has its own learning curve. Soon, that might change: today Leap is launching the public beta of its next-generation (V2) tracking software. This free update makes some big promises, including improved resistance to sunlight and infrared interference, better tracking algorithms and, best of all, the ability to track individual joints. We dropped by the company's San Francisco office to try it out and found the update to be a significant improvement.

If you're fluent in American Sign Language, congratulations: you know one more language than most of the people reading this post. The rest of us? A solution to our communication failures is on the way. A company called MotionSavvy is building a Leap Motion-equipped tablet case that can actively interpret ASL and 'speak' the translation out loud. It's an ambitious project, but it works: at a recent Leap AXLR8R event, we saw company founder Ryan Hait-Campbell sign over a MotionSavvy-equipped slate. "Hello, my name is Ryan," he said. "What's your name?" It was an impressive demo, but Hait-Campbell admitted it was limited -- the setup can only recognize about 100 words at present, and since signs can vary slightly from person to person, those words don't consistently register for every user. Still, the company's prototype shows enormous potential. If the firm can outfit it with a larger word database and the ability to decipher personalized signing, MotionSavvy could become an incredible communication tool for the hearing-impaired.

Physical therapy isn't fun. It's a physical and emotional challenge that often consists of dull, repetitive tasks, offering patients almost no short-term rewards for their very real efforts -- but maybe it doesn't have to be that way. A new software platform called Visual Touch Therapy is trying to make physical rehabilitation fun, gamifying repetitive exercises by marrying a Leap Motion controller, a PC and a simple meme-inspired video game. The game itself is straightforward: players perform simple motions over the Leap controller that make a dog character run (or fly a jetpack) across the screen, and their performance and improvement can be tracked, quantified and even sent to their physical therapist for review.

Neutralizing explosives, it turns out, is a delicate and complicated procedure -- but a company called Mirror Training hopes to make it simpler. "Our company has built an interface that literally uses your own hand and arm to move a robotic arm," announced CEO Liz Alessi. "I like to call it 'wear your robot.'" The interface uses a Leap Motion controller to detect arm and hand movements, allowing a bomb squad robot to directly mirror its operator's actions. In tests, Alessi says, it has allowed operators to disarm mock bombs twice as fast as traditional control methods.

There's never been a better time to just kick back and watch whatever you want, what with the many ways content can be consumed nowadays. And if you enjoy watching videos on Vimeo, things are about to get even easier. The company recently announced that its Couch Mode feature now works with the Leap Motion controller, letting you control playback with just your hands. Circle your finger to fast-forward or rewind, tap in the air to play or pause, and swipe to jump to the next video.

Chance Ivey, game design lead for Chaotic Moon's whimsical Oculus Rift demo SharkPunch, was only half-joking when he made that comment to me as I exploded a megalodon with my fist in virtual space. That's because the minigame, which incorporates a visor-mounted Leap Motion controller to let users punch sharks in 3D, actually has firm roots in an educational simulator the Austin, Texas-based company has been developing for prospective clients. Yes, that connection may be hard to swallow at first -- after all, how does a frenzied (and fun) game of shark carnage help players learn? The simple answer is that it doesn't, but that by no means diminishes SharkPunch's educational origins.

If there was one ubiquitous item at NYU's ITP Winter Show, it was the Kinect. Countless projects were built around the Microsoft-made sensor. Max Ma's Touchless, which he built with a ton of help from Tony Lim, originally featured one, but the version that made it to the floor went with an OEM equivalent instead. The effect is the same, though: a set of cameras and sensors tracks various parts of your face, turning your muscle twitches and eyebrow raises into raw data. While Max says this data can be used for a host of different applications, such as unlocking your door with a series of blinks and winks, he focused on bringing joy to people's lives through music creation. The sensor tracks between 16 and 64 points on your face (under ideal conditions) and uses your movements to trigger and manipulate samples. Truth is, it's hard not to smile while making ridiculous faces. I was a little disappointed, though, to find that the tracker didn't play well with my winter beard.

You interact mainly by tilting your head, opening your mouth and raising your eyebrows, but Max added some depth by turning a Leap Motion sensor into a controller for a software synthesizer. So samples and beats are all handled above the neck, while you wave your hands through the air to play a lovely lead melody. Really, the whole thing is pretty self-explanatory and quite fun, as you can see in the video after the break.

HP's adding more Leap Motion to its PC arsenal. After refreshing its Envy 17 with built-in Leap Motion control, HP's now expanding that partnership to select products in its desktop and all-in-one lines. The 11 devices, which span HP's Envy Recline, Phoenix and TouchSmart series, in addition to its Pavilion TouchSmart line, will come packed with the Leap Motion keyboard. But that keyboard would be useless without the necessary Leap Motion software, which HP has made sure to pre-install on the devices, along with a few free apps to demo the gesture control and Airspace, Leap Motion's own app store. If a motion-controlled PC sounds like something you'd want to gift wrap and nestle under the tree, be sure to check out HP's online shop for the full list of supported devices. Because nothing says, "Merry Christmas!" (or "Happy Holidays," or whatever) like expensive, gimmicky technology.

Many will tell you that video games are bad for your eyes, but James Blaha doesn't buy that theory. He's developing a crowdfunded virtual reality title, Diplopia, that could help restore 3D vision. The Breakout variant trains people with crossed-eye problems to coordinate their eyes by manipulating contrast; players score well when their brain merges the two images into a complete scene. Regular gameplay could noticeably improve eyesight for adults who previously had little hope of recovering their depth perception, Blaha says. The potential solution is relatively cheap, too -- gamers use an Oculus Rift as their display, and they can add a Leap Motion controller for a hands-free experience. If you're eager to help out, you can pledge $20 to get Diplopia, and $400 will bundle the app with an Oculus Rift headset. Check out a video demo of the therapeutic game after the break.

When we reviewed the Leap Motion controller earlier this year, we found the application selection to be a bit lacking. Since then, the number of apps has doubled from 75 to around 150, and the Airspace store's newest addition is the coolest Leap app we've yet seen. It's called Free Form, and it's a 3D sculpting app (not unlike 3D Systems' Sculpt) built in-house at Leap Motion that lets you manipulate and shape digital objects using your fingertips. David Holz, company co-founder and the man who figured out the math behind Leap Motion's technology, gave us a demo of the app and talked a bit about why Leap built it. He also showed us new developer beta software that does 360-degree tracking, built to address some of the original Leap's shortcomings.

The stage at Expand acted as a platform for conversations between many people involved in the future of technology -- but there was a lot more for attendees to enjoy. In our nifty workshop sessions, 3D Systems showed off its new Sense scanner; littleBits gave a demo of the Synth Kit; Leap Motion talked about its SDK and its use in 3D web apps; and Raspberry Pi revealed the results of the Make-Off contest. These are only a few of the workshops from our event in New York City, so head past the break to check out the full list -- we've got a video for each one.

"Gone are the days, probably, of the keyboard, mouse and maybe even touch input," Samsung's Shoneel Kolhatkar told us. During a panel on the future of gesture and motion controls at Expand NY, Kolhatkar suggested that these technologies could fade away within the next 20 years. His fellow panelists, Pelican Imaging's Paul Gallagher and Leap Motion's Avinash Dabir agree that there's more to the future of computing than the traditional point and click.

You might say the day is never really done in consumer technology news. Your workday, however, hopefully draws to a close at some point. This is the Daily Roundup on Engadget, a quick peek back at the top headlines for the past 24 hours -- all handpicked by the editors here at the site. Click on through the break, and enjoy.