Introduction

The Leap Motion Controller is an input device for a computer (a desktop/laptop or a single board computer). It can be placed on the desk, or, if desired, strapped onto a virtual reality (VR) headset. In brief, when it sees your hand or hands, it analyses them to work out where your fingers, palms and so on are in three-dimensional space. It can also recognise certain movements, such as swipes. You could use it for controlling hardware, or even for controlling Python games.

It is only just bigger than a USB memory stick, so it is quite compact.

Here is a snippet of the possibilities, from the Leap website:

I recently noticed Leap Motion Controller devices going cheap on eBay, so I was curious to try one out. Knowing nothing about them, I was hoping to somehow get it functioning with ARM boards such as a Pi or BeagleBone Black, but sadly it appears that internally they contain little processing; all the sensor data is streamed via USB to a PC, which does the heavy processing (although apparently Android support is coming soon).

Internally, according to online teardowns, there are a couple of CMOS cameras with fish-eye lenses, a few infra-red (IR) LEDs, and a Cypress chip to stream all the captured data to the PC. The top surface is a plastic IR filter. Thanks to the fish-eye lenses, the Leap Motion Controller sees a space roughly the shape of a hemisphere of 600 mm radius, which is pretty huge. There is more information about how it works on the Leap website.

Image source: Leap website

Anyway, to get it running on a PC, the desktop core software and app-store download is pretty huge (400MB+), and frankly I wasn’t really interested in running other people’s apps. They are focused on gaming, desktop actions and so on, but I get the feeling a lot of the ready-made apps are just for the novelty factor today.

Writing your own code and controlling your own games or devices sounded like more fun, and for that, there is no need to download the huge core software app bundle. Instead, a software developer kit (SDK) can be downloaded.

The developer kit is supported on Windows, Linux and Mac, and, as mentioned, Android support should be coming soon.

Anyway, I fired up Ubuntu and tried it out! I was primarily interested in creating apps with JavaScript and Node.js, so that’s what this blog post will discuss.

The notes below are not necessarily complete, but they contain all the important stuff to get started. If you spot any errors or have any suggestions, please let me know.

What is Needed?

Apart from the Leap device (which comes with USB cables), all you need is an x86 or x86-64 PC or single board computer; any reasonable PC will work. The Leap website suggests an Intel Core i3 or upward processor, or AMD Phenom II upward, and 2GB of RAM. I tested it on a laptop which met this requirement, but I also tested on a Gizmo2 (which sadly isn’t sold any more) – and that has just a low-power AMD chip and 1GB of DRAM! It seemed to work fine in my limited tests.

The Gizmo2 and similar compact boards allow the Leap Motion device to be used for embedded or kiosk applications. Sadly the Intel IoT boards like the Joule are end-of-sale too, otherwise it would have been really neat to try one; I think it would have run really well on a board like that!

Getting Started: Install Linux

I installed Ubuntu 16.04, and then as root user in a terminal window proceeded to type the following:
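The exact commands aren’t reproduced here, but as a rough sketch they were along these lines (the .deb file name is illustrative and depends on the SDK version you download; the development/leap folder matches the path used later in this post):

```shell
# Install the Leap SDK package downloaded from the Leap website
# (the exact .deb file name depends on the version):
dpkg -i Leap-*-x64.deb

# Start the Leap daemon, which talks to the device over USB:
leapd &

# Fetch the leapjs library, which includes browser and Node.js examples:
mkdir -p ~/development/leap
cd ~/development/leap
git clone https://github.com/leapmotion/leapjs.git

# Serve the browser examples locally, then open
# http://localhost:8080/examples/ in a web browser:
cd leapjs
python -m SimpleHTTPServer 8080
```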

Hover your hand over the Leap Motion device to see it appear in the browser!

In the terminal, press Ctrl-C to quit at any time.

Here is a 30-second video of me messing about with it:

Creating your own Example

For my own example, I didn’t care about graphical output. I wanted to get information that I could use to directly manipulate my computer, or to manipulate attached hardware. For example, it should be possible to swipe to turn on a light, or to control a MIDI synthesizer!

So, I took some example code and modified it to use the ‘gesture’ API, based on an example on the Leap website. This would allow me to know if the hand was doing things like swiping, making circular motions, pressing keys in mid-air and so forth.

Go to the development/leap folder, and then type:

cd leapjs/examples

In this examples folder, there is a program called node.js that doesn’t do much except indicate that frames are being received.
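For reference, that example boils down to something like the following sketch (not the exact file contents, just an illustration of the idea):

```javascript
// Sketch of a minimal frame-counting script in the style of the
// bundled node.js example: log progress as tracking frames arrive.
var frameCount = 0;

// Called once per frame of tracking data.
function onFrame() {
  frameCount++;
  if (frameCount % 100 === 0) {
    console.log('received ' + frameCount + ' frames');
  }
}

// Hooking it up to the device needs the leapjs package
// (npm install leapjs) and a running leapd daemon:
//   var Leap = require('leapjs');
//   Leap.loop(function (frame) { onFrame(); });
```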

I made a copy of that program (I named the copy gesture.js) and edited it in the same folder. Here is the entire code:
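The original listing isn’t reproduced here, but a minimal sketch of such a gesture.js looks roughly like this. The describeGesture helper is my own illustrative addition; the gesture types are those reported by the leapjs gesture API when gestures are enabled:

```javascript
// gesture.js -- log Leap Motion gestures to the terminal.
// Assumes the leapjs package is installed (npm install leapjs)
// and the leapd daemon is running.

// Convert a leapjs gesture object into a one-line description.
// This is a pure helper, so it can be exercised without a device.
function describeGesture(gesture) {
  switch (gesture.type) {
    case 'circle':
      return 'circle (' + gesture.progress.toFixed(1) + ' turns)';
    case 'swipe':
      return 'swipe at ' + Math.round(gesture.speed) + ' mm/s';
    case 'keyTap':
      return 'key tap';
    case 'screenTap':
      return 'screen tap';
    default:
      return 'unknown gesture: ' + gesture.type;
  }
}

// Only connect to the device if leapjs is actually available,
// so the helper above can be reused elsewhere.
var Leap = null;
try {
  Leap = require('leapjs');
} catch (e) {
  // leapjs not installed; skip the live connection.
}

if (Leap && require.main === module) {
  var controller = new Leap.Controller({ enableGestures: true });
  controller.on('gesture', function (gesture) {
    console.log(describeGesture(gesture));
  });
  controller.connect();
  console.log('Waiting for gestures... press Ctrl-C to quit.');
}
```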

It streams text output to the terminal whenever the hand swipes and so on. It is really basic, but it demonstrates how to capture gestures in order to do something with them; currently it just logs to the console.

Summary

With little effort, and low cost, it is possible to capture three-dimensional hand and finger motions into the computer, and do things with them! Unfortunately today this needs a PC or SBC with x86/x64 capability, but hopefully in the future it will be possible with ARM SBCs too.

As next steps, I might try to get this to control external hardware. It would be great to hear about other people's experiences with this device.