Introduction
The Leap Motion Controller is an input device for a computer (desktop/laptop or a single board computer). It can sit on the desk, or be strapped onto a virtual reality (VR) headset if you have one. In brief, when it sees your hand or hands, it analyses them to figure out where your fingers, palms and so on are in three-dimensional space. It can also recognise certain movements such as swipes. You could use it for controlling applications or hardware, or even controlling Python games.
It is only just bigger than a USB memory stick, so it is quite compact.
Here is a snippet of the possibilities, from the Leap website:
I recently noticed Leap Motion Controller devices going cheap on eBay, so I was curious to try one out. Knowing nothing about them, I was kind-of hoping to somehow get it functioning with ARM boards such as a Pi or BeagleBone Black, but sadly it appears that internally they contain little processing; all the sensor data is streamed via USB to a PC, which does the heavy processing, although apparently Android support is coming soon. Internally, according to online teardowns, there are a couple of CMOS cameras with fish-eye lenses, a few infra-red (IR) LEDs, and a Cypress chip to stream all the captured data to the PC. The top surface is a plastic IR filter. Thanks to the fish-eye lenses, the Leap Motion Controller sees a space roughly the shape of a hemisphere of 600mm radius, which is pretty huge. There is more information about how it works on the Leap website.
Image source: Leap website
Anyway, to get it running on a PC, the desktop core software and app-store download is pretty huge (400MB+), and frankly I wasn't really interested in running other people's apps. They are focused on gaming, desktop actions and so on, but I get the feeling a lot of the ready-made apps are just for the novelty factor today.
Writing your own code and controlling your own games or devices sounded like more fun, and for that, there is no need to download the huge core software app bundle. Instead, a software developer kit (SDK) can be downloaded.
The developer kit is supported on Windows/Linux/Mac, and as mentioned Android should be coming soon.
Anyway, I fired up Ubuntu and tried it out! I was primarily interested in creating apps with JavaScript and Node.js, so that’s what this blog post will discuss.
The notes below are not necessarily complete, but they contain all the important stuff to get started. If you spot any errors or have any suggestions, please let me know.
What is Needed?
Apart from the Leap device (which comes with USB cables), just an x86 or x86-64 PC or single board computer. This works on any reasonable PC. The Leap website suggests an Intel Core i3 or better, or an AMD Phenom II or better, and 2GB of RAM. I tested it on a laptop which met this requirement, but I also tested on a Gizmo2 (which sadly isn't sold any more), and that has just a low-power AMD chip and 1GB of DRAM! It seems to work fine in my limited tests.
The Gizmo2 and similar compact boards allow the Leap Motion device to be used for embedded or kiosk applications. Sadly the Intel IoT boards like the Joule are end-of-sale too, otherwise it would have been really neat to try one; I think it would have run really well.
Getting Started: Install Linux
I installed Ubuntu 16.04, and then as root user in a terminal window proceeded to type the following:
apt-get update
apt-get upgrade
Install the Leap Software Developer Kit (SDK)
The Linux SDK can be downloaded from: https://warehouse.leapmotion.com/apps/4186/download
Create a folder to store all your leap stuff. From my home folder, I typed:
mkdir -p development/leap
cd development/leap
Download the SDK as mentioned, and then extract as follows:
gunzip Leap_Motion_SDK_Linux_2.3.1.tgz
tar xvf Leap_Motion_SDK_Linux_2.3.1.tar
cd LeapDeveloperKit_2.3.1+31549_linux
Check if you’re running 32-bit or 64-bit Linux:
uname -m
If the output says x86_64 then that means you’re running 64-bit, also known as x64.
Install the appropriate extracted SDK (the 32-bit package filename ends in x86.deb; the 64-bit one ends in x64.deb):
su
dpkg --install Leap-2.3.1+31549-x64.deb
The previous command will complete, but with an error about leapd.service not starting.
Follow the instructions here: https://forums.leapmotion.com/t/tip-ubuntu-systemd-and-leapd/2118
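For reference, the gist of that fix is to give leapd a proper systemd unit, since the Debian package only ships an old-style init script. A minimal sketch (assuming leapd was installed to /usr/sbin/leapd; check with which leapd), saved as /lib/systemd/system/leapd.service, looks something like this:

[Unit]
Description=Leap Motion daemon
After=syslog.target

[Service]
Type=simple
# adjust this path if leapd lives somewhere else on your system
ExecStart=/usr/sbin/leapd

[Install]
WantedBy=multi-user.target

After creating the file, run systemctl daemon-reload (as root) so systemd picks it up.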
Start up the service using:
systemctl start leapd.service
systemctl enable leapd.service
If that ‘start’ command works, but the ‘enable’ command gives an error, don’t worry, that will be resolved in a while.
Install Node.js
As root user:
apt-get install nodejs
ln -s /usr/bin/nodejs /usr/bin/node
apt-get install npm
Install the Leap JavaScript API and examples
As root user:
npm uninstall bufferutil
npm install -g bufferutil
npm install -g leapjs
Still as root user:
apt-get install git
Note that if the above command complains about a leapd file existing, then type:
rm /etc/init.d/leapd
Then re-issue the apt-get install git command, and select Y to reinstall it if prompted.
Go to the development/leap folder, and as non-root user, download the examples:
git clone https://github.com/leapmotion/leapjs.git
cd leapjs/examples
Then, as root user:
npm install -g gl-matrix
npm install -g ws
npm install -g http-server
npm install gl-matrix
npm install ws
You’re almost done!
However, install some interesting examples by going to the development/leap folder and typing (as non-root user):
git clone https://github.com/leapmotion/javascript.git
This creates a folder called javascript, which contains some nice examples.
Running an Example
Plug in the Leap Motion device, and confirm the service is running by typing:
systemctl is-active leapd.service
If it is not active (e.g. crashed), start it up using (as root user):
systemctl start leapd.service
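Incidentally, the JavaScript API talks to leapd over a local WebSocket on port 6437, so as an extra sanity check you can confirm the daemon is listening there:

ss -tln | grep 6437

(If ss is not available, netstat -tln works too.)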
As non-root user, type:
cd javascript/v2/connected-fingers
http-server
Now open up a browser and navigate to http://127.0.0.1:8080/index.html
You should see a black screen.
Hover your hand over the Leap Motion device to see it appear in the browser!
In the terminal, press Ctrl-C to quit at any time.
Here is a 30-second video of me messing about with it:
Creating your own Example
For my own example, I didn't care about graphical output for now. I wanted to get information that I could use to directly manipulate my computer, or to manipulate attached hardware. For example, it should be possible to swipe to turn on a light! Or control a MIDI synthesizer.
So, I took some example code and modified it to use the 'gesture' API based on an example on the Leap website. This would allow me to know if the hand was doing things like swiping, circular motions, pressing keys in mid-air and so forth.
Go to the development/leap folder, and then type:
cd leapjs/examples
In this examples folder, there is a program called node.js that doesn’t do much except indicate frames received.
I made a copy of that program (I named the copy gesture.js) and edited it in the same folder. Here is the entire code:
#!/usr/bin/node
require('../template/entry');

var controller = new Leap.Controller();

controller.on("frame", function(frame) {
  //console.log("Frame: " + frame.id + " @ " + frame.timestamp);
});

var frameCount = 0;

controller.on("frame", function(frame) {
  frameCount++;
  if (frame.valid && frame.gestures.length > 0) {
    frame.gestures.forEach(function(gesture) {
      switch (gesture.type) {
        case "circle":
          console.log("Circle gesture");
          break;
        case "keyTap":
          console.log("Key Tap gesture");
          break;
        case "screenTap":
          console.log("Screen Tap gesture");
          break;
        case "swipe":
          console.log("Swipe gesture");
          break;
      } // end switch
    }); // end frame.gestures.forEach
  } // end if (frame.valid
});

setInterval(function() {
  var time = frameCount / 2;
  console.log("received " + frameCount + " frames @ " + time + "fps");
  frameCount = 0;
}, 2000);

controller.on('ready', function() {
  console.log("ready");
});
controller.on('connect', function() {
  console.log("connect");
});
controller.on('disconnect', function() {
  console.log("disconnect");
});
controller.on('focus', function() {
  console.log("focus");
});
controller.on('blur', function() {
  console.log("blur");
});
controller.on('deviceConnected', function() {
  console.log("deviceConnected");
});
controller.on('deviceDisconnected', function() {
  console.log("deviceDisconnected");
});

controller.connect();
console.log("\nWaiting for device to connect...");
You can use the diff command to see what I changed; it wasn’t much!
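One caveat: as far as I understand the leapjs API, gesture data is not requested by default, so if you see the frame counts ticking up but never any gesture messages, try enabling gestures when constructing the controller. Treat this as an assumption to verify against the SDK documentation for your version:

// Ask leapd to include gesture data in each frame; without this,
// frame.gestures may always be empty.
var controller = new Leap.Controller({enableGestures: true});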
To run this example, as non-root user, first set the file permission:
chmod 755 gesture.js
Then, just type:
./gesture.js
It will stream text output on the terminal whenever the hand is swiped and so on. It is really basic, but demonstrates how to capture gestures in order to do something with them. Currently it just logs to the console; the sketch below shows one way you might wire a gesture to an action.
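As a rough illustration of where this could go, the sketch below swaps the console.log in the swipe case for a function that calls a hypothetical HTTP endpoint (imagine a relay board or home-automation bridge on the LAN). The URL and the one-second debounce are assumptions purely for illustration, not part of the Leap SDK:

// Sketch: toggle a (hypothetical) light whenever a swipe is seen.
// Replace the URL with whatever your own hardware or bridge exposes.
var http = require('http');

var lastSwipe = 0;
function onSwipe() {
  var now = Date.now();
  // a single physical swipe spans many frames, so debounce for a second
  if (now - lastSwipe < 1000) return;
  lastSwipe = now;
  http.get('http://192.168.0.50/toggle-light', function(res) {
    console.log("light toggled, HTTP status " + res.statusCode);
  }).on('error', function(err) {
    console.log("light request failed: " + err.message);
  });
}

Inside gesture.js, the "swipe" case would then call onSwipe() instead of (or as well as) logging to the console.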
Summary
With little effort and low cost, it is possible to capture three-dimensional hand and finger motions on the computer and do things with them! Unfortunately this currently needs a PC or SBC with x86/x64 capability, but hopefully in the future it will be possible with ARM SBCs too.
As next steps, I might try to get this to control external hardware. It would be great to hear about other people's experiences with this device.