Leap Motion controller (via Leap Motion)
The Leap Motion interface is just a couple of weeks away from sitting on your desk. Fortunately, unlike other secretive Natural User Interface (NUI) manufacturers, the makers of the Leap released all the necessary information to developers months ago and have distributed around 12,000 units to them. As a result, come launch day, there should already be a wealth of applications ready to use with the Leap.
Developer groups have hacked the Leap to use it as a gesture interface for controlling quadrocopters and to create augmented reality experiences by connecting it to smartphones. Google has also jumped into the Leap hacking game: in celebration of Earth Day, it announced that Google Earth now supports the Leap, so users can explore the world with natural hand gestures.
Bryan Brown, a Stanford developer, electrical engineer and leader of the Human-Machine Technologies Organization, is taking Leap hacks one step further by developing a "middleware" application that allows all sorts of NUI tech, such as the Kinect, the Creative Gesture Camera and the Leap Motion, to interface with off-the-shelf control hardware. Brown believes that since most natural human communication happens without physical touch, voice and gesture interfaces will find a natural place in electronics.
This middleware app is called NUILogix, and it does exactly what the name suggests: it provides the logic needed to interface NUI tech with machine control hardware over the Modbus/TCP and OPC-UA protocols. The control hardware, in turn, drives the machinery.
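To make the idea concrete, here is a minimal sketch of the kind of translation such middleware performs: a normalized gesture reading from any NUI device is scaled into a 16-bit value suitable for writing to a Modbus holding register. The function name and value ranges are illustrative assumptions, not taken from the actual NUILogix code.

```python
def gesture_to_register(value: float) -> int:
    """Clamp a normalized gesture reading (0.0-1.0) and scale it
    to the 0-65535 range of a 16-bit Modbus holding register."""
    clamped = max(0.0, min(1.0, value))
    return int(clamped * 0xFFFF)

# A hand halfway up the sensor's field of view maps to mid-scale:
print(gesture_to_register(0.5))   # 32767
print(gesture_to_register(1.2))   # clamped to 65535
```

In a real system, the resulting value would then be written over the network with a Modbus/TCP client library; the pure mapping function is shown here because it is the part specific to gesture input.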
A few applications of NUILogix have already been demonstrated. Brown used the middleware and a Leap to operate an OWI 535 robotic arm through controls that appear on screen. Using simple gestures, he could move the arm up and down and side to side, and even pick up balls and drop them into cups.
The gestures used to control the arm were not entirely intuitive, since they manipulated on-screen controls rather than the arm directly. Another of Brown's hacks, however, does use intuitive gestures to control a toy RC boat. Holding your hand above the Leap and tilting it down makes the boat speed up, as if you were pressing an invisible gas pedal; tilting the hand up decelerates, and tilting it right or left turns the boat in that direction.
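The boat mapping described above can be sketched in a few lines. This assumes the sensor reports hand pitch and roll in degrees; the function name, the 45-degree full-scale tilt, and the sign conventions are illustrative assumptions rather than details from Brown's implementation.

```python
def boat_command(pitch_deg: float, roll_deg: float,
                 max_tilt: float = 45.0) -> tuple[float, float]:
    """Map hand tilt to (throttle, steering), each clamped to -1.0..1.0.
    Palm tilted down (negative pitch) means more throttle, like pressing
    an invisible gas pedal; rolling right steers right."""
    throttle = max(-1.0, min(1.0, -pitch_deg / max_tilt))
    steering = max(-1.0, min(1.0, roll_deg / max_tilt))
    return throttle, steering

# Palm tilted fully down, level roll: full speed ahead.
print(boat_command(-45.0, 0.0))   # (1.0, 0.0)
# Level pitch, half roll to the right: gentle right turn.
print(boat_command(0.0, 22.5))    # (0.0, 0.5)
```

The clamping is what makes a gesture interface forgiving: any tilt beyond the full-scale angle simply saturates the command instead of producing out-of-range values.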
Gesture and touchless interfaces could have many uses, such as letting surgeons and nurses operate equipment without touching it and breaking sterility. People with disabilities could use them to control machines and hardware without having to make intricate, forceful or precise finger movements. The Leap has generated so much excitement that HP has announced it will integrate the technology into certain devices in the future.
NUILogix is being developed by Brown and others at the Human-Machine Technologies Organization as part of the Intel Perceptual Computing Challenge. The Leap Motion is currently taking pre-orders for its July 22nd release, and it will also be available exclusively at Best Buy starting July 28th, 2013.