This short blog post discusses a simple project that allows toy robot arms to be controlled using gestures. It had a serious purpose: it was intended to demonstrate latency.
However, it was also a bit of fun. There is immense pleasure in destroying Lego structures with a mere wave of the hand! There isn't a lot of documentation for this project here - this blog post is just to provide some ideas. To be honest it was quite straightforward: the components are documented in various places, but some coding is required, which is left as a mini exercise for the reader. Any questions, please ask in the comments section!
The whole thing is very similar to the robot arm demonstration at AWS re:Invent 2015 by Amazon's Deep Learning director, Dr. Matt Wood.
Building this project doesn't require much soldering, just a few header pins. Everything was off-the-shelf. The main components were the robot arms (from Amazon), a couple of Raspberry Pis (one for each arm, although a single Pi could be used in theory), a couple of Adafruit Servo HAT boards, a Leap Motion sensor, and a PC (e.g. a laptop).
The photo here shows the robot arms connected to the Pis; at the bottom of the photo you can see the Leap Motion sensor.
Check out the 30-second video here:
To install the Leap Motion software development kit (SDK), follow the instructions here: Virtual Reality, Leap Motion, and Controlling Things! - Getting Started Guide
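Once the SDK and its background service are running, hand data can be read from Node.js. As a rough idea of what that looks like, here is a minimal sketch assuming the leapjs npm package (npm install leapjs); the exact API may vary between SDK versions:

    // Minimal sketch: print the palm position of the first detected hand.
    var Leap = require('leapjs');

    Leap.loop(function (frame) {
      if (frame.hands.length > 0) {
        var pos = frame.hands[0].palmPosition; // [x, y, z] in millimetres
        console.log('palm at', pos);
      }
    });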
Next, decide how you want to control the thing. Use the code at Amazon's github page as a guideline: https://github.com/aws-samples/simplerobotservice
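One simple control scheme, for example, is to map the palm position reported by the sensor onto servo pulse widths. Here is a sketch of such a mapping; the ranges are hypothetical and would need tuning for a real arm:

    // Map a value from one range to another, clamped to the output range.
    function mapRange(value, inMin, inMax, outMin, outMax) {
      var t = (value - inMin) / (inMax - inMin);
      t = Math.min(Math.max(t, 0), 1);
      return outMin + t * (outMax - outMin);
    }

    // Hypothetical example: a palm height of 100-400 mm above the sensor
    // maps to a servo pulse of 1000-2000 microseconds (a typical range
    // for hobby servos).
    function palmHeightToPulse(palmY) {
      return mapRange(palmY, 100, 400, 1000, 2000);
    }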
Alternatively, the code could be developed in any of several languages. In my case I used JavaScript: I installed Node.js, used Socket.IO to communicate from the PC to the Raspberry Pi, and wrote more Node.js code on the Pi to drive the robot arm through the Servo HAT. My code isn't shown here, but it is a fun learning exercise to create it if you have a day to experiment - give it a try!
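That said, to give a feel for the overall structure, here is a rough sketch of both ends, assuming the leapjs, socket.io-client, socket.io, pca9685 and i2c-bus npm packages; the hostname, servo channel and ranges are placeholders, not the values from my build:

    // PC side: read palm positions from the Leap Motion and forward
    // them to the Raspberry Pi over Socket.IO.
    var Leap = require('leapjs');
    var io = require('socket.io-client');

    var socket = io('http://raspberrypi.local:3000'); // placeholder hostname

    Leap.loop(function (frame) {
      if (frame.hands.length > 0) {
        socket.emit('palm', frame.hands[0].palmPosition); // [x, y, z] in mm
      }
    });

    // Raspberry Pi side: receive palm positions and drive a servo on the
    // Adafruit Servo HAT (a PCA9685 PWM chip on the I2C bus).
    var i2cBus = require('i2c-bus');
    var Pca9685Driver = require('pca9685').Pca9685Driver;
    var io = require('socket.io')(3000);

    var pwm = new Pca9685Driver({
      i2c: i2cBus.openSync(1),
      address: 0x40,   // default Servo HAT I2C address
      frequency: 50    // 50 Hz is typical for hobby servos
    }, function (err) {
      if (err) { throw err; }
      io.on('connection', function (socket) {
        socket.on('palm', function (pos) {
          // Map palm height (y, in mm) to a pulse length in microseconds.
          var t = Math.min(Math.max((pos[1] - 100) / 300, 0), 1);
          pwm.setPulseLength(0, 1000 + t * 1000); // channel 0: one joint
        });
      });
    });

A more polished version would also want to rate-limit the messages and smooth the readings, since the sensor reports frames far faster than the servos can usefully move.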