Well, the time has come (passed, really) to post about our projects for the Path II Programmable training program. rscasny has been gracious enough not to strictly enforce the project end date, so thank you Randall!
I finished the training material a little later than expected, and between getting sick, work picking up before the holidays, and the holidays themselves (Happy New Year!), I had significantly less time to devote to the actual project than I had hoped. This will be my last formal Path II Programmable blog, but I am planning to add updates in the future as I make more progress. There is still much to be done, and I estimate it would take at least a couple more weeks of full-time dedication to get the project to where I wanted it to be.
PROJECT CONCEPT AND HARDWARE
My idea for this project was to pair the capability of the Ultra96-v2 board with some external hardware, in the form of a roving, mapping robot with computer vision. One piece of external hardware is a MakeBlock Ultimate Robot Kit (https://www.makeblock.com/project/ultimate-robot-kit) which I've had for a while. It has an Arduino-compatible controller and comes with a variety of peripherals which can be plugged into the controller with RJ-11 jacks. The peripherals include a Bluetooth module, an IR sensor module, a sonar distance sensor module, an addressable RGB LED strip, and several motors for drive and for articulation of an arm and claw. The bot is powered by a pack of AA batteries. The second piece of external hardware is a 2nd generation Kinect sensor for Xbox One. The main components of the Kinect sensor are an RGB camera, an IR projector array, and an IR camera. My plan for the project is to use the robot as a platform for the Ultra96 and the Kinect sensor, with data from the Kinect being streamed to the Ultra96 over USB and used to map the surrounding environment, detect objects, and help the robot navigate. One shortcoming of this hardware setup is that the Kinect sensor is not very portable. In order to interface over USB with something other than an Xbox (like a PC or an Ultra96 board), a power brick and a data converter brick are needed, which adds to the already bulky form factor of the Kinect sensor.
Source: https://support.xbox.com/en-US/xbox-one/accessories/kinect-adapter
The power brick plugs into AC power and provides a 12V supply. The data brick combines the 12V power with the data from USB into a proprietary plug, which then runs to the Kinect sensor. When combined with the power brick used for the Ultra96 board, this setup got clunky very quickly. Of course, it's never ideal having a robot tethered to AC outlets. This was just a temporary development setup. At some point, I would like to power the whole thing off a hobby/RC battery, with one or more regulators to supply the Ultra96, the Kinect, and the robot kit. There are some examples of people hacking the Kinect's proprietary cabling to eliminate the converter brick and supply power with a DC barrel jack. One of my future goals as I revisit the project is to address all this and clean up the hardware setup.
ROBOT PROGRAM
I started out by writing a program to run on the robot controller which waits for serial commands to control movement. The robot's Bluetooth module connects directly to the controller UART pins, which allowed the robot to be commanded from the Ultra96 over Bluetooth. I wrote a basic command set, including forward, reverse, left, and right, as well as commands for setting the motor speed and the delay the robot waits before processing a new command (i.e. how long the motor runs). I used the Arduino IDE and MakeBlock libraries, as this was the quickest way to get up and running.
ULTRA96-V2 BLUETOOTH
Getting the commands and Bluetooth working on the robot side was fairly easy. I was able to test sending commands from my computer and from a Bluetooth serial terminal on my phone. I ran into some trouble on the Ultra96 side when trying to configure the built-in Bluetooth and connect a serial port to it. I started with the factory Ultra96 PetaLinux image, but could not seem to get the Bluetooth configured and running. I was not the only person to have an issue with this, as someone else asked about it on the forum (https://www.element14.com/community/thread/73916/l/bluetooth-on-ultra96v2). A recommendation from lightcollector was to use the PYNQ 2.5 image, as it did not seem to have the same issue. I did get the built-in Bluetooth to come up and was even able to scan and detect the robot's Bluetooth; however, I have not successfully attached a serial port to communicate with the robot. At the time of this writing, I realized I may not have been setting up the serial port parameters correctly. I now remember the labs mentioning that the built-in Bluetooth uses a serial connection with hardware flow control (clear to send / request to send, CTS/RTS) instead of a basic UART. Currently, I don't have hardware access (I'm on a plane), so I can't go back and check. When neither of these approaches worked, I decided to try a Cinolink USB Bluetooth module I had used before with Raspberry Pi projects. This was essentially plug and play, except for attaching the serial port. I was indeed able to attach the serial port and send commands to the robot (see video below).
I sent commands from a terminal and was also able to use Python to send them, so there will be several options available once I get the built-in Bluetooth side working.
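For reference, here is roughly what the Python side looks like, using pyserial over an RFCOMM serial port. The port name, baud rate, and single-character commands below are placeholders of my own, not the exact protocol from my Arduino program:

```python
# Hypothetical sketch of commanding the robot over Bluetooth serial with
# pyserial. /dev/rfcomm0 assumes the robot's Bluetooth module has already
# been paired and bound to a serial port (e.g. with "rfcomm bind").
import time
import serial

bot = serial.Serial("/dev/rfcomm0", baudrate=115200, timeout=1)

def send(cmd):
    # One short ASCII command per line; the robot's loop parses these.
    bot.write((cmd + "\n").encode("ascii"))

send("s 120")  # placeholder: set motor speed
send("f")      # forward for the configured delay
time.sleep(1)
send("l")      # left turn
bot.close()
```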
PYNQ
I wasn't originally planning to use PYNQ, but since it was suggested as a fix for the Bluetooth issue, and it seems to be set up better for modifying the Zynq programmable logic, I decided to use PYNQ instead of PetaLinux. PYNQ loads an update to the programmable logic as an overlay, without having to rebuild the entire image. This can be seen in this tutorial: https://www.youtube.com/watch?v=LoLCtSzj9BU. I imagine this could be done with a PetaLinux image, but I couldn't find any examples of it being done. I was drawn to this because of the convenience. The amount of time consumed rebuilding a PetaLinux image with bitbake is not insignificant, and it seems excessive to go through those steps when you're not necessarily changing anything related to the OS or software.
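Loading an overlay from Python is only a couple of lines. A minimal sketch, assuming a bitstream and its matching hardware description (.hwh) file built for the Ultra96 have been copied to the board (the file path here is just an example):

```python
# Program the Zynq programmable logic at runtime with a PYNQ overlay --
# no PetaLinux/bitbake image rebuild required.
from pynq import Overlay

overlay = Overlay("/home/xilinx/overlays/my_design.bit")  # downloads the bitstream
print(overlay.ip_dict.keys())  # IP cores the new design exposes to Python
```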
Another plus of the PYNQ image is the availability of "apt-get", which helped with installing package dependencies for the Kinect sensor.
KINECT SETUP
There are some open-source projects for accessing data from Xbox Kinect sensors on non-Microsoft platforms. For the original Kinect, there is "libfreenect", and for the Kinect One, there is "libfreenect2". I was able to follow the setup here (https://github.com/OpenKinect/libfreenect2/blob/master/README.md#linux) somewhat successfully and run the available demos. I mostly followed the instructions for Ubuntu 16.04 or "Other" where applicable, and followed the Mali instructions for OpenCL. There may be some Zynq-specific setup still required, as the demos I tried did not perform very well.
As seen in the video of the demo, the data stream had quite a bit of lag. The frame rate appeared to be no more than about 1 Hz, and there was significant delay from the time of camera input to the corresponding update on the display. The default demo is a program called "Protonect". When it ran, along with the laggy stream, I received several debug/info messages tagged "[DepthPacketStreamParser]" in the Bash terminal indicating issues with the stream, including "X packets were lost", "skipping packet depth", "not all subsequences received", "subpacket too large", and "image data too short!".
Protonect terminal messages
I decided to try another library called "pylibfreenect2", which is a Python binding for "libfreenect2". Once installed, I tried running a Python demo similar to "Protonect". The frame rate seemed slightly improved, maybe 1-2 Hz max, but nothing that seemed very usable. When I get time to revisit the project, I will have to see if anything is missing from the Kinect library setup that may be needed specifically for the Ultra96-v2.
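For anyone curious, the Python demo boils down to something like the following, adapted from the pylibfreenect2 examples (the packet pipeline selection and any Ultra96-specific tuning are still open questions on my end):

```python
# Grab one synchronized set of frames from a Kinect One ("Kinect v2")
# using pylibfreenect2 and the library's default packet pipeline.
from pylibfreenect2 import Freenect2, SyncMultiFrameListener, FrameType

fn = Freenect2()
assert fn.enumerateDevices() > 0, "no Kinect device found"
device = fn.openDevice(fn.getDeviceSerialNumber(0))

# Listen for color, IR, and depth frames together.
listener = SyncMultiFrameListener(FrameType.Color | FrameType.Ir | FrameType.Depth)
device.setColorFrameListener(listener)
device.setIrAndDepthFrameListener(listener)
device.start()

frames = listener.waitForNewFrame()
depth = frames["depth"].asarray()   # 424x512 float32 array, in millimeters
print("depth frame:", depth.shape)
listener.release(frames)

device.stop()
device.close()
```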
FUTURE PROJECT TO-DOS
All in all, the project has not been as fruitful as I hoped, but that is mainly due to having less available time to devote to it than originally planned. I certainly want to continue working through this project in the future in order to learn more about robotics and computer vision.
- Clean up hardware setup - I don't want the robot to be tethered or contain so many power/data bricks
- Get built-in Bluetooth working - I'm sure this is possible, just haven't quite gotten there
- Work on implementing simultaneous localization and mapping (similar to the following: http://wiki.ros.org/gmapping) using the Kinect sensor - this will take some doing, as I've currently only gotten the demo to work, and its performance is sub-par at best (see the sketch after this list for a possible first step)
- Stream robot sensor data so it can be viewed in a web browser
- Try offloading some mapping algorithms to the programmable logic for hardware acceleration
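On the SLAM item, one likely first step (purely a sketch at this point, with the field of view and band height as assumptions I'd need to verify) is to collapse each Kinect depth frame into a 2D pseudo laser scan that a grid-mapping algorithm like gmapping could consume:

```python
# Hypothetical preprocessing for mapping: flatten a Kinect v2 depth frame
# (424x512, values in millimeters, 0 = no return) into a 2D "laser scan"
# by taking the nearest return in a band of rows around the image center.
import numpy as np

H_FOV = np.radians(70.6)  # nominal Kinect v2 depth horizontal field of view
BAND = 20                 # rows above/below center to include (a guess to tune)

def depth_to_scan(depth_mm):
    rows, cols = depth_mm.shape
    band = depth_mm[rows // 2 - BAND:rows // 2 + BAND, :].astype(np.float32)
    band[band <= 0.0] = np.inf            # ignore pixels with no depth return
    ranges_m = band.min(axis=0) / 1000.0  # nearest obstacle per column, meters
    angles = np.linspace(-H_FOV / 2, H_FOV / 2, cols)
    return angles, ranges_m
```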
So there we are. Thanks again to rscasny and the project sponsors for this opportunity! I learned a lot, and look forward to continuing more on this project in the future.