BeagleBone Green - Review

RoadTest: BeagleBone Green

Author: collie147

Creation date:

Evaluation Type: Development Boards & Tools

Did you receive all parts the manufacturer stated would be included in the package?: True

What other parts do you consider comparable to this product?: Raspberry Pi

What were the biggest problems encountered?: Out of date documentation

Detailed Review:

When I first plugged in the board, it booted and appeared as a mass storage device, so I opened the README.htm from it. Initially there were a few errors setting up the drivers from the installer on the mass storage device, so I went to the website (http://wiki.seeedstudio.com/BeagleBone_Green/) and downloaded the drivers there, which worked flawlessly.  I tried this on a couple of other computers (Windows 8.1 and a Surface Pro 5) and neither had the same driver issues, though this was after I had updated the image on the BeagleBone to the latest build.

I noticed that the LEDs are labelled differently (and incorrectly) in the README.htm but correctly (using the diode numbers) on the website; this was rectified after updating the OS to the latest build.

Less than 10 minutes to get started from unboxing (which could’ve been shorter if I hadn't had the driver issues).

I clicked on the address for the local BeagleBone, which opened http://192.168.7.2/bone101/Support/bone101/

One of the first interfaces described on the page utilises JavaScript, with a little window to execute code that writes to the onboard LEDs. Using this window on the page it was easy to change the onboard LED flashing pattern.  It doesn't feel the same unless you wire up your own LED to the headers and make it light up, though, so I connected an LED and a resistor to one of the GPIOs and very quickly had it turning on and off using the JavaScript example.

After that I gave the Cloud9 IDE (http://192.168.7.2:3000) a try.  It offers a number of features like a terminal, a debug window, the default JavaScript libraries and so on.

I’m not too familiar with node.js, particularly with using a delay/sleep function, so I opened the console (also accessible using PuTTY or another SSH client with the username debian and the password temppwd), started Python, imported the Adafruit_BBIO library and wrote a while loop to blink the LED really quickly.  Within 20 minutes of opening the box (with some fiddling with node.js and BoneScript) I had a flashing LED. Afterwards, while looking through the examples in Cloud9, I saw that there are a couple of LED blink node.js examples under the PocketBeagle examples.
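For reference, this is roughly the sort of loop I ran, a minimal sketch using Adafruit_BBIO (the pin name P9_14 is just an example; use whichever GPIO-capable header pin your LED is wired to):

import time
import Adafruit_BBIO.GPIO as GPIO

LED_PIN = "P9_14"  # example pin; change to wherever your LED is wired

GPIO.setup(LED_PIN, GPIO.OUT)
try:
    while True:
        GPIO.output(LED_PIN, GPIO.HIGH)  # LED on
        time.sleep(0.1)
        GPIO.output(LED_PIN, GPIO.LOW)   # LED off
        time.sleep(0.1)
finally:
    GPIO.cleanup()  # release the pin on Ctrl+C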

Now that I had the real basics done, I thought I’d update it to the latest kernel, which might sound daunting but couldn't be simpler.  The version of Debian on the shipped BeagleBone Green was from January 2018 (Debian 9.3, and I think the kernel was 4.9, though I really should’ve noted that down), so I downloaded the flasher image from http://beagleboard.org/latest-images. Being familiar with putting Raspberry Pi images on an SD card, I would normally unzip the .img file and use Win32DiskImager to write the image to the card, but the instructions (http://192.168.7.2/bone101/Support/BoneScript/updates/) recommended Etcher, so I thought I’d give that a try.

Etcher unzips, writes to the SD card and by default verifies the contents, so it takes a little longer than Win32DiskImager, but that's probably better in practice when you’re updating on-board memory and want to be sure it's right. Updating the eMMC took about 10 minutes (the onboard LEDs sweep like a Cylon/KITT from Knight Rider/a Larson scanner) and it rebooted just fine.

The next step was to give the BeagleBone network access so I could update it and install packages as necessary. If you’re wondering “Can I do this over USB sharing in Windows 10?”, the answer is yes, you can!

After running into a couple of issues, a quick search turned up this site, which worked a treat:

http://lanceme.blogspot.com/2013/06/windows-7-internet-sharing-for.html
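For anyone hitting the same wall, the fix boiled down to pointing the board's default route and DNS at the shared connection, something along these lines from the board's console (this assumes Windows presents the shared connection as 192.168.7.1, and the nameserver choice is just an example):

sudo /sbin/route add default gw 192.168.7.1
echo "nameserver 8.8.8.8" | sudo tee -a /etc/resolv.conf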

Considering I might want to use this as an IoT device in the future (node.js and a pre-existing web server make it a nice fit for interacting with hardware remotely), I wanted to add WiFi to the BeagleBone Green using a cheap USB WiFi adapter, and found instructions for setting it up here:

https://www.digikey.com/en/maker/blogs/2017/how-to-setup-wifi-on-the-beaglebone-black-wireless (though I needed to change the gateway from the USB shared network access from earlier), and this site shows how to make it permanent: https://elinux.org/BBBWiFiConfigs
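For anyone following along, making it permanent amounts to an entry in /etc/network/interfaces along these lines (the SSID and passphrase here are placeholders, and your adapter may come up as something other than wlan0):

auto wlan0
iface wlan0 inet dhcp
    wpa-ssid "YourNetworkName"
    wpa-psk "YourPassphrase"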

To avoid any accidental short circuits I thought I’d print a case for the BeagleBone Green: a quick search on Thingiverse (https://www.thingiverse.com/thing:1323538) and 2 hours later I had a decent case.

I had a 2.0” TFT lying around, and another Google search threw up a pre-existing Python library (https://github.com/firnsy/BHack_Python_ILI9225) that could be used to drive it.  There was an issue with the library that required changes to one of the .py files and a re-install (the dreaded mixing of tabs and spaces for indenting).  After messing around with it for a while with no success, I thought that SPI might have to be manually enabled (even though the /dev/spidev devices were visible), so I made a change to /boot/uEnv.txt to make sure that SPI was enabled at boot, adding the line:

uboot_overlay_addr4=/lib/firmware/BB-SPIDEV0-00A0.dtbo

Once that was done, it ran straight away.
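If you want to sanity-check the overlay before blaming the library, a minimal test (assuming the display sits on /dev/spidev0.0) is to open the bus with Adafruit_BBIO and clock out a dummy byte:

from Adafruit_BBIO.SPI import SPI

spi = SPI(0, 0)           # bus 0, device 0 -> /dev/spidev0.0 once the overlay loads
spi.msh = 1000000         # 1MHz clock is plenty for a quick test
print(spi.xfer2([0x00]))  # shifts out a dummy byte; fails loudly if SPI isn't there
spi.close()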

Now to program it to show some useful data (not just a random resized JPEG).

A while back I worked on a script for a Raspberry Pi to use it as a digital photo frame displaying photographs from Google Photos, but couldn’t get it to work for whatever reason, so I thought I'd give it another try on the BeagleBone. I had previously enabled Google Photos to display as a folder on my Google Drive, and found a Python library, PyDrive, that looked like it might be the ticket.  After setting up a Google Dev account and a project, I found out that the library uses OAuth 2.0 and requires a computer with a UI to log in first to generate the credentials. Once this was done I copied the credentials as a text file to the BeagleBone and ran it.  After a few more teething problems and a little exception handling (some of the photos are 8000+ pixels wide, and resizing and reorienting them for the small screen caused some out-of-memory errors) I had it working (somewhat) smoothly. It later became apparent that there is an API for Google Photos which would have made the whole thing simpler.
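The PyDrive side of it is only a few lines. A rough sketch of the list-and-fetch part (the folder ID is a placeholder for wherever your photos live, and the OAuth login has to be done on a machine with a browser first):

from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

gauth = GoogleAuth()
gauth.LocalWebserverAuth()  # opens a browser to log in; do this on a PC, then copy the saved credentials over
drive = GoogleDrive(gauth)

# 'FOLDER_ID' is a placeholder for the Drive ID of the Google Photos folder
photos = drive.ListFile({'q': "'FOLDER_ID' in parents and trashed=false"}).GetList()
for f in photos:
    f.GetContentFile(f['title'])  # download each photo for resizing and display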

All the scripts are here: https://github.com/Collie147/BeagleBoneDisplayILI9225

Finally, I thought I’d test how well the BeagleBone Green reads analog voltages.  I noticed that there are some Python libraries for utilising the BeagleBone’s Programmable Real-time Units (PRUs), so I thought maybe I could set this one up as a basic oscilloscope.  I have an old Kenwood CS4025 20MHz scope lying around that gets pulled out from time to time, so I could compare the two against each other and see how well the BeagleBone does.

I threw together a Python script to read the ADCs and draw a line on the screen.  Drawing the screen takes roughly 0.08 seconds, so reading and writing need to be done in chunks to get anything done.  I thought reading about 20 samples and then writing them to the screen might give me something useful. Timing each loop, the whole process took about 0.1 seconds, so that's a sample rate of about 200Hz and a screen refresh rate of 10 per second, making it somewhat usable.
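The core of the loop looked something like this (a simplified sketch, assuming the probe is on AIN0/P9_39; draw_trace stands in for the ILI9225 drawing routine, which is the slow part):

import Adafruit_BBIO.ADC as ADC

ADC.setup()
CHUNK = 20  # samples to gather between screen updates

def draw_trace(samples):
    # placeholder for the LCD drawing routine (~0.08s per refresh in practice)
    print(" ".join("%.2f" % s for s in samples))

while True:
    # each read returns 0.0-1.0 as a fraction of the 1.8V ADC reference
    samples = [ADC.read("P9_39") for _ in range(CHUNK)]
    draw_trace(samples)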

It doesn’t capture any samples while it draws the screen, so it's not perfect, but this is where the PRUs come in handy.  From what I’ve discovered, the PRUs are separate cores on the same chip: they have access to some of the pins and can carry on with their own functions while the main CPU looks after its own stuff, so they’re essentially independent processors.  So I thought that if I could get the PRUs to sample one of the ADC pins, buffer the samples and pull the information into my Python script, then I’m only displaying what the PRUs capture (within, say, 0.1 seconds) and the CPU's job is really just to update the screen.  While the screen is updating the PRUs are still capturing (at a much higher rate than is possible with the standard ADC read functions, as far as I can tell), so nothing is missed. Sounds simple(ish), right?
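The usual trick (and what the UIO-based libraries do under the hood, as far as I can tell) is to have the PRU firmware write samples into the PRU shared RAM and have Python mmap them out of /dev/mem. A sketch of the reading side only; this assumes a PRU program is already filling the buffer, needs root, and 0x4A310000 is the AM335x PRU shared RAM address per the TRM:

import mmap
import struct

PRU_SHARED_RAM = 0x4A310000  # AM335x PRU-ICSS shared RAM (12KB), page-aligned
BUF_WORDS = 256              # however many 16-bit samples the firmware buffers

with open("/dev/mem", "r+b") as f:
    mem = mmap.mmap(f.fileno(), 0x3000, offset=PRU_SHARED_RAM)
    samples = struct.unpack("<%dH" % BUF_WORDS, mem.read(BUF_WORDS * 2))
    mem.close()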

I had read about a project a while back that opened the PRUs up to Python and could possibly be used as an oscilloscope: https://github.com/pgmmpk/beaglebone_pru_adc. Thinking it would actually be that easy, I cloned it and set to work.  This turned out to be a red herring, as it was totally out of date. This was not going to be that easy! To find out more about the PRUs I checked the PRU page on Beagleboard.org.  The page is marked as still in development, but some of the links don’t work (the PRU programming examples) and the YouTube videos are out of date, although they do give some useful background information on how the PRUs work (they are long, by the way). Essentially the page and the videos haven’t been updated since late 2014, so they're out of date for the current version of Debian for the BeagleBone (kernel 4.14.71 as of the time of writing), as are all of the libraries they refer to (https://github.com/deepakkarki/pruspeak/issues/9).

After some googling and some (unsuccessful) attempts to compile the drivers, I had a look at the uEnv.txt file, and of course the PRU device tree is already enabled using the pru_rproc drivers, though there are also entries there for the pru_uio drivers (which are apparently the older ones).

The first test of the PRUs that actually worked I found here:

https://gist.github.com/jadonk/2ecf864e1b3f250bad82c0eae12b7b64

But this requires some C programming with chip instructions, which is then compiled for the TI PRU cores to read from the pin and output it somewhere.  I haven't programmed chips in native C in a long time, and even then it wasn't something I'd done more than a handful of times. There must be an easier solution.

Enter BeagleLogic.  This is a turnkey logic analyser solution: you can buy a cape or use the board as-is (with some voltage dividers on the pins to keep the voltage under 1.8V).  It’s not a scope, but it’s better than reading analog voltages in Python! Setup was simple enough (using the image provided), and although you can run captures over SSH, there is a web interface that provides a GUI to get started.  To cut down on time I decided to use the web interface to capture a basic square wave produced by an Arduino Nano I had sitting around. I set up a voltage divider of 1kΩ and 470Ω resistors, which at 5V gives just under 1.6V out, and set the Arduino to write to a PWM pin at various levels to see the effect it would have.
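For the record, the divider maths: Vout = Vin × R2 / (R1 + R2) = 5V × 470 / 1470 ≈ 1.6V, which keeps it safely under the 1.8V pin limit.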

Setting the value to 127 produced a capture of the PWM waveform at a 100kHz sample rate (the screenshot is a little small but you get the idea).

If I understand correctly that each vertical line is a sample, then it's about 102 samples per cycle, which at 100,000 samples per second works out to about 980Hz (100,000 ÷ 102 ≈ 980), which is about right for the Arduino's default PWM frequency (as far as I understand it).

To test whether BeagleLogic measured voltage, I fed in 3.3V instead of 5V, which the divider brings down to just over 1V.  This should have shown a drop of about a third, but the displayed level stayed the same, so I can only conclude that BeagleLogic is purely a logic analyser (highs and lows rather than analog voltages).  After a bit more research it turns out it can sample analog voltages, but not without a cape such as the PRUDAQ.

My own Python script, however, was not up to scratch.  The analog read function (whether that's down to the sample rate or the library, I'm not sure) isn’t able to keep up with the high duty cycle of the PWM, or with the square wave unless it's dropped to under 10Hz.

(Excuse the mess on my desk. I'd like to say it was a one-off, but it's always like that!)

To make it easily configurable, I wrote a quick sketch to output a square wave from the Arduino, nothing too fancy, just highs and lows, and checked it with both BeagleLogic and my scope.  Both registered about the same duty cycle, and it was easier to calculate (and calibrate) using BeagleLogic than my old scope, though there's something nice about seeing the output on a screen as it happens.

My tests were by no means exhaustive. I had planned on getting out my signal generator and trying that too, but because BeagleLogic outputs just logic levels, I thought it pointless as it wouldn't show the waveforms.

All in all, I spent a good bit of time trying to set up the PRUs (probably about 18 hours in total spread over a couple of weeks, although my searches concentrated on Python integration), going down one rabbit hole after another only to find things out of date, obsolete or no longer maintained and unwilling to compile.  As far as I could tell there isn't a whole lot of information about accessing the PRUs on the current kernel version, or even a user-friendly interface to access them (that I could find). In my humble opinion, the BeagleBone’s failing is that community support for the device isn’t (currently) as wide as it is for something like the Raspberry Pi, and even the official instructions are in desperate need of updating (the latest download image is over a year old).  I’ve dug deep into Raspbian, doing things like editing drivers and compiling from source (which for me is fairly hardcore, but to Linux aficionados is probably fairly standard stuff), and I've found there has always been a huge amount of community support for the Pi, for both software and hardware. Sure, there are the Google Groups for the BeagleBone, and there’s a fair bit on Stack Overflow and various blogs, but they tend to be out of date (some by quite a bit). That said, the most help I received was on the BeagleBone IRC channel (I hadn't used IRC in over a decade). Within about half an hour of posting, I received an answer and a link to a GitHub page for a Python library that utilised the UIO PRUs.  The examples ran fine, but setting it up to run a capture on a pin, write it to a buffer and drive the little LCD screen I had would have been a huge amount of effort, especially as I only had limited time to devote to this write-up and I wasn’t sure that Python could render an image of the sampled waveform legibly and smoothly.

I could see the PRUs being the killer feature that sets the BeagleBone apart from the Raspberry Pi (and others), but not without a beginner-friendly library for common languages (e.g. Python, node.js), a UI, or some wrapper of some description other than native C. I’m trying not to slate it too much, but how the PRU information page on the official site is nearly 5 years out of date is baffling.  I understand how hard it is to keep things updated and fresh, but the BeagleBone Green (and the BeagleBone Black on which it’s based) have been around a while and are mature enough that, personally, I think the PRUs should have a more beginner-friendly way of using them on an official BeagleBone image (I’m open to correction on anything stated here, I want to be proven wrong, particularly with regard to the PRUs).

Overall I found there were a lot of sources of good information and libraries available for the BeagleBone Green (libraries for the BeagleBone Black are largely compatible); I found a library for my LCD very quickly and a printable case with ease.  It's a fairly well-established platform with a decent-sized user base, so support should keep being maintained. The Cloud9 IDE is a decent platform to learn on too, so I could see this being used as a learning/teaching tool for all levels of coding, and as an intro to an SoC Linux dev board it’s a really good package that I definitely didn’t spend enough time on (I intend to explore it more when I can carve out some time).
