BeagleBone AI - Review


RoadTest: BeagleBone AI

Author: clickio

Creation date:

Evaluation Type: Development Boards & Tools

Did you receive all parts the manufacturer stated would be included in the package?: True

What other parts do you consider comparable to this product?: Raspberry Pi + Google Coral / Raspberry Pi + Intel Neural Compute Stick / NVIDIA Jetson Nano SBC

What were the biggest problems encountered?: Heating, and getting the latest versions of machine-learning libraries onto the board

Detailed Review:

Currently a work in progress:

Part 4: Machine-learning libraries on BBAI: unoptimised vs optimised

Part 5: Machine learning benchmarks

Part 6: Conclusions

Part 1: The zero-download install

I have worked with SBCs before, and I was intrigued, to say the least, by the claim that the BBAI could be ready to go in 5 minutes. I had to test that for myself and, I have to admit, it was.

 

Traditionally, SBCs require the user to download and burn an OS image to an SD card before booting the mini-computer. And then you have to start the SBC with the card, boot up the OS installer, configure the system, reboot, set up the connection, etc... it's not exactly a 5-minute job.

 

You could do the same with the BBAI if you wanted to (you have that option), but it is not necessary. Here are the steps for the BBAI's zero-download, zero-frustration setup:

 

Unboxing

You get it out of the box and admire it in all its glory.


 

Boot process

You boot it up just by plugging in a powered USB-C cable, from a PC/laptop, from a 5V power bank or from a 5V USB charger.

All of these options work perfectly (just be warned: none of them are delivered with your BBAI, so you need to source them separately or just use what you already have).

 

 

Physical connection

If you connect it to the PC, you need one of the following:

 

- a USB-C cable with male connectors at both ends, if you have a USB-C port on your computer/laptop (this is what I used)

- a USB-C male to USB-A male cable, if you only have USB-A (classic USB) ports on your computer/laptop

 

Upon connection it shows up as a USB drive, just like Adafruit's CircuitPython boards do.

 


On the drive, you have an HTML start page that gives you the basic information about how to connect to the BBAI, directly and immediately, without any other special requirement.

 

Wifi & Console access

Via a predefined IP address, you connect with your favourite web browser (I use Chrome) to its Cloud9 console/file-manager interface.

For that, you can use the already established USB-C connection, or you can connect to the BBAI's ALREADY CONFIGURED wireless access point in order to set up its WiFi connection to your local wireless network.

The instructions for setting up the BBAI's wireless connection are simple and available right in that window; you just have to fill in the name of your own network and copy-paste the rest!

 


 

You can also easily write a configuration file to save the BBAI's wireless connection info, again with copy-paste commands, so that you won't have to enter it again.
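As an illustration, on BeagleBone Debian images that use connman for networking, such a configuration file can be a small provisioning file like the minimal sketch below; the path /var/lib/connman/wifi.config and the key names follow connman's documented format, and the network name and passphrase are placeholders you would replace with your own:

[service_home_wifi]
Type = wifi
Name = YourNetworkName
Passphrase = YourNetworkPassword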

 

And... that's all! You're ready to go! Just 3 steps really.

 

Well, optionally you can run the traditional 'sudo apt-get update' command to refresh the package lists and its sibling 'sudo apt-get upgrade' to update the system software. That's what I did in the screenshots above.

 

And then you can start installing the packages required for the machine-learning projects you have in mind.

 

Part 2: The heating issues

 

...THIS THING IS HOT!

Not only is it hot as in 'hottest tech of the year', it also gets very hot with just the provided passive cooling, even in light usage. For example, after doing all of the above plus bringing the system software up to date, I couldn't keep my finger on the passive heatsink. Fortunately, Randall was kind enough to add a 5V F251R-05LLC brushless fan cooler to the package, but I had no long/tall M3 screws with me to secure it to the board, so I had to get some.

 

The next day, I tamed the flaming beast! With the fan attached to the heatsink (see the new close-up below), the temperature dropped from 64 degrees Celsius yesterday to 50-51 today, in low/light usage. The fan is connected to pins 1 and 5 (GND, black wire, and 5V, red wire) on the P9 header of the BeagleBone AI, and the command for reading the processor's temperature is 'cat /sys/devices/virtual/thermal/thermal_zone0/temp'.
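If you'd rather log that value from a script instead of calling 'cat' by hand, here is a minimal Python sketch reading the same sysfs file; the kernel reports the value in millidegrees Celsius, so it is divided by 1000:

# Read the SoC temperature from the same sysfs file used above
# and print it in degrees Celsius (the raw value is in millidegrees).
with open("/sys/devices/virtual/thermal/thermal_zone0/temp") as f:
    millidegrees = int(f.read().strip())

print("CPU temperature: {:.1f} degrees Celsius".format(millidegrees / 1000))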

 


 

Well, since active cooling looks like a permanent requirement on this board, I figured I needed to make the fan connection more stable (I will be carrying the board around with me) and also removable, if necessary. Since the BBAI has female 2.54 mm headers, I took a single row of male pins out of the spares stash, 'MacGyver-ed' a connector out of a piece of 3 pins (I pulled out the middle pin since it won't be used for this connection) and soldered the ends of the fan's cable to the two remaining pins. I taped it all together and behold the result:

 


 

You can get even lower temperatures while using it if you cool it from both sides, as you can see in my dedicated blog post about getting the lowest possible temperature on the BBAI with the resources I have. I'll revisit that post and this review when I find better options.

 

As a recap, what I did was place the BBAI on a laptop cooler, and the light-usage/idle results were:


1. 61,8 - 64,2 degrees Celsius with just the supplied heatsink, passive cooling

2. 50,6 - 53,4 degrees Celsius  with the 5V F251R-05LLC brushless fan cooler attached

3.  47,8 - 50,6 degrees Celsius  with both the 5V F251R-05LLC brushless fan cooler attached and the board placed on the laptop cooler.

 

Later update: there's an even better script, already installed, that helps you watch the temperature of all parts of the BBAI SoC; you just need to call it like this in the terminal:

watch /opt/scripts/device/x15/test_thermal.sh

Its output is more detailed and you can leave a terminal window open with this to keep an eye on it while you're doing stuff.


 

Enclosure update: thanks to a fellow community member who had a few spares (see his article here: BeagleBone AI (BB-AI) - Getting Started) and was very kind to send them over, my BBAI is now housed in a white plastic enclosure, which I'll try to use as a base for a dual active-cooling solution. Yes, I still haven't found shorter screws! :)

 


 

Cooling update:

 

I've finally achieved maximum cooling with this board; let me tell you how I did it. Now I really feel I can try the machine-learning features without any worry about overheating and/or unexpected halts caused by it.

 

Based on that same enclosure, I added another tall floor under the board, where I fitted a 12V fan powered independently from the board. I also added passive cooling to all the black chips on the top side (the RAM, the eMMC memory and the TPS659037 power-management chip) and to the one on the underside (the second RAM chip) with the help of some small microporous heatsinks, and now the board stays below 40 degrees in regular usage.

 

Here are the pix!


And the test results:

 

WITH underside cooling:


 

WITHOUT underside cooling (I took the cooler off the 'contraption'):

 


 

Now onto some real benchmarks!

 

Part 3: Market comparison of AI-on-the-edge products

 

If you struggle to do an apples-to-apples comparison, these are the AI SBC and accelerator-module options currently on the market:

 

BeagleBone AI - around 100 euro

Raspberry Pi (62,5 euro) + Google Coral (72 euro) - 134,5 euro in total

Raspberry Pi (62,5 euro) + Intel Neural Compute Stick (72 euro) - 134,5 euro in total

NVIDIA Jetson Nano SBC (115 euro)

 

As you can see in the table below, there are only two integrated options, and the BeagleBone AI has the lowest price of all four setups.

 

Comparison table

 

Setups compared: BeagleBone AI / Raspberry Pi 4 4GB + Google Coral / Raspberry Pi 4 4GB + Intel Neural Compute Stick / NVIDIA Jetson Nano

Main CPU
- BeagleBone AI: dual Arm Cortex-A15 @ 1.5 GHz, 2x dual Arm Cortex-M4 co-processors, 2x dual-core Programmable Real-Time Unit and Industrial Communication SubSystem (PRU-ICSS)
- Raspberry Pi 4 4GB (both Pi setups): Broadcom BCM2711, quad-core Cortex-A72 (ARM v8) 64-bit SoC @ 1.5 GHz
- NVIDIA Jetson Nano: quad-core ARM A57 @ 1.43 GHz

Memory
- BeagleBone AI: 1GB DDR3 SDRAM
- Raspberry Pi 4 4GB (both Pi setups): 4GB LPDDR4-3200 SDRAM
- NVIDIA Jetson Nano: 4GB 64-bit LPDDR4

GPU
- BeagleBone AI: dual-core PowerVR SGX544 3D GPU, Vivante GC320 2D GPU, 2D-graphics accelerator (BB2D) subsystem
- Raspberry Pi 4 4GB (both Pi setups): Broadcom VideoCore VI
- NVIDIA Jetson Nano: 128-core Maxwell

Storage
- BeagleBone AI: microSD, 16GB eMMC (with Debian and the Cloud9 IDE pre-installed), USB drives
- Raspberry Pi 4 4GB (both Pi setups): microSD, USB drives
- NVIDIA Jetson Nano: microSD, USB drives, M.2 drives

USB
- BeagleBone AI: 1x USB 2.0 Type-A, 1x USB 3.0 Type-C
- Raspberry Pi 4 4GB (both Pi setups): 2x USB 2.0 Type-A, 2x USB 3.0 Type-A
- NVIDIA Jetson Nano: 4x USB 3.0 Type-A, 1x USB 2.0 Micro-B

GPIO pin count
- BeagleBone AI: 46
- Raspberry Pi 4 4GB (both Pi setups): 40
- NVIDIA Jetson Nano: 40

GPIO pin types
- BeagleBone AI: analog and digital
- Raspberry Pi 4 4GB (both Pi setups): digital only
- NVIDIA Jetson Nano: digital only

Connectivity
- BeagleBone AI: I2C, SPI, CAN bus, UART
- Raspberry Pi 4 4GB (both Pi setups): I2C, I2S, SPI, UART
- NVIDIA Jetson Nano: I2C, I2S, SPI, UART, M.2

Video ports
- BeagleBone AI: 1x uHDMI
- Raspberry Pi 4 4GB (both Pi setups): 2x uHDMI, 2-lane MIPI DSI display port, 2-lane MIPI CSI camera port, 4-pole stereo audio and composite video port
- NVIDIA Jetson Nano: 1x HDMI, 1x DisplayPort, 2x MIPI CSI-2 DPHY lanes

Video output
- BeagleBone AI: HDMI 1.4a / DVI 1.0 1080p60; 4K @ 15 fps H.264 video encode/decode, others @ 1080p60
- Raspberry Pi 4 4GB (both Pi setups): 4K @ 60 fps; H.265 (4Kp60 decode), H.264 (1080p60 decode, 1080p30 encode)
- NVIDIA Jetson Nano: encoder 4K @ 30 | 4x 1080p @ 30 | 9x 720p @ 30 (H.264/H.265); decoder 4K @ 60 | 2x 4K @ 30 | 8x 1080p @ 30 | 18x 720p @ 30 (H.264/H.265)

Ethernet
- All four setups: Gigabit Ethernet

Wireless
- BeagleBone AI: 2.4 GHz and 5 GHz IEEE 802.11 b/g/n/ac WiFi, Bluetooth 4.2
- Raspberry Pi 4 4GB (both Pi setups): 2.4 GHz and 5.0 GHz IEEE 802.11 b/g/n/ac, Bluetooth 5.0, BLE
- NVIDIA Jetson Nano: none

Internal AI hardware
- BeagleBone AI: 4x Embedded Vision Engines (EVEs), 2x C66x floating-point VLIW DSPs
- Raspberry Pi 4 4GB (both Pi setups): none
- NVIDIA Jetson Nano: 128 Maxwell cores

External AI hardware
- BeagleBone AI: none
- Raspberry Pi 4 4GB + Google Coral: Google Edge TPU ML accelerator coprocessor; Arm 32-bit Cortex-M0+ microprocessor (MCU) up to 32 MHz, 16 KB flash memory with ECC, 2 KB RAM
- Raspberry Pi 4 4GB + Intel Neural Compute Stick: Intel Movidius Myriad X Vision Processing Unit, including 16 SHAVE VLIW DSP programmable processors
- NVIDIA Jetson Nano: none

Supported ML frameworks
- BeagleBone AI: optimized: PyTorch, Caffe*, TensorFlow* (*only certain models are supported; there are constraints on supported layers that must be met); un-optimized: OpenCV and alternative Python packages
- Raspberry Pi 4 4GB + Google Coral: TensorFlow Lite
- Raspberry Pi 4 4GB + Intel Neural Compute Stick: TensorFlow*, Caffe*, Apache MXNet*, Open Neural Network Exchange (ONNX*), PyTorch* and PaddlePaddle* via an ONNX conversion
- NVIDIA Jetson Nano: TensorFlow, PyTorch, MXNet, Keras, Caffe, PaddlePaddle, OpenCV and alternative Python packages

OS
- BeagleBone AI: Debian Linux distribution
- Raspberry Pi 4 4GB (both Pi setups): Debian Linux distribution
- NVIDIA Jetson Nano: Ubuntu variation (L4T)

Total price (€)
- BeagleBone AI: 100
- Raspberry Pi 4 4GB + Google Coral: Raspberry Pi (62,5) + Google Coral (72) + 16GB microSD card (10) = 144,5
- Raspberry Pi 4 4GB + Intel Neural Compute Stick: Raspberry Pi (62,5) + Intel Neural Compute Stick (72) + 16GB microSD card (10) = 144,5
- NVIDIA Jetson Nano: board (115) + 16GB microSD card (10) = 125

 

Market comparison comments

 

Now, by looking closely at this table, you can see that each of these setups has its own advantages and disadvantages.

 

For example, the BBAI is the only SBC in this set that has analog input pins, which means it can be directly connected to analog sensors, while the others need supplementary hardware to achieve that (you basically need an ADC - see here) or have to use digital sensors. It also comes with the OS, some of the ML libraries and the Cloud9 IDE pre-installed on the 16 GB eMMC memory, which removes the need for a microSD card (although, for special purposes, you can add one; it has a dedicated slot for it, just like all the other boards in the table above). The zero-install setup can help with quicker, unsupervised deployment.
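As an illustration of those analog pins, here is a minimal Python sketch using the Adafruit_BBIO library (the one commonly used on BeagleBone boards); treat the package and the pin name 'P9_33' as assumptions to verify against your own BBAI image and pin-out before relying on them:

# Hypothetical example: read an analog sensor on the board's ADC pins.
# Assumes the Adafruit_BBIO package is installed (pip3 install Adafruit_BBIO)
# and that your image exposes the ADC; the pin name is only an illustration.
import Adafruit_BBIO.ADC as ADC

ADC.setup()                    # initialise the ADC subsystem
value = ADC.read("P9_33")      # returns a value between 0.0 and 1.0
print("Analog reading: {:.3f} of full scale".format(value))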

 

The rest of the bunch (the Raspberry Pi with either USB accelerator, and the NVIDIA Jetson Nano) are a little bit snappier in desktop usage, so if your intended use involves a lot of desktop work on the SBC itself, especially by non-technical users who need GUIs, this is an area to be aware of. If you only connect to the BBAI remotely via WiFi and command it from the command line, it works just fine. Just don't try compiling any large package on the SBC itself; in my experience this hasn't worked well, it's a slow process and it can get stuck. Get pre-compiled packages or build them on more powerful hardware, like a desktop computer.

 

If you really need 4K/1080p video decoding/encoding in your use case, that is, if you really can't settle for a lower resolution in a production environment, the BBAI has a slight disadvantage, since it's a bit slower at this than the other three setups, but:

 

1. it's a rare case/scenario

2. you can use a camera that does the encoding itself, relieving the CPU of this burden.

 

I'm not sure whether there are video decoding/encoding packages that make use of the BBAI's dedicated hardware; using those could level up this area.

 

There has been some discussion online about the BBAI and its SoC not using the latest processor models available, but my guess, all things considered, is that the main goal was making AI tech more accessible, and it achieved that by being an all-in-one device with the lowest price so far among comparable options.

 

Last but not least, you have to check whether your desired machine-learning models and frameworks are fully compatible with the device you decide upon, and this is an area I want to focus on to complete this RoadTest review. At first glance, it may look like two of the more expensive setups (NVIDIA Jetson and RPi 4 + Intel Neural Compute Stick) have an advantage by offering the most options to choose from, BUT:

 

1. it really all depends on your use case

2. you need to see them in action to decide how well they're effectively supported and optimized on the specific AI hardware.

 

Part 4: Machine-learning libraries on BBAI: unoptimised vs optimised

 

There are many machine-learning libraries, especially for what is called 'computer vision' (CV). I can't define CV better than the Wikipedians:

Computer vision is an interdisciplinary scientific field that deals with how computers can be made to gain high-level understanding from digital images or videos. From the perspective of engineering, it seeks to automate tasks that the human visual system can do (https://en.wikipedia.org/wiki/Computer_vision)

You can basically use them on any operating system for which compiled binaries are available. The main programming language used in machine learning is Python (www.python.org), which is pretty easy to learn, with many free resources available. Some libraries use other languages like C++, but we'll focus on the Python ones in this RoadTest review.

 

One of the main advantages of the open-source Python language is its modular structure, with a huge number of libraries that can dramatically reduce the workload needed for a project. You can search for libraries (the main repository is the Python Package Index, PyPI), study their documentation and usage, install them (with 'pip install package_name'), and you're ready to use them in your code, where you only need to add the specific details of your project and use the packages you selected to build and run it. There are over 9000 packages referring to 'computer vision' on pypi.org as I write this, and in the future there will probably be more (this count includes outdated packages that aren't used anymore, since PyPI also functions as an archive, but you can sort by relevance, date last updated or trending/downloads).
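To make that workflow concrete, here is a minimal sketch of the install-then-use cycle with OpenCV; 'opencv-python' is the usual PyPI distribution of OpenCV, and 'test.jpg' is a placeholder for an image of your own:

# Install the library once from PyPI (in the terminal, not in Python):
#   pip3 install opencv-python
import cv2

# Load a local image (placeholder file name), convert it to grayscale
# and detect edges - three lines of actual 'computer vision' work.
image = cv2.imread("test.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 100, 200)
cv2.imwrite("test_edges.jpg", edges)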

 

As I've mentioned in the comparison table, some of the most popular modern machine-learning frameworks that support the Python language are PyTorch, Caffe, TensorFlow, OpenCV and scikit-learn. Each piece of dedicated machine-learning hardware in our comparison is optimised for some of these, that is, it has the libraries and hooks needed to run that framework's ML workload on the dedicated hardware's architecture. It's similar to the operating-system concept, or to a dedicated set of drivers, but for the ML part. And this is where things get complicated, since these dedicated ML hardware architectures aren't as simple as, say, a sensor for which you'd only need to build a dedicated Python library, and the ML operations running behind the scenes are pretty complicated themselves.
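To illustrate what 'un-optimised' means in practice, here is a minimal Python sketch that runs a classification network purely on the ARM cores using OpenCV's dnn module; the model and image file names are placeholders for files you would download separately, and an 'optimised' run would instead go through TI's TIDL tooling so the same work lands on the EVEs/DSPs:

# Un-optimised inference sketch: everything below runs on the Cortex-A15 cores.
# 'model.prototxt' and 'model.caffemodel' are placeholders for a real Caffe
# classification model; 'test.jpg' is a placeholder input image.
import cv2
import numpy as np

net = cv2.dnn.readNetFromCaffe("model.prototxt", "model.caffemodel")

image = cv2.imread("test.jpg")
# 224x224 input and ImageNet-style mean subtraction, typical for such models
blob = cv2.dnn.blobFromImage(image, 1.0, (224, 224), (104, 117, 123))
net.setInput(blob)
scores = net.forward()        # CPU-only forward pass
print("Top class index:", int(np.argmax(scores)))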

 

 

 

REVIEW STILL BEING UPDATED - STAY TUNED, SUBSCRIBE AND ASK ANYTHING YOU WANT TO KNOW IN THE COMMENTS!

I've started documenting the journey in a little series of dedicated blog posts:

 

1. Adventures in electronics, linux and data science with the Beaglebone AI (Roadtest review side blog)

 

2. Lowest idle cpu temperature so far on the BeagleBone AI SBC! how-to and pictures inside
