Hello element14 World
Hello, my name is Jan, but I am also known by my alter ego "Basement Engineering". I do a lot of electronics-related projects and love to share my ideas, especially with young people. That's why once a year I pack up all of my favorite projects and head to the Maker Faire in Hannover, Germany, where I present them to thousands of people and let them try out my inventions. This year someone from element14 (unfortunately I forgot his name) approached me and handed me an MT3620 starter kit. I like new tech and cool packaging, especially if it's free, but I wasn't really sure if I would be able to get it working. I usually work with very simple hardware à la Arduino and Raspberry Pi. Then he told me about the Sensing the World Challenge and all of a sudden I was eager to try it. Well, fast forward a couple of months and here we are: I'm writing my first blog post on the element14 community about an interesting cloud project that I actually got to work. I usually post my projects on Instructables and YouTube, so this blog format is kind of new to me, but I hope I can still give you a good insight into the project and spread some inspiration.
Anyways, let's hop into it.
What we are going to talk about today
The project that I am going to tell you about today has been a long-running student project. The aim is to build a miniaturized monitoring boat to monitor blue-green algae, aka cyanobacteria. This has been a group project at our university, initiated and built by a group of 5 students, including me. The project was officially finished in September, with a working boat (more on that below). However, there was still a major component that hadn't been finished: the measuring assembly. Therefore my colleague Ammar Sadat and I initiated a secondary mini project that aims to get the control electronics of the measuring assembly working, while also getting to know the new Azure Sphere hardware.
In this blog post I am mainly going to focus on my part of the project. I'm going to refer to the boat as the "parent project" and tell you a little bit about it to get you up to speed. Then we are going to dive into today's project and take a look at the hardware and software. Furthermore, I am going to give you an insight into my journey with the Azure Sphere.
This blog post is not meant to be a tutorial on how to use the Azure Sphere, as I have only scratched the surface of the Azure ecosystem and am a novice in regards to it. The goal of this blog post is to tell you about an interesting project that involves an Azure Sphere MCU. And, let's be honest, I am also interested in taking part in part two of the "Sensing the World Challenge", therefore this is my contest entry.
For the sake of reading enjoyment, I am going to write in a more informal style.
Here is a table of contents to give you a more structured overview of this post.
The Parent Project
Our Little Boat
Let's first take a look at the boat from the headline. In the picture below you can see our big yellow catamaran, the "Kenterprise" (kentern is German for capsizing, that's how much we believed in its floating capabilities. Luckily it worked out). The Kenterprise is actually not that big, with a length of only 1 m and a width of 80 cm. It is made of Styrofoam, wood, fiberglass and a lot of sweat by 4 of my university colleagues and me. It is meant to monitor the nasty stuff two pictures below: cyanobacteria. The boat is mostly done: it features two electric motors, a big lithium-ion battery made from 40 18650 cells, and can be controlled with a remote. It is basically a supersized RC boat that can carry a fairly heavy payload. This payload is the measuring assembly, which is the focus of this project and not yet functional.
Cyanobacteria
Cyanobacteria are a little bit like most of us: they love sunlight and warm waters. Unfortunately, they are not that good at coexisting with the local ecosystem. They often appear on shallow lakes and form an algae slick that gives off an unpleasant smell. The smell is not the only thing that's bad about them; they also produce toxins that are released into the water. Those toxins can cause diarrhea, skin rashes and respiratory diseases in humans. For animals it is even worse: smaller animals, such as dogs, that drink the contaminated water can die from the toxins. Once the cyano-colonies themselves die, the decomposition of the biomass causes the water's oxygen levels to plummet, which can kill a lot of fish.
I guess we can all agree that it might be useful to know if there are cyanobacteria building up on the water and what the current state of the ecosystem actually looks like. That's why we set out to develop a system that would enable just that.
The Monitoring Idea
The basic idea of the system is to have a vehicle, the Kenterprise, that transports a measuring assembly to multiple measuring points on the water surface. Once a measurement has been taken, the results are uploaded to a cloud application, where they are accumulated and saved to a database. In the final step the data is visualized and presented to the public in some kind of web application.
The Measuring Assembly
The measuring assembly is the heart of the operation. It consists of 3 smart sensors that measure dissolved oxygen, pH, temperature and the cyano-concentration (by utilizing the bacteria's fluorescent properties). The sensors run on 12 V and offer a Modbus RTU interface. They are each mounted inside a so-called "Flow Cell" where the water passes them. Before the water enters the Flow Cells it has to go through a Debubbler. In there, the sediment can sink to the bottom, while any air bubbles can move upwards and leave the system through an extra air line. A secondary pump then proceeds to move the water through the 3 Flow Cells, which are connected in series. The Flow Cells, the Debubbler and the pumps are mounted inside of a wooden frame. By the way, because of our limited project budget, we developed and 3D printed our own Flow Cells and Debubbler, which worked out quite well, but they are not quite watertight yet. So far we still haven't gotten to the actual project of this blog, but bear with me, we are almost there. On the back of the frame is a gray box that contains all of the electronics. These are not yet working, mostly because we had issues with broken RS485 modules and because we were unable to finish the code in time for the big boat project. Let's take a closer look at the electronics.
A Closer Look At The Electronics
Finally, something electronics-related: the control electronics of the measuring assembly. Previously, the control electronics consisted of an Arduino Nano that was connected to a bunch of MOSFETs to control the pumps, a MAX485 UART-to-RS485 converter that allows it to talk to the sensors, and a bunch of modules to get the time and the GPS position and write it to an SD card. We called this assembly the Data-Logger, although it also controls the pumps.
What's new in the overview schematic is the Azure Sphere MT3620 MCU. It is loosely coupled to the Arduino and is supposed to connect the Data-Logger to the Microsoft Azure cloud. That is why I also refer to it as the "Cloud-Link". The Cloud-Link doesn't do too much work, considering its performance capabilities. It could, in theory, replace the Arduino completely. However, as I was not sure if I would be able to make it work at all, I decided to loosely couple it to the existing system via UART.
Today's Project
So this project aims to get the control electronics up and running and send sensor data to the cloud. To be more specific, it consists of two parts:
- Implementing the Data-Logger
  - Getting Modbus working
  - Writing a proper firmware with a simple state machine
- Implementing the Cloud-Link
  - Getting started with the MT3620 kit
  - Writing a C application to send data from the UART to the cloud
  - Creating a cloud application
Furthermore, I wanted to find out if an Azure Sphere microcontroller is the right choice for such a "one of a kind" project or if another solution would suit it better.
We'll take a look at that in the conclusion.
As I mentioned in the beginning, I worked together with my colleague Ammar. He worked on getting the hardware of the Data-Logger working and developed the state diagram for it. I, on the other hand, focused on the Cloud-Link and wrote the firmware for the Data-Logger. I am mainly going to focus on my part of the project, especially on the Cloud-Link.
The Data-Logger
The Data-Logger's state machine is based on the measuring assembly's pumping scheme (left line diagram). Let me give you a quick rundown of the pumping action. Once the measuring assembly receives a trigger signal [1], it starts filling the Debubbler with water from the lake by turning on pump 1. Based on timers, it decides whether or not the Debubbler is full. [2] Once it is full, pump 2 is turned on to fill the Flow Cells. Note that pump 1 keeps running for some time. That is due to the fact that the Debubbler's maximum volume is smaller than the combined volume of the 3 Flow Cells, therefore we need to keep refilling the Debubbler for a little bit to prevent the assembly from running dry. After we have enough water [3], pump 1 is finally shut off and pump 2 keeps running until the water in each Flow Cell has been replaced with the new sample. [4] Then the sensor measurements are triggered. In this step the Arduino also retrieves the GPS position, as well as the current time, from the GPS module. Even if there are any sensor errors, the system returns to the idling state [0] and waits for the next trigger.
This state diagram has been compiled into a big switch statement, which can be seen here. I've got to admit, this is not my nicest piece of code, but it serves its purpose of managing the 5 available states. The source code can also be found in the attachments. It was written using the Atom IDE with the PlatformIO plugin installed. That is the reason you will find a main.cpp instead of the usual .ino file.
void updateStateMachine(){
  switch (currentState){

    case IDLING:
      if(triggerDetected()){
        pumpManager.fillDebubbler(true);
        currentState = FILLING_DEBUBBLER;
        #ifdef DEBUG
        Serial.println("NEW STATE: FILLING_DEBUBBLER");
        #endif
      }
      break;

    case FILLING_DEBUBBLER:
      if(pumpManager.debubblerFull()){
        pumpManager.fillFlowCells(true);
        currentState = FULLY_PUMPING;
        #ifdef DEBUG
        Serial.println("NEW STATE: FULLY_PUMPING");
        #endif
      }
      break;

    case FULLY_PUMPING:
      if(pumpManager.enoughWater()){
        pumpManager.fillDebubbler(false);
        currentState = EMPTYING_DEBUBBLER;
        #ifdef DEBUG
        Serial.println("NEW STATE: EMPTYING_DEBUBBLER");
        #endif
      }
      break;

    case EMPTYING_DEBUBBLER:
      if(pumpManager.flowCellsFull()){
        pumpManager.fillFlowCells(false);
        currentState = MEASURING;
        #ifdef DEBUG
        Serial.println("NEW STATE: MEASURING");
        #endif
      }
      break;

    case MEASURING: {
      bool gpsWorking = updateGps() && getGpsData();
      bool sensorWorking = getSensorData();
      if(gpsWorking){
        #ifdef DEBUG
        Serial.println("measurements successful");
        #endif
        updateCloudLink();
        //sdManager.logData(ts,lat,lon,temp,ph);
      }
      else{
        #ifdef DEBUG
        Serial.println("ERROR: sensor or gps not working");
        #endif
      }
      digitalWrite(LED_PIN,LOW);
      currentState = IDLING;
      #ifdef DEBUG
      Serial.println("NEW STATE: IDLING");
      #endif
      break;
    }
  }
}
A Lesson On Hardware
As you can see in the MEASURING state above, the call to sdManager.logData(...) is commented out. During our testing and debugging, we noticed that the 2 kB of RAM that the Arduino Nano offers is not enough to do a lot of string-based data formatting, put out debugging messages and manage an SD card at the same time. Therefore we ditched the SD logging, as the SD library alone requires a 512-byte block buffer. So here is a pro tip for all of the microcontroller newcomers out there: plan for enough resources if your MCU has to handle multiple tasks and keep an eye on your compiler's RAM usage estimation.
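The compiler's estimate only covers static allocation, so if you want to see how much RAM is actually left while the sketch is running, a small helper like the one below can help. This is just a minimal sketch for AVR-based boards like the Nano (the __heap_start and __brkval symbols come from avr-libc), not something taken from our project code.

// Rough free-RAM check for AVR-based Arduinos (e.g. the Nano).
// It measures the gap between the top of the heap and the current stack pointer.
extern int __heap_start, *__brkval;

int freeRam() {
  int stackTop; // the address of this local variable approximates the stack pointer
  return (int)&stackTop - (__brkval == 0 ? (int)&__heap_start : (int)__brkval);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.print(F("Free RAM: ")); // F() keeps the string in flash instead of RAM
  Serial.println(freeRam());     // call this in the spots you suspect are memory-hungry
  delay(1000);
}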
The hardware, mainly the RS485 interface module, caused us a lot of headaches; that's why I'm very thankful that Ammar got it working. A pro tip when debugging an RS485 module or anything UART related: put LEDs onto the RX and TX lines, so you can see when a message gets sent. This way you can quickly determine whether you have to swap your RX and TX or if something is completely broken.
Output
The Data-Logger accumulates all of the data from the sensors as well as the GPS module. It then converts it into .csv to save it to the SD card and into a JSON string that is output over UART. This string already has the proper format for an Azure IoT Hub application and looks like this:
{"deviceId":"Measuring Assembly", "ts" :1574204250,"lon":-42.424242,"lat":42.424242,"temp":22.34,"ph": 12.23}
It is generated by the printJsonToStream(...) function. The print function is called repeatedly to prevent any buffers from getting too big and breaking strings. It worked for us, but it is a pretty dirty workaround for our lack of RAM.
void printJsonToStream(Stream& stream){ String comma(","); stream.print( String("{") ); stream.print( "\"deviceId\": \"Measuring Assembly\"" + comma ); stream.print( "\"ts\":" + String(ts) + comma ); stream.print( "\"lon\":" + String(lat,6) + comma ); stream.print( "\"lat\":" + String(lon,6) + comma ); stream.print( "\"temp\":"+ String(temp,2)+ comma ); stream.print( "\"ph\":" + String(ph,2) ); stream.print( String("}") ); }
The Cloud-Link
All right, now we know how the Data-Logger works. Let's take a look at the other side of the UART connection, the Cloud-Link. As mentioned earlier, the Cloud-Link is the actual Azure Sphere MCU. It is loosely coupled to the rest of the system and receives the data as formatted JSON strings, so it doesn't have to do much more than read the string from the UART and send it to an Azure IoT Central application.
Although the task is pretty simple, I found it particularly tricky to get started with the Azure Sphere. On my journey from a total Azure Sphere beginner to a total Azure Sphere beginner who has implemented one project with an MT3620, I learned a thing or two. I'd like to use this section to briefly tell you about the steps I took to get the system up and running. I'm not going to go into too much detail, as there are other sources online that are much better at explaining how to get started. However, I will try to save you some potential hassle with your Azure Sphere implementations by mentioning my findings.
Pre-"Hello World"
Getting started with an Azure Sphere requires a couple of extra steps compared to other MCUs such as an Arduino. Well, to be fair, you can't easily manage thousands of Arduinos, while you can in fact easily manage thousands of Azure Sphere MCUs using the Azure cloud. This awesome capability comes at the price of some overhead and a few hurdles for first-time users.
Let's first of all talk about the software. To program the board you are going to need Visual Studio 2017, which you can get here. You can also use Visual Studio 2019; however, I ran into a bug where the assistant for creating a connected application wasn't working in the latest version, hence I recommend using VS 2017. After you have installed that, you need to install the Azure Sphere SDK (downloadable here). You might also need to install a USB driver.
On the software side you are now good to go. Next you have to claim your device. The claiming process attaches the device to a tenant, through which users can manage it. I found it a little bit tricky to wrap my head around the tenant concept, but Adam Saxton does a great job at explaining it here. Once this bond is formed it is irreversible, so be sure to connect your board to the right tenant.
The tenant exists inside an Azure cloud organization. Neither I nor my university had an Azure cloud account, therefore I had to get myself one. Microsoft offers a free account here. Creating it requires a credit card but doesn't cost you anything as long as you don't use too many resources. You also get a bunch of free services as well as some starting credit to experiment with different resources. The first thing I did was use Azure Active Directory, which is Microsoft's user management tool, to create a secondary account and assign it the rights to develop applications (such as Azure Sphere MCU code).
After the account creation you can head back to the Azure Sphere Developer Command Prompt on your PC and follow Microsoft's guide to claim your board. After the claiming is done, you set up a WiFi connection, put the board into debugging mode and install the latest OS update by following Microsoft's guides. With all of that done, you should be able to upload your first application.
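For reference, the whole sequence boils down to a handful of commands in the Azure Sphere Developer Command Prompt, roughly like the ones below. Take the exact names with a grain of salt: Microsoft has renamed some of these commands and parameters between SDK releases, and the SSID and password are obviously just placeholders, so treat the official guides as the authoritative source.

REM log in with your Azure account
azsphere login
REM claim the board (irreversible: it binds the device to the currently selected tenant)
azsphere device claim
REM store the WiFi credentials on the board and check the connection
azsphere device wifi add --ssid "MyNetwork" --key "MyPassword"
azsphere device wifi show-status
REM enable sideloading/debugging (newer SDKs call this "enable-development")
azsphere device prep-debug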
Once the board is set up and connected, you can also use the Contest Registration App and ping the Microsoft server. I faced an issue where the app would tell me that my WiFi was not configured. After searching through some forums, I was able to resolve it by adding a delay after the restart command in the script file "sk_check.bat":
azsphere device restart
echo.
TIMEOUT /T 5
Getting the Examples Running
For me the next step was to get an example application up and running, preferably one that shows me how to use the UART. To properly set up an example application, I installed the GitHub plugin for Visual Studio. Through this plugin I then downloaded the Azure Sphere examples from this repo. After I let my LEDs blink for a while and saw some UART examples in the debug console, I was happy and went to bed.
The next day I looked for a more advanced guide to follow and found Brian Willess's great demo project "Avnet's Azure Sphere Starter-Kit (Out of Box Demo)", a 3-part guide that made getting started with the Avnet development board a whole lot easier. The guide walked me through running some demo code, connecting to the cloud and building an IoT Hub as well as an IoT Central application.
My Actual Code
For my application code I only wrote a couple of lines myself. The rest of the firmware mainly consists of Brian Willess's demo code, which can be found in this repository. I simply threw out most of the stuff I didn't need and pieced in some code from the UART example.
Below you can see the only bit of code that I had to write, my own UART event handler.
static void UartEventHandler(EventData *eventData)
{
    size_t bufferSize = 256;
    uint8_t receiveBuffer[bufferSize + 1];
    ssize_t newBytes = read(uartFd, receiveBuffer, bufferSize);

    bool validRead = (newBytes > 0);
    bool error = (newBytes < 0);
    bool enoughSpace = !((bytesRead + newBytes) > RECEIVE_BUFFER_SIZE);

    if (validRead) {
        if (enoughSpace) {
            // Append the new bytes to the global message buffer
            for (int i = 0; i < newBytes; i++) {
                messageBuffer[i + bytesRead] = receiveBuffer[i];
            }
            bytesRead += newBytes;

            // Scan everything received so far for the end of the JSON string
            for (int i = 0; i < bytesRead; i++) {
                if (messageBuffer[i] == 125) { // ASCII 125 = }
                    Log_Debug("Full String Received\n");
                    Log_Debug("UART received %d bytes: '%s'.\n", i, (char *)messageBuffer);
                    messageBuffer[i + 1] = 0; // string termination
                    AzureIoT_SendMessage(messageBuffer);
                    bytesRead = 0;
                    return;
                }
            }
        }
        else {
            Log_Debug("BUFFER OVERFLOW!!!\n");
            bytesRead = 0;
        }
    }
    else if (error) {
        Log_Debug("New Bytes: %d.\n", newBytes);
        Log_Debug("ERROR: Could not read UART: %s (%d).\n", strerror(errno), errno);
        terminationRequired = true;
        bytesRead = 0;
    }
}
I'm going to be honest with you, this is definitely not one of my proudest pieces of code. Let me explain how it works. Whenever there are new characters to be read from the UART buffer, my function gets called. It writes them into the temporary receiveBuffer. We are going to assume that we have enough space and there were no errors. This buffer is then copied to the end of the global messageBuffer inside the first for-loop. The messageBuffer also keeps all characters that have been received in previous calls of UartEventHandler(). Then it loops through the whole buffer, inside the second for-loop, and looks for the end of my JSON string, which is a closing curly bracket. This of course only works because the JSON string from the Data-Logger is very simple and does not include any nested objects, therefore I don't need a JSON parser. If it finds the character it is looking for, it adds a string termination character into the messageBuffer right behind the curly bracket. Then it passes the complete message to the Azure IoT utility function AzureIoT_SendMessage().
That's it, from here the message automagically gets sent to my Azure IoT Central application.
The IoT Central Application
IoT Central offers a simple no-code alternative to creating a custom web app. I have also never created a custom web app before; that's why I went with IoT Central to visualize my data. Creating the IoT Central application was refreshingly simple. I followed the third part of Brian Willess's guide and created my own device template and dashboard that shows the GPS latitude and longitude as well as the sensor values for pH and temperature.
Connection String Issues
To connect my board to my web application, I chose to use the connection string method. Choosing is an exaggeration, I simply followed Brian Willess's guide. I installed the dps-keygen utility, copied my IDs and my key from my IoT Central application, pasted them into the command below, hit enter aaaaaaaand it didn't work. I got an error, repeated the process a couple of times, triple-checked everything and started googling. It turns out that the recommended way of entering the command, with a : to separate the parameter names from their values, does not work inside the Windows PowerShell. However, if you use a different character, such as an X, it works as expected. Here you can see the full command that worked for me:
dps-keygen -siX<scope ID> -diX<device ID> -dkX<device primary shared key>
The result of this command was a connection string, which I then pasted into my source code. I hit compile and saw some data appear in my IoT Central application.
The Result
The first picture shows my breadboard build. On the left you can see the Data-Logger with the Arduino Nano, an M12 connector for the sensor and a bunch of wires. Three of those go over to the Avnet Azure Sphere board: GND, SoftRX and SoftTX. The TX coming from the Arduino is reduced to 3.3 V by a simple voltage divider in order to reach the correct logic level for the Azure Sphere board. The second image shows the pH sensor. To test it, I put it into a beer glass filled with cold water.
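In case you want to recreate the level shifting: the divider only has to scale the Nano's 5 V TX level down to roughly 3.3 V. I'm not listing our exact resistor values here, but as an example, 10 kΩ in series with 20 kΩ to ground gives 5 V · 20/(10+20) ≈ 3.3 V at the tap, which is a safe logic level for the MT3620's input.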
In the future the boat is supposed to have its own WiFi hotspot; as of right now I am using my smartphone as a hotspot for the Azure Sphere board to connect to the internet.
Furthermore, you can see that the USB cable is still plugged into the Azure Sphere board, as I didn't have time to find out how to move from debugging mode to a proper OTA deployment.
I left the setup running for an hour with the sensor connected and got two line graphs for pH and temperature. I didn't grab a screenshot, but below you can see the dashboard with simulated data.
My Conclusion
All right, I actually managed to implement my first project with an Azure Sphere MCU, and it is also a useful one. I have to admit that it was a pretty tricky journey, with a couple of long evenings in front of my PC, but I'm glad that I spent the time doing it. I learned a lot and was able to get from a general understanding of the cloud to getting to know Azure and the Sphere ecosystem. I gained some knowledge and a new perspective on microcontroller development in general, which is a win in my book.
The Hardware
I am also pretty fascinated by the MT3620 itself. For this project I took a dive into the datasheet and the functional diagram. At my university I have had to take a closer look at several different MCUs in the past, and the MT3620 definitely stands out. In the block diagram below you can see the multiple different subsystems that are actually hardware-separated. The cool thing is that they can work independently. Even if I mess up my application code, the Wi-Fi subsystem will most likely still keep on working and allow me to upload a fixed firmware. The Pluton subsystem offers secure boot on a microcontroller. How cool is that?
All right, I might be a little bit of a hardware geek, but if you take a look at a functional diagram of the ESP8266, which you can see in the second, more colorful image, you can clearly see the difference. By the way, I cut those pictures out of MediaTek's and Espressif's respective datasheets.
The Sphere Ecosystem
I am by no means a security expert, but even I understand that keeping your security software up to date is crucial in IoT applications. I have done a couple of IoT projects with ESP MCUs and Raspberry Pis in the past. They were always on my local network and I didn't implement any security measures. In more professional development environments, cyber security is of course a concern, but as rapid prototyping becomes the norm and development times shrink, there is not always time to thoroughly test devices before they go into production, and weak points can be baked into the products. Cheaper devices, such as the 10 € WiFi lamp in my room, also tend to never receive any updates at all.
Many people argue that this is not that big of a deal. An LED light or a smart TV is not very safety-relevant, as it can not really physically harm anyone the way an autonomous car or an industrial machine could. I guess the Dyn attack in 2016 proved that those devices can indeed be dangerous if they are coordinated by an evil entity such as the Mirai botnet.
This is why I clearly understand the need for something like the Azure Sphere. I can't really say if it is the best solution, as it is the only one that I have used so far. It definitely makes it a whole lot easier to quickly develop secure IoT devices, deploy them in huge numbers and make sure that they stay secure for years to come.
Azure Sphere And Unicorns
Now that I have talked about the cool and interesting features of the Azure Sphere, let's get back to one of the central questions that this project was supposed to answer: Is the Azure Sphere suitable for one-of-a-kind applications such as the Kenterprise?
Well, let's first lay out what it is designed for. The Azure Sphere was created with large-scale deployments in mind. One of the example applications inside the IoT Central app is for a vending machine. It reports its internal temperatures and possibly its inventory (I only took a brief look at it) back to the responsible company. It is easy to imagine that there might be a couple of thousand of those machines scattered around a country. As they are located close to the customer and far away from the company, they are hard to get to. With the Azure Sphere, that is no problem. As long as there is an internet connection, updates can be transferred over the air to the device, errors can be reported and statuses can be checked. All of that can happen remotely.
The fact that the system was created for larger deployments doesn't exclude one-of-a-kind applications, but it suggests that there might be other, more suitable solutions. Considering the large starting hurdle and the requirement for an Azure cloud account, to which the hardware is irreversibly bound, I can not recommend the use of Azure Sphere hardware for my use case. A better solution might be a Raspberry Pi. It probably takes quite a lot of work to set it up and implement enough security measures, but it is way easier to find easy-to-understand documentation and it is much more open for experimentation.
I am guessing that this does not come as too much of a surprise to most of you. I already had a feeling that there was a slight mismatch when I read through Microsoft's website and saw the Azure Sphere's capability of scaling up to millions of devices mentioned. But I assume that working with the Sphere will become easier as it becomes more widely available. Maybe it will become as easy as working with an Arduino or a Raspberry Pi.
Thanks For Your Attention
I'd like to thank you very much for bearing with me and reading through my blog post. I hope I could share some inspiration and maybe even pass a bit of knowledge here and there.
I put quite some time into writing this and I am sure that there are still a couple of errors left. If you spot anything that is completely off (aside from my punctuation and spelling), please let me know and I will try to change it as soon as possible.