Intro
This blog is part of the Save the Bees Design Challenge. Normally LoRa is not considered suitable for transmitting images due to its low bandwidth, but I am pushing on that boundary in this project to explore its usefulness in applications where you only need one image per day - checking on beehive health isn't something that needs attention more often than that. The other major system in this project uses machine vision to recognize bees and insects and report on their activity, but this system sends real images from within a hive back to the base station. This reduces the need to check on the hive in person, because a good image can provide a lot of information about hive health. One of the main concerns for beekeepers is Varroa destructor mites:
According to my beekeeper friend, one of the best ways to detect them is with cameras. This part of the project explores the feasibility of using a camera located in a hive to remotely send images back to the beekeeper's home base. The Arduino MKR WAN 1300 has a great low-power radio with excellent long-range capability, which I want to exploit to send an image back to base every day. The antennas I chose have been tested around my house to reach 3.5 km reliably, which should be good on a farm.
The Arduino does not have enough memory to hold a decent image in a frame buffer, so I am using an ArduCam Mini, which has its own frame buffer and can send the image out as serial data on demand.
This is a block diagram of the remote camera system.
The Build
This first video shows how the system went together and what it physically looks like:
As you can see, this project involved a lot of 3D printing and it really helped when testing all the different aspects.
Incidentally, this is what the ArduCam Mini looks like:
These cameras are useful whenever you have a microcontroller that does not have the memory resources to deal with images. In this case there is a frame buffer in the camera and one in the final LCD, so neither MKR MCU in the system needs to store image data; they are just the serial conduit between peripherals that have their own frame buffers. The Nicla Vision can store its own images as well, so it could be used in this mode, and if I wasn't so incapacitated I would have tried it. However, it is a bit on the expensive side for this application. It is better suited to machine vision, which is the other big system in this project.
System Operational Demo
Transmitting images over LoRa is going to take time because there is limited bandwidth available. This video demonstrates sending images over LoRa:
Software
There was software involved in preliminary testing of the ArduCam, but it didn't require a lot of modification from the ArduCam examples. The main problem was lack of documentation - the code didn't have enough explanation of camera commands and the like. Beyond this, the main software development effort was more difficult because ArduCam did not have any MKR examples and what I was trying to do was quite unusual. I am including the firmware I used, even though it is a mess - not well documented and not robust at all. It doesn't do any error checking or retransmission. About all you can say about it is that it works.
Crystal Base Station Firmware
/* Crystal Base Station - for remote ArduCam
   Displays images sent by a remote ArduCam
   Uses an Arduino MKR 1300 to receive LoRa data and display images
   Uses an XFS5152 or SYN6988 text-to-speech module to synthesize speech
   Uses an ILI9341 or MSP2807 LCD to display color images
   by Doug Wong 2023
*/
#include <SPI.h>
#include <LoRa.h>
#include <Adafruit_GFX.h>
#include <Adafruit_ILI9341.h>

#define TFT_CS 6     //chip select pin for LCD
#define TFT_RST 4    //reset pin for LCD
#define TFT_MISO 10  //MISO pin for LCD
#define TFT_DC 7     //data / command select pin for LCD
#define TFT_MOSI 8   //serial data input pin for LCD (MOSI)
#define TFT_CLK 9    //SPI clock pin for LCD
#define LCD_C LOW    //this value selects command mode on LCD DC
#define LCD_D HIGH   //this value selects data mode on LCD DC

const unsigned long SEND_INT = 5000; // send a message on this interval (ms)
char rsi;             // receive signal strength
int rssi;             // receive signal strength
int packetSize;       // LoRa received packet size
int timeout;
char LoRaIn;          // character received via LoRa
char LoRaInStr[2];
char LoRaChar;
char PL;              // pixel low byte
char PH;              // pixel high byte
uint16_t cl;
uint16_t ch;
String rsis;
char rsia[4];
uint16_t pixel;       //R5G6B5 pixel color
uint16_t x;
uint16_t y;
uint16_t xd;
uint16_t yd;
char LoRaString[1];
byte synback;
bool news;            //new sighting flag
byte critr;           //critter number
char crits[2];        //critter number as a char
String critss;

Adafruit_ILI9341 tft = Adafruit_ILI9341(TFT_CS, TFT_DC, TFT_MOSI, TFT_CLK, TFT_RST, TFT_MISO);

void speak(char* msg)
{
  Serial1.write(0xFD);              //start byte
  Serial1.write((byte)0x0);         //length of message string - high byte
  Serial1.write(2 + strlen(msg));   //length of message - low byte
  Serial1.write(0x01);              //command 01 = speak message
  Serial1.write((byte)0x0);         //text encoding format 0-3
  Serial1.write(msg);               //message to speak
}

void waitForSpeech(unsigned long timeout = 60000)
{
  unsigned long start = millis();
  bool done = false;
  while (!done && (millis() - start) < timeout) {
    while (Serial1.available()) {
      if (Serial1.read() == 0x4F) {
        done = true;
        break;
      }
    }
  }
}

void setup()
{
  Serial1.begin(9600);   // this serial port is used for controlling speech
  tft.begin();
  tft.setRotation(3);    // landscape
  tft.fillScreen(ILI9341_WHITE);
  tft.setTextColor(ILI9341_RED);
  tft.setTextSize(3);
  tft.setCursor(50, 5);
  tft.print(" Crystal ");
  tft.setCursor(50, 40);
  tft.print(" LoRa ");
  tft.setCursor(50, 80);
  tft.print(" for ");
  tft.setCursor(50, 120);
  tft.print("Remote ArduCam");
  tft.setCursor(50, 160);
  tft.print(" by ");
  tft.setCursor(50, 200);
  tft.print(" DOUG WONG ");
  delay(4000);           // show the splash screen for 4 seconds
  tft.fillScreen(ILI9341_BLUE);
  tft.setTextColor(ILI9341_WHITE);
  if (!LoRa.begin(915E6)) {   //start LoRa radio for North America
    tft.print("LoRa fail");
    while (1);
  }
  news = 1;
  critr = 1;  //3=Lora, 1=B1, 2=2B, 4=Husky, 5=Spiderman, 6=Cindy, 7=Grasshopper, 8=Ladybug, 9=background
}

void loop()   // receive pixels over LoRa and paint them to the LCD
{
  char buf[128];   // for text-to-speech text
  for (y = 0; y < 241; y++) {
    for (x = 0; x < 321; x++) {
      while (!packetSize) {           // wait for the next pixel packet
        packetSize = LoRa.parsePacket();
      }
      packetSize = 0;
      // read packet
      PL = (char)LoRa.read();
      PH = (char)LoRa.read();
      pixel = 256 * PH + PL;          // recover R5G6B5 pixel
      xd = ((x + 160) % 320);
      yd = y;
      tft.drawPixel(xd, yd, pixel);
    }
  }
}
Remote LoRa ArduCam Firmware
// This program takes images using an ArduCAM_Mini_2MP and transmits them over LoRa using an Arduino MKR 1300
// Doug Wong 2023
//
#include <Wire.h>
#include <ArduCAM.h>
#include <SPI.h>
#include <LoRa.h>
#include "memorysaver.h"
#if !(defined OV2640_MINI_2MP)
#error Please select the hardware platform and camera module in the ../libraries/ArduCAM/memorysaver.h file
#endif
// set pin 5 as the slave select for the camera:
const int CS = 5;
bool is_header = false;
int mode = 0;
uint8_t start_capture = 0;
#if defined (OV2640_MINI_2MP)
ArduCAM myCAM( OV2640, CS );
#else
ArduCAM myCAM( OV5642, CS );
#endif
uint8_t read_fifo_burst(ArduCAM myCAM);
void setup() {
  // setup code here - run once:
  uint8_t vid, pid;
  uint8_t temp;
  Wire.begin();
  Serial.begin(9600);
  // set the CS as an output:
  pinMode(CS, OUTPUT);
  digitalWrite(CS, HIGH);
  // initialize SPI:
  SPI.begin();
  //Reset the ArduCam CPLD
  myCAM.write_reg(0x07, 0x80);
  delay(100);
  myCAM.write_reg(0x07, 0x00);
  delay(100);
  temp = 0;
  while (temp != 0x55) {
    //Check if the ArduCAM SPI bus is OK
    digitalWrite(LED_BUILTIN, HIGH);
    myCAM.write_reg(ARDUCHIP_TEST1, 0x55);
    temp = myCAM.read_reg(ARDUCHIP_TEST1);
    if (temp != 0x55) {
      digitalWrite(LED_BUILTIN, HIGH);
      delay(400);
      continue;
    } else {
      delay(1000);
      digitalWrite(LED_BUILTIN, LOW);
      delay(400);
    }
  }
#if defined (OV2640_MINI_2MP)
  while (1) {
    digitalWrite(LED_BUILTIN, HIGH);
    //Check if the camera module type is OV2640
    myCAM.wrSensorReg8_8(0xff, 0x01);
    myCAM.rdSensorReg8_8(OV2640_CHIPID_HIGH, &vid);
    myCAM.rdSensorReg8_8(OV2640_CHIPID_LOW, &pid);
    if ((vid != 0x26) || ((pid != 0x41) && (pid != 0x42))) {
      digitalWrite(LED_BUILTIN, LOW);
      delay(1000);
      continue;
    } else {
      delay(1000);
      digitalWrite(LED_BUILTIN, LOW);
      break;
    }
  }
#endif
  //Change to JPEG capture mode and initialize the camera module
  myCAM.set_format(JPEG);
  myCAM.InitCAM();
#if defined (OV2640_MINI_2MP)
  myCAM.OV2640_set_JPEG_size(OV2640_320x240);
#else
  myCAM.write_reg(ARDUCHIP_TIM, VSYNC_LEVEL_MASK); //VSYNC is active HIGH
  myCAM.OV5642_set_JPEG_size(OV5642_320x240);
#endif
  delay(1000);
  myCAM.clear_fifo_flag();
#if !(defined (OV2640_MINI_2MP))
  myCAM.write_reg(ARDUCHIP_FRAMES, 0x00);
#endif
  if (!LoRa.begin(915E6)) {   //start LoRa radio
    digitalWrite(LED_BUILTIN, HIGH);
    while (1);
  }
}
void loop() {
  // main code - run repeatedly:
  uint8_t temp = 0xff, temp_last = 0;
  bool is_header = false;
  digitalWrite(LED_BUILTIN, HIGH);
  delay(10000);
  digitalWrite(LED_BUILTIN, LOW);
  delay(200);
  // mode = 3;
  temp = 0xff;
  myCAM.set_format(BMP);
  myCAM.InitCAM();
#if !(defined (OV2640_MINI_2MP))
  myCAM.clear_bit(ARDUCHIP_TIM, VSYNC_LEVEL_MASK);
#endif
  myCAM.wrSensorReg16_8(0x3818, 0x81);
  myCAM.wrSensorReg16_8(0x3621, 0xA7);
  digitalWrite(LED_BUILTIN, HIGH);
  delay(100);
  digitalWrite(LED_BUILTIN, LOW);
  delay(100);
  //Flush the FIFO
  myCAM.flush_fifo();
  myCAM.clear_fifo_flag();
  //Start capture
  myCAM.start_capture();
  start_capture = 0;
  digitalWrite(LED_BUILTIN, HIGH);
  delay(100);
  digitalWrite(LED_BUILTIN, LOW);
  delay(100);
  if (myCAM.get_bit(ARDUCHIP_TRIG, CAP_DONE_MASK))
  {
    digitalWrite(LED_BUILTIN, HIGH);
    delay(50);
    uint8_t temp, temp_last;
    uint32_t length = 0;
    length = myCAM.read_fifo_length();
    if (length >= MAX_FIFO_SIZE)
    {
      digitalWrite(LED_BUILTIN, LOW);
      myCAM.clear_fifo_flag();
      return;
    }
    if (length == 0)   //0 kb
    {
      digitalWrite(LED_BUILTIN, HIGH);
      myCAM.clear_fifo_flag();
      return;
    }
    myCAM.CS_LOW();
    myCAM.set_fifo_burst();   //Set fifo burst mode
    // SPI.transfer(0x00);
    char VH, VL;
    int i = 0, j = 0;
    for (i = 0; i < 241; i++)
    {
      for (j = 0; j < 321; j++)
      {
        LoRa.beginPacket();
        VH = SPI.transfer(0x00);
        VL = SPI.transfer(0x00);
        // Serial.write(VL);
        LoRa.print(VL);
        delayMicroseconds(12);
        // Serial.write(VH);
        LoRa.print(VH);
        LoRa.endPacket();
        delayMicroseconds(12);
      }
    }
    // Serial.write(0xBB);
    // Serial.write(0xCC);
    myCAM.CS_HIGH();
    //Clear the capture done flag
    myCAM.clear_fifo_flag();
  }
  delay(3000);
}
uint8_t read_fifo_burst(ArduCAM myCAM)
{
  uint8_t temp = 0, temp_last = 0;
  uint32_t length = 0;
  length = myCAM.read_fifo_length();
  // Serial.println(length, DEC);
  if (length >= MAX_FIFO_SIZE)   //512 kb
  {
    // Serial.println(F("ACK CMD Over size. END"));
    return 0;
  }
  if (length == 0)   //0 kb
  {
    // Serial.println(F("ACK CMD Size is 0. END"));
    return 0;
  }
  myCAM.CS_LOW();
  myCAM.set_fifo_burst();   //Set fifo burst mode
  temp = SPI.transfer(0x00);
  length--;
  while (length--)
  {
    temp_last = temp;
    temp = SPI.transfer(0x00);
    if (is_header == true)
    {
      // Serial.write(temp);
    }
    else if ((temp == 0xD8) && (temp_last == 0xFF))
    {
      is_header = true;
      // Serial.println(F("ACK IMG END"));
      // Serial.write(temp_last);
      // Serial.write(temp);
    }
    if ((temp == 0xD9) && (temp_last == 0xFF))   //If we find the end marker, break out of the while loop
      break;
    delayMicroseconds(15);
  }
  myCAM.CS_HIGH();
  is_header = false;
  return 1;
}
Discussion
There were lots of issues getting the system working, but I am very happy that it works despite being a bit of an out-of-the-box application for LoRa. I spent way more time than I am willing to admit trying to get ArduCam libraries to play nicely with Adafruit libraries. I figured there should be no reason why an SPI display couldn't coexist with an SPI camera if they have separate chip selects, but apparently there is some issue - they work great alone, but not together. Point-to-point LoRa is also not documented well. I guess the main applications expect to use a LoRa gateway, but gateways definitely have guidelines on how much data you can send, and images do not fit within those guidelines. The lack of documentation and examples slowed me down quite a bit while trying to deduce how everything worked. The fact that it takes 40 minutes to send a picture really plays havoc with any hope of rapid debugging.
Given my state of health it was touch-and-go whether successful image transmission was achievable, so I am just tickled that it worked. It takes 10 minutes just to get past the white part of the image, so when the first content finally started to appear intact, it was a special moment to remember.
Addendum
This addendum is by way of apology for any incoherency, lack of detail, lack of polish and missing information in the above blog. I have been in constant pain for the last 3 weeks - I cannot walk, and cannot get down to my workshop or my video shooting area. I cannot sleep, I am so tired I can't see straight or think straight, and the pain and lack of mobility make everything extremely difficult. But the deadline for this project is tomorrow and I hate to miss deadlines. I had to cobble these videos together at my upstairs computer, which is not a good place to be shooting video. Fortunately I have a 3D printer upstairs so I was able to get some prints done, although they are mostly unfinished cases - they will just have to do. Hopefully I will be able to do a wrap-up blog by tomorrow, because this project involved a number of systems and there is a lot that hasn't been discussed yet.
Links:
Save the Bees - Machine Learning
Image Conversion to Integer Array for LCD Display