Overview
One of the advantages of using a small device such as the Arduino Nano 33 BLE Sense with TinyML is that it can act as a remote, low-power sensor to detect movement, or even whether a person is in the area. This example takes the Person Detection example from TensorFlow Lite for Microcontrollers and adds BLE to send a signal to a second node, which can trigger a relay or do further processing.
Related Posts:
TinyML on Arduino Nano 33 BLE Sense - Gesture Recognition
TinyML on Arduino Nano 33 BLE Sense - Fruit Identification
BLE on Arduino Nano 33 BLE Sense - BlueZ Connect
BLE on Arduino Nano 33 BLE Sense - Flask Remote Control
References:
Arduino Nano 33 BLE Sense
https://store.arduino.cc/usa/nano-33-ble-sense
TensorFlow Lite - Person Detection Example
https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/micro/examples/person_detection
Arduino Project Hub - Person Detection Version
ArduCAM library
https://github.com/ArduCAM/Arduino
Required Libraries
- The Arducam library, to interface with the ArduCAM hardware
https://github.com/ArduCAM/Arduino
- The JPEGDecoder library, to decode JPEG-encoded images
https://github.com/Bodmer/JPEGDecoder
- Person Detection library
https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/micro/examples/person_detection
- ArduCAM Mini 2MP
https://github.com/ArduCAM/Arduino
https://www.amazon.com/Arducam-Module-Megapixels-Arduino-Mega2560/dp/B012UXNDOY
Hardware
- Arduino Nano 33 BLE Sense
https://store.arduino.cc/usa/nano-33-ble-sense
- ArduCAM Mini 2MP
https://www.arducam.com/product/arducam-2mp-spi-camera-b0067-arduino/
- Single Relay Board - 5V
https://www.velleman.eu/products/view/?id=435570
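The ArduCAM connects to the Central Nano over the hardware SPI pins (D11/D12/D13) plus I2C (A4/A5), with the chip-select pin set in the person detection example's image provider source, and the relay board's signal line goes to D8 on the Peripheral Nano (matching the peripheral sketch later in this post). A quick way to check the relay wiring is a minimal blink-style test like the sketch below; this is hypothetical test code for verification only, not part of the project.
// Quick relay wiring test for the Peripheral board.
// Assumes the relay signal line is on D8, matching the peripheral sketch below.
const int relay = 8;

void setup() {
  pinMode(relay, OUTPUT);
  digitalWrite(relay, LOW);
}

void loop() {
  digitalWrite(relay, HIGH);  // relay on
  delay(1000);
  digitalWrite(relay, LOW);   // relay off
  delay(1000);
}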
Prepare Arduino Software
1. Install the JPEGDecoder library
2. Edit the JPEGDecoder User_Config.h header file to comment out the SD Card defines
Ex:
// Comment out the next #defines if you are not using an SD Card to store the JPEGs
// Commenting out the line is NOT essential but will save some FLASH space if
// SD Card access is not needed. Note: use of SdFat is currently untested!

//#define LOAD_SD_LIBRARY // Default SD Card library
//#define LOAD_SDFAT_LIBRARY // Use SdFat library instead, so SD Card SPI can be bit bashed
3. Download the ArduCAM libraries and copy the ArduCAM folder to the Arduino library folder
https://github.com/ArduCAM/Arduino
4. Edit the "memorysaver.h" file to ensure only the OV2640_MINI_2MP_PLUS device is selected.
//#define OV2640_MINI_2MP
//#define OV3640_MINI_3MP
//#define OV5642_MINI_5MP
//#define OV5642_MINI_5MP_BIT_ROTATION_FIXED
#define OV2640_MINI_2MP_PLUS
//#define OV5642_MINI_5MP_PLUS
//#define OV5640_MINI_5MP_PLUS
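Optionally, before loading the full person detection example, the camera's SPI wiring can be verified with a short test sketch based on the standard ArduCAM pattern of writing the ARDUCHIP_TEST1 register and reading it back. This is a minimal sketch, assuming the chip select is wired to pin 7; adjust CS to match your wiring.
// Minimal ArduCAM SPI sanity check (assumes CS on pin 7).
#include <Wire.h>
#include <SPI.h>
#include <ArduCAM.h>
#include "memorysaver.h"

const int CS = 7;                // chip-select pin (assumption - match your wiring)
ArduCAM myCAM(OV2640, CS);

void setup() {
  Serial.begin(9600);
  while (!Serial);

  Wire.begin();
  pinMode(CS, OUTPUT);
  digitalWrite(CS, HIGH);
  SPI.begin();

  // Write a known value to the ArduCAM test register and read it back.
  myCAM.write_reg(ARDUCHIP_TEST1, 0x55);
  uint8_t temp = myCAM.read_reg(ARDUCHIP_TEST1);
  if (temp != 0x55) {
    Serial.println("SPI interface error - check the wiring and CS pin");
  } else {
    Serial.println("SPI interface OK");
  }
}

void loop() {}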
5. Download the Person Detection code from the TensorFlow Lite GitHub
https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/micro/examples/person_detection
6. I ran into an issue with the Arduino_TensorFlowLite libraries in the Arduino IDE, so I removed them; they were still using the older experimental code.
7. Add the person_detection.zip code to the Arduino IDE
a. Add .ZIP Library
b. Edit the name in the "library.properties" file for the person detection example.
Arduino\libraries\tensorflow_lite
name=tensorflow_lite
version=2.1.0-ALPHA
author=TensorFlow Authors
maintainer=Pete Warden <petewarden@google.com>
sentence=Allows you to run machine learning models locally on your device.
paragraph=This library runs TensorFlow machine learning models on microcontrollers, allowing you to build AI/ML applications powered by deep learning and neural networks. With the included examples, you can recognize speech, detect people using a camera, and recognize "magic wand" gestures using an accelerometer. The examples work best with the Arduino Nano 33 BLE Sense board, which has a microphone and accelerometer.
category=Data Processing
url=https://www.tensorflow.org/lite/microcontrollers/overview
ldflags=-lm
includes=TensorFlowLite.h
c. The example should then show up in the Arduino IDE's examples menu.
8. Load and run the code
9. When the board first starts, ensure the camera is not pointing at a person for the first image capture.
10. Observe the person detection output in the Arduino IDE Serial Monitor (the scores are printed by the example's detection responder; a simplified sketch of that responder follows the sample output below)
a. No person detected
Attempting to start Arducam
Starting capture
Image captured
Reading 1032 bytes from Arducam
Finished reading
Decoding JPEG and converting to greyscale
Image decoded and processed
Person score: 60 No person score: 245
Starting capture
Image captured
Reading 2056 bytes from Arducam
Finished reading
Decoding JPEG and converting to greyscale
Image decoded and processed
Person score: 143 No person score: 189
Starting capture
Image captured
Reading 3080 bytes from Arducam
Finished reading
Decoding JPEG and converting to greyscale
Image decoded and processed
Starting capture
Image captured
Reading 2056 bytes from Arducam
Finished reading
Decoding JPEG and converting to greyscale
Image decoded and processed
Person score: 69 No person score: 236
b. Person Detected
Starting capture
Image captured
Reading 3080 bytes from Arducam
Finished reading
Decoding JPEG and converting to greyscale
Image decoded and processed
Person score: 247 No person score: 46
Starting capture
Image captured
Reading 3080 bytes from Arducam
Finished reading
Decoding JPEG and converting to greyscale
Image decoded and processed
Person score: 240 No person score: 59
Starting capture
Image captured
Reading 3080 bytes from Arducam
Finished reading
Decoding JPEG and converting to greyscale
Image decoded and processed
Person score: 250 No person score: 48
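The person/no-person scores are quantized uint8 values (0 to 255) printed by the example's detection responder on each inference. The snippet below is a simplified, illustrative sketch of such a responder for the Nano 33 BLE Sense; it is not the exact arduino_detection_responder.cc that ships with the example (which drives the on-board LEDs in a similar way), so treat the LED handling here as an assumption.
// Simplified sketch of a detection responder (illustrative, not the shipped code).
#include "Arduino.h"
#include "detection_responder.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"

void RespondToDetection(tflite::ErrorReporter* error_reporter,
                        uint8_t person_score, uint8_t no_person_score) {
  static bool is_initialized = false;
  if (!is_initialized) {
    pinMode(LED_BUILTIN, OUTPUT);  // use the built-in LED as a simple indicator
    is_initialized = true;
  }

  // The scores are quantized uint8 values; the larger one is the model's answer.
  digitalWrite(LED_BUILTIN, person_score > no_person_score ? HIGH : LOW);

  // This produces the "Person score: x No person score: y" lines shown above.
  TF_LITE_REPORT_ERROR(error_reporter, "Person score: %d No person score: %d",
                       person_score, no_person_score);
}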
Add BLE Central and Peripheral communication
Reference
Hardware:
- Add a second Arduino Nano 33 BLE Sense
https://store.arduino.cc/usa/nano-33-ble-sense
- Single Relay Board - 5V
https://www.velleman.eu/products/view/?id=43557
Arduino Library
- ArduinoBLE
- Central - PeripheralExplorer
- Peripheral - LED
Prepare the code
1. Modify the Person Detection code from the previous example and add the Central - PeripheralExplorer code to detect a BLE peripheral device and send a signal indicating whether a person was detected.
/* Copyright 2019 The TensorFlow Authors. All Rights Reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
==============================================================================*/

#include <ArduinoBLE.h>
#include <TensorFlowLite.h>

#include "main_functions.h"

#include "detection_responder.h"
#include "image_provider.h"
#include "model_settings.h"
#include "person_detect_model_data.h"
#include "tensorflow/lite/micro/kernels/micro_ops.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"
#include "tensorflow/lite/version.h"

// Globals, used for compatibility with Arduino-style sketches.
namespace {
tflite::ErrorReporter* error_reporter = nullptr;
const tflite::Model* model = nullptr;
tflite::MicroInterpreter* interpreter = nullptr;
TfLiteTensor* input = nullptr;

// An area of memory to use for input, output, and intermediate arrays.
constexpr int kTensorArenaSize = 73 * 1024;
static uint8_t tensor_arena[kTensorArenaSize];
}  // namespace

// The name of this function is important for Arduino compatibility.
void setup() {
  Serial.begin(9600);
  while (!Serial);

  // Set up logging. Google style is to avoid globals or statics because of
  // lifetime uncertainty, but since this has a trivial destructor it's okay.
  // NOLINTNEXTLINE(runtime-global-variables)
  static tflite::MicroErrorReporter micro_error_reporter;
  error_reporter = &micro_error_reporter;

  // Map the model into a usable data structure. This doesn't involve any
  // copying or parsing, it's a very lightweight operation.
  model = tflite::GetModel(g_person_detect_model_data);
  if (model->version() != TFLITE_SCHEMA_VERSION) {
    TF_LITE_REPORT_ERROR(error_reporter,
                         "Model provided is schema version %d not equal "
                         "to supported version %d.",
                         model->version(), TFLITE_SCHEMA_VERSION);
    return;
  }

  // Pull in only the operation implementations we need.
  // This relies on a complete list of all the ops needed by this graph.
  // An easier approach is to just use the AllOpsResolver, but this will
  // incur some penalty in code space for op implementations that are not
  // needed by this graph.
  //
  // tflite::ops::micro::AllOpsResolver resolver;
  // NOLINTNEXTLINE(runtime-global-variables)
  static tflite::MicroOpResolver<3> micro_op_resolver;
  micro_op_resolver.AddBuiltin(
      tflite::BuiltinOperator_DEPTHWISE_CONV_2D,
      tflite::ops::micro::Register_DEPTHWISE_CONV_2D());
  micro_op_resolver.AddBuiltin(tflite::BuiltinOperator_CONV_2D,
                               tflite::ops::micro::Register_CONV_2D());
  micro_op_resolver.AddBuiltin(tflite::BuiltinOperator_AVERAGE_POOL_2D,
                               tflite::ops::micro::Register_AVERAGE_POOL_2D());

  // Build an interpreter to run the model with.
  static tflite::MicroInterpreter static_interpreter(
      model, micro_op_resolver, tensor_arena, kTensorArenaSize, error_reporter);
  interpreter = &static_interpreter;

  // Allocate memory from the tensor_arena for the model's tensors.
  TfLiteStatus allocate_status = interpreter->AllocateTensors();
  if (allocate_status != kTfLiteOk) {
    TF_LITE_REPORT_ERROR(error_reporter, "AllocateTensors() failed");
    return;
  }

  // Get information about the memory area to use for the model's input.
  input = interpreter->input(0);

  /* BLE initialize */
  // begin initialization
  if (!BLE.begin()) {
    Serial.println("starting BLE failed!");
    while (1);
  }

  //BLE.scanForUuid("1801");
  // start scanning for peripherals
  BLE.scan();
}

// The name of this function is important for Arduino compatibility.
void loop() {
  // Get image from provider.
  if (kTfLiteOk != GetImage(error_reporter, kNumCols, kNumRows, kNumChannels,
                            input->data.uint8)) {
    TF_LITE_REPORT_ERROR(error_reporter, "Image capture failed.");
  }

  // Run the model on this input and make sure it succeeds.
  if (kTfLiteOk != interpreter->Invoke()) {
    TF_LITE_REPORT_ERROR(error_reporter, "Invoke failed.");
  }

  TfLiteTensor* output = interpreter->output(0);

  // Process the inference results.
  uint8_t person_score = output->data.uint8[kPersonIndex];
  uint8_t no_person_score = output->data.uint8[kNotAPersonIndex];
  RespondToDetection(error_reporter, person_score, no_person_score);

  /* Send inference via bluetooth */
  Serial.println("Scanning for UUID");
  BLE.scanForUuid("1801");
  delay(500);  // delay to find uuid
  BLEDevice peripheral = BLE.available();
  Serial.print("Peripheral is: ");
  Serial.println(peripheral);
  BLE.stopScan();

  if (peripheral) {
    // discovered a peripheral, print out address, local name, and advertised service
    Serial.print("Found ");
    Serial.print(peripheral.address());
    Serial.print(" '");
    Serial.print(peripheral.localName());
    Serial.print("' ");
    Serial.print(peripheral.advertisedServiceUuid());
    Serial.println();

    if (peripheral.localName() != "PersonDetectionMonitor") {
      return;
    }
    //BLE.stopScan();

    if (person_score > no_person_score) {
      sendAlert(peripheral, 2);
    } else {
      sendAlert(peripheral, 0);
    }
    //peripheral.disconnect();
  } else {
    Serial.println("Peripheral not available");
    //BLE.stopScan();
  }
  //BLE.stopScan();
}

void sendAlert(BLEDevice peripheral, int sendsignal) {
  // connect to the peripheral
  Serial.println("Connecting ...");

  if (peripheral.connect()) {
    Serial.println("Connected");
  } else {
    Serial.println("Failed to connect!");
    return;
  }

  // discover peripheral attributes
  Serial.println("Discovering attributes ...");
  if (peripheral.discoverAttributes()) {
    Serial.println("Attributes discovered");
  } else {
    Serial.println("Attribute discovery failed!");
    peripheral.disconnect();
    return;
  }

  // retrieve the alert level characteristic
  BLECharacteristic alertLevelChar = peripheral.characteristic("2A06");

  if (!alertLevelChar) {
    Serial.println("Peripheral does not have alert level characteristic!");
    peripheral.disconnect();
    return;
  } else if (!alertLevelChar.canWrite()) {
    Serial.println("Peripheral does not have a writable alert level characteristic!");
    peripheral.disconnect();
    return;
  }

  if (peripheral.connected()) {
    if (sendsignal > 0) {
      alertLevelChar.writeValue((byte)0x02);
      Serial.println("Wrote high alert");
    } else {
      alertLevelChar.writeValue((byte)0x00);
      Serial.println("Wrote low alert");
    }
  }

  peripheral.disconnect();
  Serial.println("Peripheral disconnected");
}
2. Compile and load this code on the Nano with the ArduCAM connected
3. Modify the Peripheral LED example code to use a standard 16-bit characteristic UUID and a new local name.
NOTE: This is designed to enable a Relay when a person is detected from the Central device
#include <ArduinoBLE.h>
#include "Arduino.h"

// Relay on D8
int relay = 8;

// BLE PersonDetection Service
BLEService PersonDetectionService("1801");

// BLE Alert Level Characteristic
BLEByteCharacteristic alertLevelChar("2A06",            // standard 16-bit characteristic UUID
                                     BLERead | BLEWrite);  // remote clients will be able to read and write this characteristic

void setup() {
  Serial.begin(9600);  // initialize serial communication
  while (!Serial);

  pinMode(LED_BUILTIN, OUTPUT);  // initialize the built-in LED pin to indicate when a central is connected
  pinMode(relay, OUTPUT);        // set the relay pin as an output
  digitalWrite(relay, LOW);

  // begin initialization
  if (!BLE.begin()) {
    Serial.println("starting BLE failed!");
    while (1);
  }

  /* Set a local name for the BLE device
     This name will appear in advertising packets
     and can be used by remote devices to identify this BLE device
     The name can be changed but may be truncated based on space left in advertisement packet
  */
  BLE.setLocalName("PersonDetectionMonitor");
  BLE.setAdvertisedService(PersonDetectionService);          // add the service UUID
  PersonDetectionService.addCharacteristic(alertLevelChar);  // add the alert level characteristic
  BLE.addService(PersonDetectionService);                    // add the PersonDetection service
  alertLevelChar.writeValue((byte)0x00);

  /* Start advertising BLE.  It will start continuously transmitting BLE
     advertising packets and will be visible to remote BLE central devices
     until it receives a new connection */

  // start advertising
  BLE.advertise();

  Serial.println("Bluetooth device active, waiting for connections...");
}

void loop() {
  // wait for a BLE central
  BLEDevice central = BLE.central();

  // if a central is connected to the peripheral:
  if (central) {
    Serial.print("Connected to central: ");
    // print the central's BT address:
    Serial.println(central.address());
    // turn on the LED to indicate the connection:
    digitalWrite(LED_BUILTIN, HIGH);

    // while the central is connected:
    while (central.connected()) {
      //Serial.println("Getting Alert Level:");
      if (alertLevelChar.written()) {
        if (alertLevelChar.value()) {
          Serial.println("Got high alert");
          digitalWrite(relay, HIGH);
          Serial.println("Set relay to HIGH");
        } else {
          Serial.println("Got low alert");
          digitalWrite(relay, LOW);
          Serial.println("Set relay to LOW");
        }
      }
    }
    // when the central disconnects, turn off the LED:
    digitalWrite(LED_BUILTIN, LOW);
    Serial.print("Disconnected from central: ");
    Serial.println(central.address());
  }
}
4. Compile and load this code on the Nano with the relay connected.
5. Start the Central Nano with the ArduCam
Serial Monitor output - Scanning
Attempting to start Arducam
Starting capture
Image captured
Reading 2056 bytes from Arducam
Finished reading
Decoding JPEG and converting to greyscale
Image decoded and processed
Person score: 139 No person score: 194
Scanning for UUID
6. Start the Peripheral Nano with the Relay
Bluetooth device active, waiting for connections...
Connected to central: d8:4c:fd:36:96:eb
Got low alert
Set relay to LOW
Disconnected from central: d8:4c:fd:36:96:eb
7. The Central Nano discovered the Peripheral and sent a low alert because no person was detected
Peripheral is: 1
Found f8:e4:92:83:c6:0a 'PersonDetectionMonitor' 1801
Connecting ...
Connected
Discovering attributes ...
Attributes discovered
Wrote low alert
Peripheral disconnected
Starting capture
Image captured
Reading 2056 bytes from Arducam
8. The Central Nano detected a person and sent a high alert to the Peripheral
Starting capture
Image captured
Reading 3080 bytes from Arducam
Finished reading
Decoding JPEG and converting to greyscale
Image decoded and processed
Person score: 224 No person score: 94
Scanning for UUID
Peripheral is: 1
Found f8:e4:92:83:c6:0a 'PersonDetectionMonitor' 1801
Connecting ...
Connected
Discovering attributes ...
Attributes discovered
Wrote high alert
Peripheral disconnected
9. The Peripheral received the high alert and enabled the relay
Disconnected from central: d8:4c:fd:36:96:eb
Connected to central: d8:4c:fd:36:96:eb
Got high alert
Set relay to HIGH
The two boards continue this cycle until they are reset or powered off.
Conclusion
These were simple examples, but they show how TensorFlow Lite (TinyML) can be implemented on a remote edge device to detect people, and how that detection can trigger an alarm or wake a more powerful device to capture video or send the data over WiFi or another longer-range link.