The following webinar is now available for On-Demand Viewing:
Edge Impulse enables developers to create the next generation of intelligent device solutions with embedded Machine Learning. Machine Learning at the very edge will enable valuable use of the 99% of sensor data that is discarded today due to cost, bandwidth, or power constraints. Edge Impulse enables easy collection of real sensor data, live signal processing from raw data to neural networks, and testing and deployment to any target device. You can sign up for a free developer account and get started with the ST IoT Discovery board or the Arduino Nano 33 BLE Sense. The open-source SDKs allow you to collect data from, or deploy code to, any device. TinyML enables exciting applications on extremely low-power MCUs. For example, you can detect human motion from just 10 minutes of training data, recognize spoken keywords, and classify audio patterns from the environment in real time.
Jenny Plunkett from Edge Impulse gave a fantastic presentation. She is a self-described Texas Longhorn and software engineer, now working as a User Success Engineer at Edge Impulse. Since graduating from The University of Texas, she's been working in the IoT space, from customer engineering and developer support for Arm Mbed to consulting engineering for the Pelion platform.
She was supported during the Q&A by Daniel Situnayake. Daniel is a founding TinyML engineer at Edge Impulse and the co-author of the definitive book on TinyML: "TinyML: Machine Learning with TensorFlow Lite on Arduino and Ultra-Low-Power Microcontrollers." He previously worked at Google as a Developer Advocate for TensorFlow Lite, enabling developers to deploy machine learning to edge devices, from phones to SoCs. He was also Developer Advocate for Dialogflow, a tool for building conversational AI.
Q&A Session:
I've built a voice-controlled faucet where you can change the water temperature using voice commands, but it only works well in my bathroom.
Can the models run on a low-power device, like a board running on battery that wakes on specific events/times to collect and send data?
Yes.
I am working on a vibration/sound-alerting walking stick for blind users that is trained on terrains using sensor data. Is it possible to do transfer learning on-device with online data, to improve the stick based on the terrain the person is walking on?
I think the best way to do this would be to archive the sampled data locally on your processor, then upload the data to Edge Impulse for training purposes in either CSV or JSON format.
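For reference, the CSV variant can be as simple as a timestamp column plus one column per sensor axis; an illustrative layout (these column names are examples, not requirements):

```
timestamp,accX,accY,accZ
0,-9.81,0.03,1.21
10,-9.79,0.05,1.19
20,-9.80,0.02,1.23
```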
I am curious about implementing this for Image Processing. I am currently using ImageJ for processing images and would be interested to see where I can use Edge Impulse.
Try us out! The Edge Impulse online documentation has a tutorial for "adding sight to your sensors" which gets you bootstrapped very quickly on performing low-power computer vision. See https://docs.edgeimpulse.com/docs/image-classification for more info on getting started. At the present time, you can use the OpenMV Cam H7 Plus or Himax WE-I Plus hardware to get the best out-of-box experience on embedded platforms. If you don't have any of these, you can even use your mobile phone for some early experimentation!
We are planning to use Edge Impulse to classify ECG signals. Edge Impulse can be used for biosignals, but does the Arduino Nano 33 BLE Sense have the computational capability for such a project? Also, could you talk about how an existing dataset could be imported into Edge Impulse?
One way to get data from your sensors into Edge Impulse is to use the data forwarder via the Edge Impulse CLI. This just requires a UART (TX/RX) connection from your device, and you can get your ECG data directly into the Edge Impulse Studio. You could then perform a series of classifications on that data (see the accelerometer example that's already available).
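As an illustration, the data forwarder just expects one line of sensor readings per sample over serial at a fixed rate, and it auto-detects the rate and value count. A minimal Arduino-style sketch, where the A0 pin and 100 Hz rate are assumptions to adapt to your analog front-end:

```cpp
// Streams raw samples over serial for the Edge Impulse data forwarder.
const unsigned long SAMPLE_INTERVAL_MS = 10; // 100 Hz (assumed rate)

void setup() {
    Serial.begin(115200);
}

void loop() {
    static unsigned long last_sample = 0;
    if (millis() - last_sample >= SAMPLE_INTERVAL_MS) {
        last_sample += SAMPLE_INTERVAL_MS;
        int sample = analogRead(A0);   // one raw reading from the front-end
        Serial.println(sample);        // comma-separate values for multi-axis sensors
    }
}
```

With that running, `edge-impulse-data-forwarder` on the host picks up the stream and sends it to your project.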
My 13 year old son has learned some basics of traditional programming at school. I explained the basics of machine learning to him, and his reaction was “well that makes more sense!” I wonder if kids find this method more intuitive in their data-filled world. Do you see machine learning, and Edge Impulse specifically, having a future in the schools and teaching kids?
Definitely, and I believe the world needs more experts in embedded systems which in some respects is becoming a bit of a lost art. Getting more kids exposed to STEM and embedded systems with platforms such as Edge Impulse will help ensure a future pipeline of embedded systems engineers and scientists!
Is only C or C++ used for compiling?
Check out this tutorial here: https://docs.edgeimpulse.com/docs/running-your-impulse-locally
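For context, the exported library is plain C++; a condensed sketch of the pattern from that tutorial's standalone example, where the features array is a placeholder you would fill from the Studio's Live classification page:

```cpp
#include <cstdio>
#include <cstring>
#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

// Placeholder: paste the raw feature values for one window from the Studio here.
static const float features[] = { 0.0f /* , ... */ };

static int get_feature_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, features + offset, length * sizeof(float));
    return 0;
}

int main() {
    // Wrap the feature buffer in a signal the classifier can read from.
    signal_t signal;
    signal.total_length = sizeof(features) / sizeof(features[0]);
    signal.get_data = &get_feature_data;

    ei_impulse_result_t result = { 0 };
    if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
        return 1;
    }
    // Print the confidence score for each trained class.
    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        printf("%s: %.5f\n", result.classification[ix].label,
               result.classification[ix].value);
    }
    return 0;
}
```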
How are custom blocks integrated into the Edge Impulse pipeline? Python scripts, or what other options?
Hi, check out our custom blocks tutorial here: https://docs.edgeimpulse.com/docs/custom-blocks
I didn't see Raspberry Pi boards on the list of supported boards. Is it easy to add one, or is it better to adapt our model to the compatible boards?
It's easy to add your deployed C++ inferencing library onto any device! Check out our forum for others in our community using a Raspberry Pi: https://forum.edgeimpulse.com/
Can it be used to diagnose biosignals?
Absolutely, we have customers doing biosignal analysis on a variety of topics, from sleep stage detection to COVID onset detection.
Just to make it clear: is the collection stage done by sending data from the device to the cloud, or could it be from the device to a PC (serial/USB) and then to the cloud?
Training data collection can be done however you'd like, as long as you can upload it to the Studio: https://docs.edgeimpulse.com/docs/cli-uploader
Hi. Is it possible to add support for the Avnet Ultra96 board in your system? The Ultra96 is an Arm-based Xilinx Zynq UltraScale+ MPSoC development board.
The C++ Library export will run on almost anything with a C++ compiler, so you should be able to get this to work pretty quickly. See https://github.com/edgeimpulse/example-standalone-inferencing
I'm not too well versed in ML and TinyML. I'm assuming that 8/16-bit processors, including AVR8, PIC8, and 8051 (and other 8/16-bit chipsets), aren't powerful enough to run TF/TinyML, but how much power would be suitable? I'd assume Arm would be more suitable, and many supported boards include the Cortex-M0+, but are there any "loose" chip strength requirements to run TinyML?
It depends on the use case. E.g., gesture detection on accelerometer data is very doable on an M0+, audio on an M4, and vision on an M7 or A-class core.
Can we use TinyML with some 32-bit MCU, plug in a mic, and have the MCU identify who is talking based on just hearing their voice? How would you go about training/getting data for something like this?
Yeah, just try it out by uploading data of two different people speaking using your phone (see the Data Acquisition tab in Edge Impulse) and train a classifier. It should be pretty quick.
If I have a frozen_graph.pb, can I use Edge Impulse to convert the model to TFLite?
No.
Can we deploy more than one model, for two different functionalities, on the same microcontroller?
Not two completely different models at the moment out of the box, but you can mix and match learning blocks (e.g. both classifier + anomaly detection).
How would I implement a "do not know" category? Applied to your example this morning: instead of forcing the answer into the three categories you mentioned, could e.g. a circular motion get a "none of the above" answer?
The machine learning model will output classifications as "uncertain" when they do not correspond well with any of the trained classes. You can also train another category/ML class, such as "still motion", using data that is dissimilar to any other labeled motion in your dataset. A simple confidence threshold works too; see the sketch below.
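As a rough illustration, a rejection threshold can be layered on top of the classifier output. A minimal sketch assuming the Edge Impulse C++ SDK's result struct; the 0.6 cutoff is an arbitrary assumption you would tune on held-out data:

```cpp
#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

// Returns the winning label, or "unknown" when no class is confident enough.
const char *classify_with_rejection(const ei_impulse_result_t &result) {
    const char *top_label = "unknown";
    float top_score = 0.0f;
    // Find the class with the highest confidence score.
    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        if (result.classification[ix].value > top_score) {
            top_score = result.classification[ix].value;
            top_label = result.classification[ix].label;
        }
    }
    // Below the cutoff, report "none of the above" instead of forcing a class.
    return (top_score < 0.6f) ? "unknown" : top_label;
}
```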
How do I implement computer vision projects using TinyML?
Check out our blog post here: https://www.edgeimpulse.com/blog/computer-vision
What are some example use cases for TinyML in agriculture? Is it possible to deploy in rural areas?
Hi! Check out what our community has done with ElephantEdge: https://www.edgeimpulse.com/blog/smartparks
Could you detect a sudden impact using the Arduino Nano 33 BLE Sense board, and then train the board using Edge Impulse to detect abnormal shock? For example, if we were to create an airbag system.
Yeah, but if the use case is simple, e.g. here just measuring total impact, I'd program it out rather than use ML for it (see the sketch below).
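For reference, a minimal non-ML impact detector is just a magnitude threshold over the accelerometer; a sketch assuming a 3-axis reading in g, where the 4 g threshold is an arbitrary assumption to tune for the application:

```cpp
#include <math.h>

const float IMPACT_THRESHOLD_G = 4.0f;  // assumed cutoff; tune for your system

// Flags an impact when total acceleration magnitude exceeds the threshold.
bool impact_detected(float ax, float ay, float az) {
    float magnitude = sqrtf(ax * ax + ay * ay + az * az);
    return magnitude > IMPACT_THRESHOLD_G;
}
```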
Can I go as low as 2-bit quantization for my model weights?
Not in Edge Impulse at the moment.
Is Edge Impulse free to use?
Yes! Sign up here: https://edgeimpulse.com/
I'm also working on an early warning system for forest fires. What's your opinion on implementing TinyML?
Good use case! One of our partners (IRNAS) is doing fire detection on electricity poles using Edge Impulse, so we'd be interested to see what you come up with.