This webinar has been moved to Wednesday, May 12th, at 11:00am CT / 4:00pm GMT.
For humans, sight is one of our most valued senses. Despite this, virtually all embedded devices are still blind, relying on other sensors to recognize events that could easily be perceived visually. With this in mind, Edge Impulse has added computer vision support, enabling low-power yet high-performance image classification on any device with at least a Cortex-M7 or equivalent microcontroller. Any embedded developer can now collect image data from the field, quickly build classifiers to interpret the world, and deploy models back to low-power, inexpensive production devices. This means a whole new class of devices can accurately predict what they're seeing, opening up opportunities from predictive maintenance ('does my machine look abnormal?') and industrial automation ('are labels placed correctly on these bottles?') to wildlife monitoring ('have we seen any potential poachers?').
In this workshop, Kwabena Agyeman and Louis Moreau from Edge Impulse will show attendees how to quickly train a TensorFlow Lite Convolutional Neural Network (CNN) for thermal image classification using an OpenMV Cam and Edge Impulse, with a focus on wildlife conservation efforts. They will then deploy the CNN on an OpenMV Cam and have it running immediately. Finally, a live demo will show how easy it is to control devices in the real world.
What you will learn by attending:
- How to easily import an image dataset of animal classes and train models using Edge Impulse
- How to deploy this custom CNN on an OpenMV Cam using the OpenMV IDE to detect animal presence
- A live demo using OpenMV and Edge Impulse
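To give a flavor of the deployment step above, here is a minimal MicroPython sketch of on-device classification with an OpenMV Cam. It assumes the Edge Impulse "OpenMV library" export has already been copied to the camera's flash (the file names `trained.tflite` and `labels.txt`, the grayscale sensor setup, and the 240x240 crop are assumptions based on that export's conventions, not details from this announcement). The script runs inside the OpenMV IDE, not on a desktop.

```python
# Hedged sketch: classify camera frames with an Edge Impulse model on an
# OpenMV Cam. Assumes "trained.tflite" and "labels.txt" are on the device.
import sensor, time, tf

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)   # single-channel frames (e.g. thermal)
sensor.set_framesize(sensor.QVGA)
sensor.set_windowing((240, 240))         # center-crop to the model's input size
sensor.skip_frames(time=2000)            # let the sensor settle

labels = [line.rstrip() for line in open("labels.txt")]
clock = time.clock()

while True:
    clock.tick()
    img = sensor.snapshot()
    # tf.classify() runs the TFLite model over the image; with default
    # scanning parameters it returns a single whole-image result
    for obj in tf.classify("trained.tflite", img):
        scores = list(zip(labels, obj.output()))
        label, confidence = max(scores, key=lambda s: s[1])
        print("%s = %.2f (%.1f fps)" % (label, confidence, clock.fps()))
```

The `print()` output appears in the OpenMV IDE's serial terminal; in a real wildlife-monitoring deployment the confidence score would instead gate an action such as logging a detection or toggling a GPIO pin. Note that newer OpenMV firmware exposes an `ml` module that supersedes `tf`, so check your firmware version against the OpenMV documentation.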