Machine learning (ML) technologies bring intelligence to homes, retail, factories and cities, where they detect patterns and anomalies and initiate responses.
The edge is ideal for deploying machine learning applications. Unlike the cloud, processing at the edge happens in real time, closer to the user. Local insights and decision-making allow for more seamless, synchronized and efficient user experiences.
Improvements in neural network model efficiency and the emergence of high-speed neural network accelerators are helping machine learning shift to the edge. A great example of the possibilities is the NXP i.MX 8M Plus, a new addition to our EdgeVerse portfolio. It provides dedicated machine learning hardware, a neural processing unit (NPU), that can run inference even on complex neural network models. Developers can off-load machine learning inference to the NPU, freeing the high-performance Cortex-A and Cortex-M cores, DSP and GPU to execute other system-level or user application tasks.
The NPU, combined with the i.MX 8M Plus’ dual image signal processors (ISPs) and GPU, enables real-time image processing applications such as surveillance, smart retail, robot vision and home health monitors.
For engineers already using i.MX processors in industrial, automotive and medical applications, adopting machine learning at the edge with the i.MX 8M Plus, or other i.MX processors for that matter, is a natural step. Supported by NXP’s eIQ machine learning development environment, engineers can seamlessly switch between running their machine learning models on the CPU, GPU or NPU.
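To make that concrete, here is a minimal sketch of what switching the execution target can look like with TensorFlow Lite on an eIQ-enabled Linux image. The model file name and the VX delegate path (/usr/lib/libvx_delegate.so) are assumptions that may differ depending on your board and BSP release.

```python
# Minimal sketch: run the same TFLite model on the CPU or off-load it to the
# NPU via the VX delegate shipped with eIQ-enabled Linux images.
# The model file and delegate path below are placeholders; adjust for your board.
import numpy as np
import tflite_runtime.interpreter as tflite

MODEL = "mobilenet_v1_1.0_224_quant.tflite"   # example quantized model
USE_NPU = True                                # flip to False to stay on the CPU

delegates = []
if USE_NPU:
    # The VX delegate routes supported ops to the NPU.
    delegates = [tflite.load_delegate("/usr/lib/libvx_delegate.so")]

interpreter = tflite.Interpreter(model_path=MODEL, experimental_delegates=delegates)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Dummy input with the model's expected shape and dtype (e.g. a camera frame).
frame = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]).flatten()[:5])
```

In practice the application code stays the same; only the delegate list changes, which is what makes moving a model between CPU, GPU and NPU largely transparent.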
These are just a few reasons why the i.MX 8M Plus landed on EE Times’ list of Top 10 Processors for AI Acceleration at the Endpoint (https://www.eetimes.eu/top-10-processors-for-ai-acceleration-at-the-endpoint/).
Where You’ll See i.MX 8M Plus
Running ML at the edge, the i.MX 8M Plus enables voice, face, speaker and gesture recognition, object detection and segmentation, augmented reality, and anomaly detection based on environmental sensing and control.
ML and vision in embedded systems open new possibilities for seamless human-machine interaction. By executing ML algorithms at the edge, a system can analyze people’s behavior and detect facial details to estimate a person’s gender, age and even mood. Keeping machine learning data processing at the edge preserves privacy, eliminates cloud dependency and provides near-instantaneous response times.
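As a rough illustration of such a pipeline, the sketch below detects faces in a camera frame on-device and runs an attribute model on each crop. The attribute model file, its output layout and the camera index are hypothetical placeholders, not part of any NXP deliverable.

```python
# Rough sketch: detect faces locally, then estimate attributes (e.g. age/mood)
# for each face crop with a TFLite model. The attribute model and its output
# layout are hypothetical placeholders used only for illustration.
import cv2
import numpy as np
import tflite_runtime.interpreter as tflite

face_det = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

attr = tflite.Interpreter(model_path="face_attributes.tflite")  # placeholder
attr.allocate_tensors()
inp = attr.get_input_details()[0]
out = attr.get_output_details()[0]
h, w = inp["shape"][1], inp["shape"][2]

cap = cv2.VideoCapture(0)            # camera index is board-specific
ok, frame = cap.read()
if not ok:
    raise SystemExit("no camera frame")

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
for (x, y, fw, fh) in face_det.detectMultiScale(gray, 1.1, 5):
    crop = cv2.resize(frame[y:y + fh, x:x + fw], (w, h))
    attr.set_tensor(inp["index"], crop[np.newaxis].astype(inp["dtype"]))
    attr.invoke()
    print("attribute scores:", attr.get_tensor(out["index"]).flatten())
```

Nothing leaves the device in this loop, which is the point of the privacy argument above.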
The range of applications made possible with the cost-effective i.MX 8M Plus spans people and object recognition for public safety, industrial machine vision, robotics, and hand gesture and emotion detection. In one example, hand gestures enable touch-free selection from a vending machine, a capability that will become essential in a post-Covid-19 environment.
In the medical market, edge-based ML modules allow for automated, remote testing and monitoring with fast decision-making and high reliability. By monitoring data such as breathing and movement patterns, medical professionals can provide immediate help when it is needed.
There is an increasing drive in factory automation to integrate more vision-based systems, and the i.MX 8M Plus meets this demand. In these environments, ML is involved in inspecting, analyzing and acting on data at the edge. The i.MX 8M Plus’ dual camera inputs can inspect multiple angles of a product, and the ML accelerator can be trained to distinguish good product from bad. The same system can also recognize operator gestures and monitor safety parameters.
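A loose sketch of that kind of dual-camera inspection loop is shown below: frames from two cameras are each scored by a binary pass/fail classifier. The camera indices, the model file and the 0.5 decision threshold are illustrative assumptions, not a documented NXP example.

```python
# Loose sketch: two-camera visual inspection with a binary pass/fail classifier.
# Camera indices, model file and threshold are placeholders for illustration.
import cv2
import numpy as np
import tflite_runtime.interpreter as tflite

clf = tflite.Interpreter(model_path="inspection_classifier.tflite")  # placeholder
clf.allocate_tensors()
inp = clf.get_input_details()[0]
out = clf.get_output_details()[0]
h, w = inp["shape"][1], inp["shape"][2]

cams = [cv2.VideoCapture(i) for i in (0, 1)]   # two camera inputs

def score(frame):
    """Return the classifier's defect score for one camera view."""
    resized = cv2.resize(frame, (w, h))
    clf.set_tensor(inp["index"], resized[np.newaxis].astype(inp["dtype"]))
    clf.invoke()
    return float(clf.get_tensor(out["index"]).flatten()[0])

views = [cam.read()[1] for cam in cams]
if all(v is not None for v in views):
    defect = max(score(v) for v in views)      # flag the worst of the two angles
    print("REJECT" if defect > 0.5 else "PASS", defect)
```

Scoring both angles and taking the worst result is one simple way to use the dual camera inputs; a production line would typically run this in a continuous loop triggered by the conveyor.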
The Future of ML at the Edge
The need to securely process large amounts of data captured at the edge and to eliminate cloud latency is driving this shift. Learn more about the i.MX 8M Plus.