Artificial Intelligence has the potential to change many of the things we do. Take the poll and let us know how you think AI will be deployed in embedded applications, and please tell us why in the Comments section below!
Mostly at sensors/nodes using dedicated AI processors
This option gets my vote because it is the next logical step. Microcontrollers designed for embedded edge applications already have neural network accelerators built into them, and the current trend is to implement the accelerator as a separate block within the MCU that can operate without turning on the high-performance, high-power-consuming general-purpose processing cores.
So if that AI/ML work can be moved to the sensor itself, the high-performance microcontroller could run even less often. Granted, everything depends on the application. For example, maybe the sensor nodes run on battery power while the computing node has constant power. So, like all things in engineering, there is no one answer for all situations.
Along those lines, I think we sometimes blur the line between Artificial Intelligence and Machine Learning. Frankly, much of the "Artificial Intelligence" in edge computing devices is pattern matching based on a trained neural network, which is machine learning, not decision-making. So there isn't much "intelligence," especially in edge applications.
I make that point because it explains why I see the next step as moving ML inference to the sensor node itself. The sensor then becomes smart enough to transmit a result only when the data fits the model. Again, this method won't make sense in all applications, but I could see where it might in some.
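To make that "smart sensor" idea concrete, here is a minimal sketch of what such a node's firmware loop might look like. Everything in it is hypothetical (sensor_read_window(), npu_run_inference(), radio_send(), and so on stand in for whatever the silicon vendor's HAL and NN runtime actually provide); the point is only that the raw samples never leave the node, and the radio powers up only when the local model matches a pattern with enough confidence.

```c
/* Hypothetical sensor-node firmware loop: sample locally, run the trained
 * model on the node's NN accelerator, and only power up the radio when the
 * inference result crosses a confidence threshold. The extern functions are
 * stand-ins for a vendor HAL / NN runtime, not a real API. */

#include <stdint.h>

#define WINDOW_SAMPLES   256      /* one inference window of raw sensor data */
#define CONFIDENCE_MIN   0.85f    /* only report detections above this score */

typedef struct {
    uint8_t class_id;             /* which trained pattern was matched */
    float   confidence;           /* model's score for that pattern    */
} inference_result_t;

/* --- hypothetical hardware/runtime hooks (provided by the BSP/vendor) --- */
extern void sensor_read_window(int16_t *buf, uint32_t n);        /* blocking read     */
extern inference_result_t npu_run_inference(const int16_t *buf, uint32_t n);
extern void radio_send(const void *payload, uint32_t len);        /* wakes radio, sends */
extern void enter_low_power_until_next_window(void);              /* sleep between runs */

void sensor_node_main_loop(void)
{
    int16_t window[WINDOW_SAMPLES];

    for (;;) {
        /* Raw data stays on the node: fill one window from the sensor. */
        sensor_read_window(window, WINDOW_SAMPLES);

        /* Pattern matching against the trained network on the local NPU,
         * without waking any high-power application core. */
        inference_result_t r = npu_run_inference(window, WINDOW_SAMPLES);

        /* Transmit only a tiny result, and only when the model is confident.
         * Most of the time nothing matches and the radio never turns on. */
        if (r.confidence >= CONFIDENCE_MIN) {
            radio_send(&r, sizeof(r));
        }

        enter_low_power_until_next_window();
    }
}
```

Whether this actually saves power comes back to the application: the duty cycle, the radio, and how often the model fires all matter.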
I too think this will definitely grow. It is already deployed today; for example, companies like PointGrab advertise that their sensors run AI/ML on-device. Although it isn't necessarily on the actual "sensor chip" (they are most likely using a standard thermal or imaging sensor), it is in the "sensor product", i.e. in the sensor node device, with an in-device AI processor or software.
Some other manufacturers have no choice but to do it in-device, since they cannot afford to send large quantities of data wirelessly on battery power. They need to make the inference on-device.
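To put rough, purely illustrative numbers on that: a 3-axis accelerometer sampled at 100 Hz with 16-bit samples produces 3 x 2 x 100 = 600 bytes of raw data every second, continuously, whereas an on-device classifier might only need to send a one- or two-byte event code a few times an hour. On a battery-powered node where the radio is often the most power-hungry block, that difference in airtime is exactly why the inference has to happen locally.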
One thing that will probably slow "sensor-and-AI-on-a-single-chip" a bit (although I'm sure some already exist) is the combination of layoffs, the economy, demand on chip fabs, and so on. It may be a tough few years :(