Alternatively, we can modify the coastal platform to watch over corals underwater instead of looking for boats in the surrounding area. The board we use should be capable of computer vision, and as usual we're going to use the Raspberry Pi for this.
We could train an ML model such as YOLO with Keras and TensorFlow, but another way to do it is with Edge Impulse. Edge Impulse lets you train tiny machine learning models and deploy them inside an IoT device.
The first step in building the coral bleaching classification system is to gather images of healthy corals and bleached corals. I cleaned the data by getting rid of unrelated images such as cartoons, icons, and graphs of corals.
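The cleaning step can be partly automated. Here's a minimal sketch of the idea, assuming the scraped files carry hints of their content in the filename (the `clean_dataset` helper and the keyword list are hypothetical, not part of Edge Impulse):

```python
from pathlib import Path
from collections import defaultdict

# Hypothetical keywords that flag non-photo images (cartoons, icons, graphs).
UNRELATED_KEYWORDS = ("cartoon", "icon", "graph", "chart", "clipart")

def clean_dataset(image_dir):
    """Partition image files into 'keep' and 'discard' buckets by filename."""
    buckets = defaultdict(list)
    for path in sorted(Path(image_dir).glob("*.jpg")):
        name = path.stem.lower()
        label = "discard" if any(k in name for k in UNRELATED_KEYWORDS) else "keep"
        buckets[label].append(path.name)
    return dict(buckets)
```

In practice I still eyeballed the dataset afterwards, since plenty of unrelated images have perfectly ordinary filenames.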
After gathering the images, we can train the model; Edge Impulse makes creating these models easy. The trained model reached 77% accuracy, which is respectable considering that corals come in many shapes and sizes. On top of that, a bleached coral's color ranges from white to brown (the brown coming from algae covering the coral), so it can easily be mistaken for the surroundings or the seabed.
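Edge Impulse reports this accuracy figure in its dashboard after training. As a rough sketch of how such a number comes out of a labelled test set (the `evaluate` helper and the sample labels below are illustrative, not from the actual project data):

```python
def evaluate(y_true, y_pred, classes=("healthy", "bleached")):
    """Compute overall and per-class accuracy for a two-class coral model."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    overall = correct / len(y_true)
    per_class = {}
    for c in classes:
        idx = [i for i, t in enumerate(y_true) if t == c]
        if idx:
            per_class[c] = sum(y_pred[i] == c for i in idx) / len(idx)
    return overall, per_class
```

Looking at per-class accuracy (not just the overall number) is what reveals whether the model confuses bleached corals with the background more often than healthy ones.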
Deploying the machine learning model to a device such as the Raspberry Pi is done through WebAssembly. WebAssembly packages the custom machine learning model so it can run on your choice of hardware.
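Once the model is running on the Pi, each inference comes back as per-class scores. A minimal sketch of turning those scores into a monitoring decision (the `interpret` helper, the score dictionary shape, and the 0.6 threshold are all assumptions to be tuned against field data, not part of the Edge Impulse output format):

```python
# Assumed alert threshold; tune against real underwater footage.
BLEACH_THRESHOLD = 0.6

def interpret(scores, threshold=BLEACH_THRESHOLD):
    """Turn raw per-class scores (label -> probability) into a decision."""
    label = max(scores, key=scores.get)
    if label == "bleached" and scores[label] >= threshold:
        return "ALERT: possible bleaching"
    return f"ok: {label} ({scores[label]:.0%})"
```

Requiring a confidence threshold before raising an alert helps with exactly the failure mode above, where a brownish bleached coral blends into the seabed and the model's confidence drops.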
Limitation: this setup requires an active internet connection, and the floating platform currently lacks the components needed to maintain such a connection over long distances. I would love to incorporate this in the future as the project evolves.
Check out Edge Impulse.
Code repo for deploying the ML model on the Raspberry Pi via balenaCloud.