While we wait for the judges to deliberate on things, I wanted to reflect on the project and give some updates on what I've been doing since the contest officially closed.
Ways to improve:
Here are some ideas I've had, or that have been suggested in comments or by people I discussed the project with -
- Fire Extinguisher - My first post had a comment about adding a fire extinguisher. I think this is a very cool idea, but it would be a much longer-term undertaking. I would need to do a lot of research and testing to validate that I can detect real, dangerous flames versus a literal "flash in the pan".
- Shielding the sensors and Raspberry Pi - I would like to do a lot more to protect the sensors in the nasty environment above the stove.
- Conformal coating - always a good idea for all the circuit boards in this setting. Maybe I'll just use Nail Polish...
- Glass/windows for the camera & GridEye - I spent some time researching what I would need to keep the GridEye protected, but it would be very expensive. I also found that the little slots in the metal insert would cut off the field of view on the left and right sides of the cameras if they were too far recessed. I had to make new dual camera mounts which pushed the cameras down closer to the stove so their views were not restricted.
- Here is a link to one optical window I was looking at from Edmund Optics - the cheapest option is $155 for a 12mm circular window of the stuff. Too much for this project! But that is at least the "correct" material that should be used.
Code updates:
Rate-limiting data throughput
One thing which can be noted in blog post #7 is how the UI wasn't reporting the spike in humidity as I breathed on the sensor. The system reacts properly by activating the fans, but the UI never responds with the current reading. The code at that time used a simple "rate limit" node which only passes a reading after a given time period. This was the "quick and dirty" way of getting the system running, but not the most elegant one. The algorithm itself is constantly polling the sensors (10 times/second in other applications I've done), but we can't just throw all of this data into a UI or log it in a database, since /those/ use cases are about "long term trends" and "big changes".

What I've done with Arduino in the past was to code out a special type of rate limit - only pass data every X minutes, unless the process variable differs from the previously reported value by more than some threshold N. This is the magic key to keeping the algorithm running at full speed, keeping long-term data without wasting space or overhead, and still capturing fast-moving variables which would otherwise be masked by the reporting window. This is somewhat related to the Nyquist theorem - be sure you can read the data fast enough for all use cases. I wasn't sure what the best way of doing this would be in Node-RED, so I was leaving this part until later in the project so I could focus on the more major parts. A rough sketch of the idea is shown below.
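To make the idea concrete, here is a minimal sketch of that "report by exception" logic written as a Node-RED function node. This is an illustration rather than the code from my actual flows - the five-minute window, 5.00 degree threshold, and context key name are all placeholders:

```javascript
// Sketch of a "report by exception" rate limit in a function node.
// Pass the reading through if enough time has elapsed since the last
// report, OR if the value has moved by more than the threshold.
const REPORT_MS = 5 * 60 * 1000; // report at least every five minutes...
const THRESHOLD = 5.0;           // ...or on any 5.00 degree change

const last  = context.get("lastReport") || { value: null, time: 0 };
const now   = Date.now();
const value = msg.payload;

const timedOut  = (now - last.time) >= REPORT_MS;
const bigChange = last.value !== null &&
                  Math.abs(value - last.value) > THRESHOLD;

if (timedOut || bigChange) {
    // Remember what we reported and reset the timer
    context.set("lastReport", { value: value, time: now });
    return msg;  // pass the reading on to the chart / MQTT broker
}
return null;     // drop it - the control algorithm already saw it upstream
```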
This image shows how the GridEye /was/ being read - we inject a timestamp every X seconds, read out the sensor, then find the temperatures as seven individual datapoints. The problem with this was that the Temperatures Chart was getting 7 updates every X seconds and trying to refresh the UI each time. A high read rate makes the algorithm responsive but crushes the UI; a slow read rate keeps the UI responsive but kills the responsiveness of the algorithm.
What I found was that there is a built-in node which does basically this exact thing - the Delta node. It keeps track of the timestamp and data payload of the previous packet which it allowed through. Here are the settings I used for the Max Temperature Delta node:
This passes a single payload every five minutes regardless of its numerical value. It will also pass a payload and reset the timer if the incoming payload is more than 5.00 degrees different from the previously passed value. The /more/ final flow is shown below.
This simple change gives the best of both worlds - I can crank up the read rate on all the sensors so the system is responsive, and the UI will stay responsive since it isn't getting flooded with updates. I added the Delta node to all the sensor flows where they either report to the MQTT broker or push data into the Dashboard.
Physical buttons
With the terminal loss of the touchscreen in the final week of the contest, I lost the ability to do and test many of the things I wanted. The buttons appeared to be working - at least the interrupt was firing when I pushed them - but I couldn't get any sample code from 4D Systems to run properly to read them. I got so desperate that I finally asked on their forum, and they were very responsive and helpful with answering my questions and posting (or attempting to post) a known working copy of the sample code. When the new screen arrived, I realized that the sample code was never going to run properly since the broken screen wasn't being correctly seen by the OS - specifically, I kept getting error messages about /dev/fb1, which was the frame buffer for the screen. The code worked properly on the new screen, but between the official close of the contest and my family coming to visit that weekend, I never got the physical buttons working.

Since I've gone down from using 300% of my free time on this to a more manageable 10-15% of my free time (which I don't really have anyways), I've only played a little with the physical buttons. They are definitely in the works and will be implemented shortly. For now, all the functions are controlled from the touchscreen. The truth is that the hood does a very good job of running itself, turning on and off without /any/ human interaction. The lights have been added to OpenHAB on my cell phone and our tablet interface in the kitchen as well.
User Interface & usability
Once again, with the inability to do much full-system testing with the UI on the touchscreen, I didn't consider that there were going to be performance issues. Once the new screen was in, it technically worked, but was laggy and sometimes unresponsive. After some research, and applying a fix for what I suspected to be a related problem, I was able to resolve this. There were three issues: the limited processing power of the Pi versus a PC; the extra oomph required by the software frame buffer for the screen; and, the biggest elephant in the room, my gross misuse of the graphs in the Node-RED dashboard. All my testing with the graphs was done either on my phone, as shown in the video in blog post #12, or on a regular computer, both of which have much faster processors than the Pi. The problem, which was directly related to the previous section in this post, was that I was abusively shoveling data at the graphs, forcing a UI refresh essentially constantly. As far as programming goes, we should all know that refreshing the screen is a very processor-intensive task. By this point in the testing, I had about 6 graphs with loads and loads of data sets asynchronously polling sensors and pushing updates. See the section above for how I resolved this and cleared up another coding issue.
Other sensors:
There are other sensors I would have liked to add, or have considered adding, throughout this project.
Ambient light sensor - I was very close to adding a photo-resistor and reading/reporting ambient light. This would/should be included in a commercial product. I had also considered using the visible light camera to take periodic images and read average lightness values. The best use case of this is for backlight control on the 4D Systems screen. See also below...
Motion sensor - This could have a few uses: a night light when we walk by and the room is dark, or automatically turning on the lights above the stove as needed. I could even use the visible light camera as a motion sensor for the area above the stove, then a different PIR sensor for the general kitchen area near the stove. See above. I actually started looking at using a Sharp IR distance sensor for this use case, but it has an analog output and I didn't have a free analog input. The alternative was to use a simple comparator circuit that would feed a digital input when a person was close to the stove, but that was too far out of scope for the timeline.
Backlight control - Both of the above could also drive backlight control on the touchscreen. For now I may use an external ambient light sensor that I have embedded in the tile backsplash. A rough sketch of the camera-based approach is shown below.
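For illustration, a camera-based version could be as simple as averaging the pixels of a periodic snapshot. This is purely a hypothetical sketch (nothing I've actually wired in) - it assumes a Node-RED function node is handed an 8-bit grayscale frame as a raw Buffer in msg.payload, and the mapping to a backlight percentage is a placeholder that would need tuning on the real screen:

```javascript
// Hypothetical sketch: estimate ambient light from a camera frame.
// Assumes msg.payload is a raw 8-bit grayscale image as a Buffer;
// capturing and converting the frame is out of scope here.
let sum = 0;
for (let i = 0; i < msg.payload.length; i++) {
    sum += msg.payload[i];
}
const avg = sum / msg.payload.length;  // 0 (dark) .. 255 (bright)

// Map the average lightness to a 10-100% backlight level.
msg.payload = Math.round(Math.min(100, Math.max(10, (avg / 255) * 100)));
return msg;
```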
PCB revisions - I would like to revise both PCBs. The main PCB should be drastically reduced in size, since it is technically too large to fit nicely inside the hood. I also missed wiring one trace and had a physical interference, both of which should be resolved. The MQ board had the pin headers reversed for the air sensors, so the heater and sense lines were swapped and I had to re-pin the connector to make up for it.
Along with that, using a Pi Zero W would also reduce the required real estate for the control board package.
A Self-Contained system
Another objective I had was that the system should be as self-contained as possible. I actually had to rely on this today - I was rebooting my server, which took down the MQTT broker, the OpenHAB server, and a few other key elements of the home automation system. Because we have kids, this simple operation which should have taken 10 minutes (installing updates) actually took three hours. Since I had designed the system to run stand-alone, none of this bothered the range hood. My wife was boiling some baby bottles on the stove and the GridEye was happily running the fan like nothing was wrong. When the server came back online, Node-RED reconnected seamlessly.
Standing on the shoulders of Giants
Not much of this would have ever been possible without the numerous people who have open-sourced their code and published it for general consumption. With very minimal effort, I was able to use a camera, an ADC sensor, a thermal camera via I2C nodes, a DHT temp/humidity sensor... It made this system much easier to implement than having to manually code out all of those communication protocols.