Introduction
This is the final blog in my Experimenting with Gesture Sensors design challenge journey, where I present my completed attempt at developing a USB computer mouse using gesture sensing.
I would just like to thank element14 and Maxim Integrated for the opportunity to learn about and use the MAX25405 EV kit as part of this design challenge. I put the MAX32620FTHR development board through its paces and it certainly did the job very well.
I hope those who read my blog posts will learn something new too. If you have any comments or questions, please let me know.
During this journey I presented my design concept in my first blog and then documented a number of experiments along the way.
Blog #1: Introduction to Gesture Mouse
Experiment #1: Gesture Mouse USB Driver Test
Gesture Mouse Experiment #2: Using the Windows GUI Demo Software to test different gestures
Gesture Mouse Experiment #3: Polling the Gesture Sensor EV Kit using the default Serial API
Experiment 4: Making sense of the Gesture Algorithms found in the MAX25405 Firmware Framework
Experiment 5: Using the Processing IDE desktop application to help evaluate Centre of Mass values
Overall this was certainly an enjoyable learning-by-doing exercise on a number of levels, from starting with the basics of gesture sensing and then through to incorporating the sensor algorithms into a new embedded application using the MAX32620FTHR development board.
Whilst I never quite achieved my initial objective of developing a number of specific USB mouse gestures, due to the problems encountered with the firmware framework provided by Maxim Integrated and general limitations of the sensor itself, the experimentation actually helped reveal where this sensor is probably better suited.
In my opinion, the gesture sensor is reminiscent of mobile phone touch screens in their early years: it requires considerable testing and parameter tuning to improve performance. I think this sensor would work really well in an industrial setting, as it takes time to learn how to trigger gestures consistently, and applications requiring specific single-motion gestures are more likely to be found in industrial/manufacturing environments.
USB Mouse Movement
Getting the mouse pointer to move across the screen is fundamental, so I wanted to tick this off the list first.
In order to create my USB mouse I had to rely on the low-level algorithms provided within the firmware framework, as the USB serial output with its calculated gesture results would not work for my application.
Thankfully I had spent a good amount of time working through the firmware framework and developing my own basic sensor library using Mbed OS 6.16.
Once I had incorporated the Centre of Mass calculations into my MAX32620FTHR embedded library, it turned out to be very straightforward to convert the calculated Centre-of-Mass x and y values into absolute mouse coordinates. The two coordinate systems are much the same: the top left of a computer screen is 0,0 and the bottom right is the maximum x and y values.
Another benefit is that the sensor's 10 x 6 pixel configuration is quite similar to the aspect ratio of a computer screen. So while this makes it compatible, you are still limited to using absolute coordinates, which might not always be suitable for all applications.
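To illustrate the mapping, here is a minimal sketch (com_to_abs is just an illustrative helper, not part of my library). Mbed's USBMouse in ABS_MOUSE mode expects absolute coordinates in the range 0 to 32767, with 0,0 at the top left, which matches the orientation of the sensor's Centre-of-Mass output:

// Sketch of the CoM-to-absolute-coordinate mapping (illustrative helper).
// In ABS_MOUSE mode the USBMouse x,y values run from 0 to 32767, with
// 0,0 at the top left - the same orientation as the CoM output, so a
// linear scale plus offset is all that is needed.
int16_t com_to_abs(float com) {
    // 1800 and 200 are the scale/offset values I settled on through testing;
    // they map the CoM range onto a usable portion of the screen
    return (int16_t)(com * 1800.0f) + 200;
}

// e.g. mouse.move(com_to_abs(gesture_1.dynamicResult.cmx),
//                 com_to_abs(gesture_1.dynamicResult.cmy));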
A new code update is provided in my GitHub repository: https://github.com/Gerriko/Max25x05_MbedOS6
Here are the changes made in main.cpp to handle mouse movements:
while (true) {
    // If using INTB interrupt, the sensorDataReadyFlag will be set when the end-of-conversion occurs
    if (max25x_1.sensorDataReadyFlag) {
        // Check the interrupt pin to see if PWRON was set [[TODO]]
        uint8_t newIntVal = 0;

        max25x_1.getSensorPixelInts(gesture_1.pixels, false);
        gesture_1.processGesture(WINDOW_FILTER_ALPHA, gesture_1.GEST_DYNAMIC);
        //serial.printf("%u, %d, %d, %d, %d\r\n", gesture_1.dynamicResult.state, (int)(gesture_1.dynamicResult.cmx*100.0),
        //    (int)(gesture_1.dynamicResult.cmy*100.0), (int)sqrt((double)gesture_1.dynamicResult.CoM_Intensity), gesture_1.dynamicResult.maxpixel);

        if (gesture_1.dynamicResult.state == 1 && gesture_1.dynamicResult.cmx >= 0 && gesture_1.dynamicResult.cmy >= 0) {
            // Mouse movements
            int threshold = (int)sqrt((double)gesture_1.dynamicResult.CoM_Intensity);
            if (threshold > 50) {
                int16_t x = (int16_t)(gesture_1.dynamicResult.cmx*1800)+200;
                int16_t y = (int16_t)(gesture_1.dynamicResult.cmy*1800)+200;
                mouse.move(x, y);
            }
        }
        memset(gesture_1.pixels, '\0', NUM_SENSOR_PIXELS);
        max25x_1.sensorDataReadyFlag = false;
    }
}
And here is my video demo showing USB mouse movement using the gesture sensor. For demo purposes I created a desktop application to highlight mouse pointer position. Notice how everything gravitates towards the top left of the screen.
USB Mouse Actions
Unfortunately, due to the omission of the gesture algorithms from the firmware framework, I would have had to reinvent the wheel to develop my own, and I did not have the time to do this properly. I had made a start, and I had a good idea in my head of how I would approach it, but there was not enough time to complete it.
So here is very much a first attempt at using gestures to trigger mouse events. The short video below shows mouse click events.
Here I demonstrate my very basic algorithm for determining gesture events. I had ascertained that I could use the Centre of Mass "intensity" value to determine whether or not a click gesture had occurred. Note that "intensity" is my own term for this value from the Centre of Mass calculations. I reasoned that it must have been captured for a reason in the Firmware Framework algorithms, even though it was never used there. It shows a very good correlation between click gestures and peak values, although it was not clear what threshold to use. As you will see in the video, it is very hit or miss: either it does not detect the gesture or it gives a false positive. No doubt this could be improved upon through further testing.
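To make the click test concrete, here it is pulled out in isolation (the struct and function are illustrative, not part of my library; the field names match what I use from my gesture library's dynamicResult, and the threshold of 1300 is simply the value I landed on through trial and error):

#include <cstdint>
#include <cmath>

// Illustrative mirror of the dynamicResult fields used for the click test
struct ComResult {
    uint8_t state;          // 1 = target detected
    float   cmx, cmy;       // Centre of Mass coordinates
    float   CoM_Intensity;  // the unused "intensity" value from the framework
};

// A click is flagged when the intensity peak exceeds a tuned threshold
bool isClickGesture(const ComResult &r, int clickThreshold = 1300) {
    int intensity = (int)sqrt((double)r.CoM_Intensity);
    return (r.state == 1 && intensity > clickThreshold);
}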
I also decided to use some very rudimentary rules of thumb to determine gestures, once I had some sample data for the Centre of Mass (CoM) x and y values. Here is an x-y scatter plot showing the correlation between CoM x-y values and the position of my hand relative to the gesture sensor. This helped me determine threshold values to detect whether my hand moved left-to-right or right-to-left, and top-to-bottom or bottom-to-top.
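Distilled down, these rules of thumb amount to bucketing two consecutive scaled CoM samples (the cm value multiplied by 1000, as in the code further down) into Left/middle/Right and Top/middle/Bottom bands around a single threshold of 4000. A sketch of that banding (bandOf is an illustrative helper, not in my library):

// Both consecutive samples must sit on the same side of the 4000
// threshold to count as a definite band; anything else is "middle" ('m')
char bandOf(int16_t v0, int16_t v1, char lowLabel, char highLabel) {
    if (v0 < 4000 && v1 < 4000) return lowLabel;    // e.g. 'L' or 'T'
    if (v0 > 4000 && v1 > 4000) return highLabel;   // e.g. 'R' or 'B'
    return 'm';                                     // mixed / in between
}

// e.g. Hmove[2] = bandOf(x[0], x[1], 'L', 'R');
//      Vmove[2] = bandOf(y[0], y[1], 'T', 'B');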
Here is the main.cpp code I used for the gesture movements to create mouse click events. It uses the same libraries as above. Within the code you will see plenty of threshold values being used, which is not very reliable (see video). I also inserted a timer to create a 1 second delay before another gesture movement can be detected. This too could be optimised.
/* mbed Microcontroller Library
 * Copyright (c) 2019 ARM Limited
 * SPDX-License-Identifier: Apache-2.0
 *
 * Application: Gesture Mouse Library test program BETA
 * Author: C Gerrish @December 2022
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
 * OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
 * MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
 * IN NO EVENT SHALL MAXIM INTEGRATED BE LIABLE FOR ANY CLAIM, DAMAGES
 * OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
 * ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
 * OTHER DEALINGS IN THE SOFTWARE.
 *
 * Microcontroller used: MAX32620FTHR
 * Target sensor: MAX25405 Gesture Sensor Kit
 *
 * Except as contained in this notice, the name of Maxim Integrated
 * Products, Inc. shall not be used except as stated in the Maxim Integrated
 * Products, Inc. Branding Policy.
 *
 * The mere transfer of this software does not imply any licenses
 * of trade secrets, proprietary technology, copyrights, patents,
 * trademarks, maskwork rights, or any other form of intellectual
 * property whatsoever. Maxim Integrated Products, Inc. retains all
 * ownership rights.
 */

#include "mbed.h"
//#include "USBSerial.h"
#include "USBMouse.h"
#include <cmath>

// Uncomment one device option for library to know which sensor is attached
//#define MAX25205_DEVICE
#define MAX25405_DEVICE

#include "MAX25x05.h"
#include "gesture_lib.h"

#define USE_SPI 1

#if USE_SPI
#include <MAX25x05_SPI.h>

SPI MAXspi_1(P5_1, P5_2, P5_0);     // mosi, miso, sclk
// The default settings of the SPI interface are 1MHz, 8-bit, Mode 0.
// The max SPI frequency for MAX25405 is 6MHz - in the library 2MHz (2e6) will be used
MAX25x05_SPI MAXIObus_1(MAXspi_1, 2e6, P5_5);

DigitalOut selPin(P3_2, 0);         // SEL pin is set low to indicate to MAX25x05 that it's to use SPI mode
#else
#include <MAX25x05_I2C.h>

I2C MAXi2c_1(P3_4, P3_5);           // sda, scl
// The max I2C frequency for MAX25405 is 400k - in the library 100k (100e3) will be used
MAX25x05_I2C MAXIObus_1(MAXi2c_1, 100e3, P5_5, 0);

DigitalOut selPin(P3_2, 1);         // SEL pin is set high to indicate to MAX25x05 that it's to use I2C mode
#endif

// Bytes prefixed to pixel data to create data frame
const int NUM_INFO_BYTES = 40;

volatile bool CanUpdate = true;

void flip() {
    CanUpdate = true;
}

int main()
{
    //setup USB Serial comms for configuration option
    //USBSerial serial(true, 0x0b6a, 0x4360, 0x0001);
    //serial.set_blocking (true);

    USBMouse mouse(true, ABS_MOUSE, 0x0b6a, 0x4360, 0x0001);

    MAX25x05 max25x_1(MAXIObus_1, P5_3);      // Interrupt pin for sensor 1
    //MAX25x05 max25x_2(MAXIObus_2, P3_3);    // Interrupt pin for sensor 2

    // Use the gesture library to manipulate/prepare pixels for output
    // ------------------------------------------------------------------
    gesture_lib gesture_1(SENSOR_COLS, SENSOR_ROWS);
    //int16_t pixels[NUM_SENSOR_PIXELS] = {'\0'};

    // Note if using 2 gesture sensors then the LED timings need to change [TODO]
    max25x_1.set_default_register_settings();    // Define for sensor number 1
    //max25x_2.set_default_register_settings();  // Define for sensor number 2

    max25x_1.enable_read_sensor_frames();
    //max25x_2.enable_read_sensor_frames();

    Timeout NextCheck;

    int16_t x[2] = {0};
    int16_t y[2] = {0};
    char Vmove[3] = {'\0'};
    char Hmove[3] = {'\0'};
    uint8_t Mbtn = MOUSE_LEFT;

    while (true) {
        // If using INTB interrupt, the sensorDataReadyFlag will be set when the end-of-conversion occurs
        if (max25x_1.sensorDataReadyFlag) {
            // Check the interrupt pin to see if PWRON was set [[TODO]]
            uint8_t newIntVal = 0;

            max25x_1.getSensorPixelInts(gesture_1.pixels, false);
            gesture_1.processGesture(WINDOW_FILTER_ALPHA, gesture_1.GEST_DYNAMIC);
            //serial.printf("%u, %d, %d, %d, %d\r\n", gesture_1.dynamicResult.state, (int)(gesture_1.dynamicResult.cmx*100.0),
            //    (int)(gesture_1.dynamicResult.cmy*100.0), (int)sqrt((double)gesture_1.dynamicResult.CoM_Intensity), gesture_1.dynamicResult.maxpixel);

            if (gesture_1.dynamicResult.state == 1) {
                // Mouse movements
                int threshold = (int)sqrt((double)gesture_1.dynamicResult.CoM_Intensity);

                x[0] = x[1];
                y[0] = y[1];
                x[1] = (int16_t)(gesture_1.dynamicResult.cmx*1000.0);
                y[1] = (int16_t)(gesture_1.dynamicResult.cmy*1000.0);
                for (uint8_t i = 0; i < 2; i++) {
                    Vmove[i] = Vmove[i+1];
                    Hmove[i] = Hmove[i+1];
                }

                if (x[0] != 0 || y[0] != 0) {
                    if (x[0] < 4000 && x[1] < 4000) {
                        if (y[0] < 4000 && y[1] < 4000) {
                            //serial.printf("T-L\r\n");
                            Vmove[2] = 'T'; Hmove[2] = 'L';
                        }
                        else if (y[0] > 4000 && y[1] > 4000) {
                            //serial.printf("B-L\r\n");
                            Vmove[2] = 'B'; Hmove[2] = 'L';
                        }
                        else {
                            //serial.printf("M-L\r\n");
                            Vmove[2] = 'm'; Hmove[2] = 'L';
                        }
                    }
                    else if (x[0] > 4000 && x[1] > 4000) {
                        if (y[0] < 4000 && y[1] < 4000) {
                            //serial.printf("T-R\r\n");
                            Vmove[2] = 'T'; Hmove[2] = 'R';
                        }
                        else if (y[0] > 4000 && y[1] > 4000) {
                            //serial.printf("B-R\r\n");
                            Vmove[2] = 'B'; Hmove[2] = 'R';
                        }
                        else {
                            //serial.printf("M-R\r\n");
                            Vmove[2] = 'm'; Hmove[2] = 'R';
                        }
                    }
                    else {
                        if (y[0] < 4000 && y[1] < 4000) {
                            //serial.printf("T-M\r\n");
                            Vmove[2] = 'T'; Hmove[2] = 'm';
                        }
                        else if (y[0] > 4000 && y[1] > 4000) {
                            //serial.printf("B-M\r\n");
                            Vmove[2] = 'B'; Hmove[2] = 'm';
                        }
                        else {
                            //serial.printf("M-M\r\n");
                            Vmove[2] = 'm'; Hmove[2] = 'm';
                        }
                    }

                    if (Vmove[0] > '\0' && Hmove[0] > '\0' && Vmove[0] != 'm' && Hmove[0] != 'm' &&
                        (Vmove[1] == 'm' || Hmove[1] == 'm') && Vmove[2] != 'm' && Hmove[2] != 'm') {
                        if (CanUpdate) {
                            if (Vmove[1] == 'm') {
                                if (Vmove[0] == 'B') {
                                    //serial.printf("UP\r\n");
                                    Mbtn = MOUSE_RIGHT;
                                }
                                else {
                                    //serial.printf("DOWN\r\n");
                                    Mbtn = MOUSE_RIGHT;
                                }
                            }
                            else {
                                if (Hmove[0] == 'L') {
                                    //serial.printf("RIGHT\r\n");
                                    Mbtn = MOUSE_LEFT;
                                }
                                else {
                                    //serial.printf("LEFT\r\n");
                                    Mbtn = MOUSE_LEFT;
                                }
                            }
                            //serial.printf("V: %c-%c-%c | H: %c-%c-%c\r\n", Vmove[0], Vmove[1], Vmove[2], Hmove[0], Hmove[1], Hmove[2]);
                            CanUpdate = false;
                            NextCheck.attach(&flip, 1s);
                        }
                    }
                    //serial.printf("%d, %d\r\n",(x[0]+x[1])/2,(y[0]+y[1])/2);
                }

                if (threshold > 1300) {
                    //mouse.move(x, y);
                    if (CanUpdate) {
                        //serial.printf("CLICK!\r\n");
                        mouse.click(Mbtn);
                        CanUpdate = false;
                        NextCheck.attach(&flip, 1s);
                    }
                }
            }
            else {
                x[0] = x[1];
                y[0] = y[1];
                x[1] = 0;
                y[1] = 0;
                for (uint8_t i = 0; i < 2; i++) {
                    Vmove[i] = Vmove[i+1];
                    Hmove[i] = Hmove[i+1];
                }
                Vmove[2] = '\0';
                Hmove[2] = '\0';
            }
            memset(gesture_1.pixels, '\0', NUM_SENSOR_PIXELS);
            max25x_1.sensorDataReadyFlag = false;
        }
    }
}
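Since the 1 second lockout is easy to miss in the code above, here it is restated in isolation (fireEventOnce is just an illustrative wrapper, not in the actual code):

#include "mbed.h"

volatile bool CanUpdate = true;
Timeout NextCheck;

void flip() { CanUpdate = true; }

// Any mouse event goes through this gate: once fired, further events are
// blocked until the Timeout re-arms the flag one second later
void fireEventOnce() {
    if (CanUpdate) {
        // ... trigger mouse.click(Mbtn) or similar here ...
        CanUpdate = false;
        NextCheck.attach(&flip, 1s);
    }
}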
As you can hopefully see from the video, this is where it gets tricky using just one sensor to handle everything: you quickly hit limitations. For example, handling a click-and-release versus a click-and-drag movement is quite difficult to implement without the use of a button or another proximity sensor. Furthermore, you cannot reliably create left and right button events, as the video hopefully demonstrates (I edited out a few missed gesture events to shorten it).
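For what it's worth, the USB side of click-and-drag is straightforward, as Mbed's USBMouse provides press() and release() alongside click(). What is missing is a reliable "hold" signal from the sensor. A hypothetical sketch, assuming a holdDetected signal existed (e.g. from a second proximity sensor or a dwell-detection algorithm):

#include "USBMouse.h"

bool dragging = false;

// Hypothetical drag handler - holdDetected would need to come from a
// second sensor or a dwell detector, which I did not have
void onGesture(USBMouse &mouse, bool holdDetected, int16_t absX, int16_t absY) {
    if (holdDetected && !dragging) {
        mouse.press(MOUSE_LEFT);        // button down: start the drag
        dragging = true;
    }
    else if (!holdDetected && dragging) {
        mouse.release(MOUSE_LEFT);      // button up: end the drag
        dragging = false;
    }
    if (dragging) {
        mouse.move(absX, absY);         // drag the pointer while held
    }
}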
Note that for the above video, I used a small Processing desktop app to capture mouse click events. Here is the code:
int BtnMode = 0;
long t_now = 0;

void setup() {
  size(600, 600);
  fill(100);
  stroke(100);
  textSize(48);
}

void draw() {
  background(0);
  rect(200, 100, 200, 400);
  if (BtnMode == 1) {
    text("LEFT BUTTON CLICK", 60, 80);
  }
  if (BtnMode == 2) {
    text("RIGHT BUTTON CLICK", 20, 80);
  }
  if (t_now > 0) {
    if (millis() - t_now > 2000) {
      t_now = 0;
      fill(100);
      text("", 20, 80);
      BtnMode = 0;
    }
  }
}

void mouseClicked() {
  if (mouseButton == LEFT) {
    fill(100, 0, 0);
    BtnMode = 1;
    t_now = millis();
  }
  else if (mouseButton == RIGHT) {
    fill(0, 100, 0);
    BtnMode = 2;
    t_now = millis();
  }
}
So, no doubt, with a bit of thought it is still very much possible to create software-application-specific gesture movements with the MAX25405 sensor.
I think that, much like the early capacitive touch screens on phones, the software handling movement on these sensors needs bedding down and enhancement over time; those early screens were prone to error too. The MAX25405 is a great and very useful gesture sensor, and once the right type of application is found it will just need adequate development and testing time to get the software implementation working well.