Prelude
Disability due to illness or aging should not limit the ability to express feelings and emotions, and one of the channels through which our soul most readily communicates is art.
People with a disability have several ways to express their art, but the most impressive form is, in my opinion, the visual arts, like painting and drawing.
In this challenge, I'd like to give everybody the opportunity to draw.
Architecture
The project will be made up of two main components:
- an eye tracker device
- a robot that can draw on a drawing board
The eye tracker continuously reads the movements of the artist's eyes.
The movements are sent to the robot, which carries a pen around the drawing surface.
Eye tracker
There are many ways to implement the eye tracker.
As the first option, I'd like to implement a camera-based eye tracker, which detects eye movements by means of a camera.
There are plenty of implementations of eye-tracking algorithms. I will start from PyGaze, which will run on a Raspberry Pi board.
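As a rough illustration, a minimal polling loop using the PyGaze API might look like the sketch below. The tracker backend selected through PyGaze's settings and the send_to_robot helper are assumptions of mine, not part of PyGaze itself.

```python
# Minimal sketch: poll gaze positions with PyGaze and forward them to the robot.
# The concrete tracker backend and send_to_robot() are hypothetical placeholders.
from pygaze.display import Display
from pygaze.eyetracker import EyeTracker

def send_to_robot(x, y):
    # Placeholder: here the gaze coordinates would be translated into
    # pen movements and sent to the drawing robot (e.g. over Bluetooth).
    pass

disp = Display()
tracker = EyeTracker(disp)       # backend is selected via the PyGaze settings
tracker.calibrate()              # map camera coordinates to display coordinates
tracker.start_recording()

try:
    while True:
        x, y = tracker.sample()  # newest gaze sample, in display coordinates
        send_to_robot(x, y)
except KeyboardInterrupt:
    pass
finally:
    tracker.stop_recording()
    tracker.close()
    disp.close()
```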
The main advantages of this approach can be summarized as follows:
- it provides a very good measure of the eye movements. This is important, since the drawing robot can also be controlled in speed (not only in position)
- the technology is stable and tested, so the probability of success is higher than with the EOG-based approach
- it allows detecting actions like eye blinking
However,
- the resulting sensor is bulky, expensive and power-hungry
- the sensor requires a support to be mounted on
For this challenge, then, I'd like to propose two solutions: the first based on the camera described above, the second based on EOG.
As a second option (depending on the amount of time available), I'd like to explore an EOG-based eye tracker. Straight out of Wikipedia:
Electrooculography (EOG/E.O.G.) is a technique for measuring the corneo-retinal standing potential that exists between the front and the back of the human eye. The resulting signal is called the electrooculogram. Primary applications are in ophthalmological diagnosis and in recording eye movements. Unlike the electroretinogram, the EOG does not measure response to individual visual stimuli.
To measure eye movement, pairs of electrodes are typically placed either above and below the eye or to the left and right of the eye. If the eye moves from center position toward one of the two electrodes, this electrode "sees" the positive side of the retina and the opposite electrode "sees" the negative side of the retina. Consequently, a potential difference occurs between the electrodes. Assuming that the resting potential is constant, the recorded potential is a measure of the eye's position.
Based on this concept, I will develop a glasses frame with integrated electrodes to measure the corneo-retinal standing potential. The biosignal is amplified and analyzed to determine the position of the eyes.
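To give an idea of how the amplified signal could be used, here is a minimal sketch that classifies coarse gaze direction by thresholding two EOG channels (horizontal and vertical). The read_adc helper, the baseline values and the threshold are hypothetical and depend entirely on the actual amplifier and ADC used.

```python
# Minimal sketch: classify coarse gaze direction from two amplified EOG channels.
# read_adc() is a hypothetical helper returning an ADC reading in volts;
# baseline and threshold values depend on the analog front-end and need calibration.

THRESHOLD = 0.2  # volts above/below baseline (assumed value)

def read_adc(channel):
    # Hypothetical: read one ADC channel (e.g. via an SPI ADC on the Raspberry Pi).
    raise NotImplementedError

def classify_gaze(h_baseline, v_baseline):
    h = read_adc(0) - h_baseline  # horizontal electrode pair (left/right of the eye)
    v = read_adc(1) - v_baseline  # vertical electrode pair (above/below the eye)
    if h > THRESHOLD:
        return "right"
    if h < -THRESHOLD:
        return "left"
    if v > THRESHOLD:
        return "up"
    if v < -THRESHOLD:
        return "down"
    return "center"
```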
The main advantages of this approach are that the resulting sensor
- is small and easy to wear
- is very cheap, since there is no need for big computational resources
However, it also presents several drawbacks:
- the sensor provides only rough information about eye position: it can just tell whether you are looking left, right, up or down
- it uses biosignals, which are difficult to handle without specific expertise in small-signal analog processing (filtering and amplification)
The reason I'd like to explore this option is that an EOG sensor could also be used in other projects as a wearable, hands-free remote control. However, since, as I said before, I don't have any expertise in this kind of technology, there are many threats that could bring the whole project to failure. This is why I'm going to keep a backup approach.
The drawing robot
The idea is to build a robot that hangs on a drawing board and makes pictures.
The robot will have two motors mounted at the top, controlled by an NXT motor controller. The motors have bobbins wound with string. The strings are connected to a plotter that holds a set of pens.
The position of the pen can easily be determined, since the distance between the motors is known.
The NXT motor controller gives feedback about the angular position of each motor. Also, the circumference of each bobbin is known, so the length of cord required to move the plotter to a given position is easy to calculate using Pythagoras' theorem.
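As a worked example of this calculation, the sketch below converts a target pen position into the two cord lengths and the corresponding motor angles. The motor spacing and bobbin circumference are placeholder values, not the final dimensions of the build.

```python
# Sketch: convert a target pen position into cord lengths and motor angles.
# Coordinates are measured from the left motor, with y pointing down the board.
# MOTOR_DISTANCE and BOBBIN_CIRCUMFERENCE are placeholder values in centimetres.
import math

MOTOR_DISTANCE = 100.0        # distance between the two motors (assumed)
BOBBIN_CIRCUMFERENCE = 10.0   # cord paid out per full motor turn (assumed)

def cord_lengths(x, y):
    """Length of the left and right cords for a pen at (x, y), by Pythagoras."""
    left = math.hypot(x, y)
    right = math.hypot(MOTOR_DISTANCE - x, y)
    return left, right

def motor_angles(x, y):
    """Motor rotation (in degrees) needed to pay out each cord length."""
    left, right = cord_lengths(x, y)
    return (left / BOBBIN_CIRCUMFERENCE * 360.0,
            right / BOBBIN_CIRCUMFERENCE * 360.0)

# Example: centre of a 100 cm wide board, 50 cm below the motor line.
print(motor_angles(50.0, 50.0))
```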