A schematic would be nice, with resistor values and such, if I want to make it. And I totally do!
Take a look at this paper - I just learned of it today. Interestingly, the apparatus used an Arduino - as you too suggested - to control six specified vibration motor haptic actuators placed on the back of the hand. Apparently a whole lot of work has been done to advance reading text with the skin. The paper suggests that some participants in the study preferred word/gesture pattern recognition over individual character recognition. Speaking of getting lost - this paper is easy to get lost in. Be that as it may, I think the field is ripe for exploration, research, development, and commercialization - especially for first responders, the blind, and perhaps gaming - making it covertly fun to learn braille via a hidden wearable.
Hopefully you will find the paper interesting and can better appreciate some of my rambling concepts and notions.
The wireless, wearable, dynamic single-cell braille haptic actuator envisioned should be capable of receiving voice and/or text-to-braille transcriptions as touch data "streamed" to the skin - analogous to sound vibrations streaming to the ear - such that the skin could be thought of as a computer monitor for the mind to read the successive displayed patterns. The wearable would be a system that includes a computer/cell phone, a microphone, and a tactual belt, gloves, or other garments containing the haptic actuators.
Candidate locations for the actuators would of course include the arms, torso, and other areas, tailored to the wearer's duties - for example, if a person has to crawl in a prone position, actuators should not be on their belly.
It would seem to me that the technology to direct the location of sprayed ink in inkjet printers is much more complex than vibrating six actuators in 63 different patterns. Also, in, say, a 40-character refreshable static display, all 40 characters are serially actuated to create a raised-dot display, so it would seem the same technology could drive a single-character display with rapid updates - say 4 to 5 per second - so that a well-trained person could potentially read about 60 words a minute via touch.
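The ~60 words a minute figure checks out with quick arithmetic - a minimal sketch, assuming an average word of 4 letters plus a trailing space, i.e. 5 cells per word:

```python
# Back-of-envelope reading-speed estimate for a single-cell dynamic display.
# Assumes 5 braille cells per word: ~4 letters plus one space (an assumption,
# not a measured figure).

def words_per_minute(cells_per_sec, cells_per_word=5):
    """Convert a cell refresh rate into an approximate reading speed."""
    return cells_per_sec * 60 / cells_per_word

# 4 cells/s gives 48 wpm; 5 cells/s gives 60 wpm
```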
The advantage of a dynamic display is that the person becomes the display, rather than a device.
Hope this makes sense. Your thoughts?
Hm, so it sounds like there may be more factors involved than I originally thought.
1. It needs to be able to send and receive data wirelessly. So it may need to use Bluetooth and maybe talk to the user's cell phone? Where would the signal data come from that tells the haptic actuators what letter to make?
2. If it's on two separate hands, those will likely not be wired together, so you'll likely end up doubling the hardware, including two power sources.
I wonder if rather than being on gloves, with one on each hand, if instead there could be a forearm cuff containing the entire unit. Maybe the actuators press into the underside of the forearm since that area is more sensitive, and there could be some sort of keypad on top for typing out the braille letters.
This sounds like you're maybe trying to make a sort of beeper/pager that presents the letters/words through touch rather than sight. Is that correct? Would the user want the ability to repeat a message in case they missed part of it?
The hardware/circuit to make the haptic actuators form braille letters seems like it could be fairly straightforward and simple. But the project as a whole seems complicated enough that you will likely need a smarter solution, i.e. an MCU. So it might not be worth trying to design a simple solution for the haptic actuators when you'll end up adapting the whole thing into a smarter system.
"...from a computer keyboard..."
What sort of computer keyboard did you have in mind for this? Most modern ones are USB based, so you may need a USB host keyboard decoder circuit to start with. Alternatively, you could modify the keyboard to create your own matrix decoder circuit.
"I don't know that you can achieve this simply with discrete components."
You may find that a programmable logic device would reduce the logic gate chip count, or, if you can treat the logic as a truth table, you can use a memory-type device to map input patterns to desired output patterns.
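To make the memory-device idea concrete, here's a minimal Python sketch of the same truth table: a lookup table plays the role of a ROM whose address is the key code and whose data word drives the six actuators. The names and the partial letter map are illustrative, not a full implementation:

```python
# "Memory as logic": a lookup table standing in for a ROM. Address lines =
# the typed key; data lines = on/off levels for the six actuators.
# Bit order here is dots 1..6, most-significant bit first (an assumption).

BRAILLE_ROM = {
    "a": 0b100000,  # dot 1
    "b": 0b110000,  # dots 1, 2
    "c": 0b100100,  # dots 1, 4
    "g": 0b110110,  # dots 1, 2, 4, 5
}

def actuator_lines(key):
    """Return a tuple of six 0/1 levels, one per actuator, for a key press."""
    word = BRAILLE_ROM.get(key, 0)  # unknown key -> all actuators off
    return tuple((word >> (5 - i)) & 1 for i in range(6))
```

A real memory device would do the same thing in hardware: burn the full table once, then wire the data outputs straight to the actuator drivers, with no gate-level logic to design.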
You're absolutely right about updating the letters/braille code characters one at a time - I call it dynamic streaming. Blind people read by sweeping their fingertip over a static display of raised dot patterns - found in paper braille books or, mostly nowadays, refreshable braillers. They can read about 4 to 5 characters per second that way. The average word length is said to be about 4 letters, as in "The quick brown fox jumped over the lazy dogs."
The idea for an end product is a wearable that enables dynamic streaming of information directly to the skin via haptic actuators that vibrate sufficiently for a 2x3 matrix pattern mapped to the skin to be cognizable. Many first responders work in environments where the ability to see and hear is reduced, and a sense-of-touch wearable could augment communication. Many first responders wear gloves, so reading a static display of braille information would be out of the question. However, it is envisioned that gloves could be fitted with actuators to send and receive transmissions wirelessly. For example, on the left hand the index, middle, and ring fingertips of the glove could house haptic actuators 1, 2, and 3 respectively, and the right hand could house 4, 5, and 6 in similar fashion. Multi-tasking actuators, like piezoelectric elements, would in receive mode vibrate the fingertips electrically; in transmit mode the fingertips would press the actuators mechanically to generate an electrical signal. In that mode the fingertips would function like the keys of a Perkins brailler, which types braille onto paper, such that, for example, "g" could be transcribed by simultaneously pressing the index and middle fingers of both hands.
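The transmit direction described above amounts to a reverse lookup: match the chord of pressed fingertip actuators against the braille table to recover a letter. A minimal sketch (names are mine and the table is deliberately partial):

```python
# Chord decoding for the Perkins-brailler-style transmit mode: the set of
# pressed actuators (numbered 1-6) is looked up against the braille map.

DOTS = {
    "a": frozenset({1}),
    "b": frozenset({1, 2}),
    "g": frozenset({1, 2, 4, 5}),
    "j": frozenset({2, 4, 5}),
}
CHORD_TO_LETTER = {dots: letter for letter, dots in DOTS.items()}

def decode_chord(pressed):
    """Map pressed actuator numbers back to a letter, or None if no match."""
    return CHORD_TO_LETTER.get(frozenset(pressed))
```

So pressing index and middle fingers of both hands - actuators 1, 2, 4, and 5 - decodes to "g", exactly as in the example above.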
Wearables could be tailored for the use group - sewing in the actuators at locations in garments that make sense and do not interfere with physical duties.
Blind people could benefit from wearable dynamic braillers by not having to carry around static displays - information would stream to the skin one character at a time. This would be no different from picking up one character at a time with the fingertip, so reading speed should be about the same, I would think.
Studies have shown that the sense of touch is about 75 times faster than the sense of sight, and the same area of the brain processes both visual and haptic patterns.
In closing, I hope this helps to advance things along.
I had never really looked closely at braille before, but how it's broken down into decades makes it much easier to understand. Thanks for pointing that out.
As for your project, I'm not entirely clear on what the final product would look like and how you see it functioning.
It sounds like you want to be able to have a computer keyboard, press an alphanumeric key and have the 6 actuators/LEDs show that character. Would it only show one character at a time?
So if I type A, I'd get an array of 1 0 0 0 0 0 - just the top-left actuator on?
But as soon as I type another character, it would update the display and A would no longer show? Is the data stored somehow? I'm thinking with the complexity, you'll probably need to use an Arduino or other microcontroller. I don't know that you can achieve this simply with discrete components. I think it's going to require some code.
What is the objective of this project?
Yes, thank you. It does sound complicated. I was thinking, after watching your other video on priority encoders, that the output from a computer keyboard could, in lieu of displaying letters on a monitor, display LED patterns arranged in a 2-column by 3-row matrix, using logic similar to that used in displaying digital numbers with the 7-LED arrangement - except that 6 of the LEDs would be rearranged in a 2x3 matrix to simulate braille code with lights, one character at a time. In braille, the raised dots are numbered 1-3 in column 1 and 4-6 in column 2. Each letter can be simulated by turning on a particular set of switches - for example, 1 0 0 0 0 0 represents the letter "a". The braille alphabet is arranged in 3 decades: a-j, k-t, and u-z. The first decade uses positions 1, 2, 4, and 5. The second decade uses the first decade positions plus position 3. The last decade uses the first decade positions plus both positions 3 and 6 - except that the letter "w" does not fit the pattern, as "w" was not in the French alphabet when braille was first invented by Louis Braille, so "w", which came along later, has its own unique pattern. So the switch positions for the three decades follow these on/off combinational patterns (I think I have these right with no typos):
"a" is 1 0 0 0 0 0 "k" is 1 0 1 0 0 0 ""u" is 1 0 1 0 0 1
'b" is 1 1 0 0 0 0
"c" is 1 0 0 1 0 0 "w" is 0 1 0 1 1 1
'"d" is 1 0 0 1 1 0
"e" is 1 0 0 0 1 0 "z" is 1 0 1 0 1 1
"f" is 1 1 0 1 0 0
"g" is 1 1 0 1 1 0
""h" is 1 1 0 0 1 0
"i" is 0 1 0 1 0 0
"j" is 0 1 0 1 1 0 "t" is 0 1 1 1 1 0
I hope this makes it seem less complicated - thanks for being willing to help - I've learned so much from watching your tutorials.