This was a really great teaching video - am learning so much. Assuming a QWERTY keyboard is a priority encoder, could it be used to drive a 7-segment LED display to simulate the dot patterns of six-dot braille? For example, typing the letter "a" would light only dot 1, as in 1 0 0 0 0 0 across the six switch positions. In lieu of lights, I would like to use haptic actuators to dynamically "print" braille directly to the skin via a wearable array of actuators. Streaming dynamic braille to the skin would be great for first responders, who often wear gloves and can already read braille by sweeping their fingertips over static displays. I think there should be a way to stream voice/text as braille directly to the skin, analogous to streaming sound to the ear. Anybody interested, or is this a crazy, intractable idea?
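The encoding step itself is straightforward - here is a minimal sketch of the idea, assuming a simple lookup from characters to six-dot patterns (only a handful of letters shown; `BRAILLE_DOTS` and `text_to_actuator_frames` are illustrative names, and the actual actuator driver hardware interface is left out):

```python
# Standard six-dot braille numbering: dots 1-3 down the left column,
# dots 4-6 down the right column. Each bit could drive one haptic
# actuator in a wearable array: 1 = actuator on, 0 = off.
BRAILLE_DOTS = {
    "a": (1, 0, 0, 0, 0, 0),  # dot 1 only
    "b": (1, 1, 0, 0, 0, 0),  # dots 1 and 2
    "c": (1, 0, 0, 1, 0, 0),  # dots 1 and 4
    "d": (1, 0, 0, 1, 1, 0),  # dots 1, 4, 5
    "e": (1, 0, 0, 0, 1, 0),  # dots 1 and 5
}

def text_to_actuator_frames(text):
    """Convert text into a stream of six-bit actuator frames,
    skipping characters not in the (partial) table."""
    return [BRAILLE_DOTS[ch] for ch in text.lower() if ch in BRAILLE_DOTS]

if __name__ == "__main__":
    # Typing "ace" would pulse three successive six-dot frames.
    for frame in text_to_actuator_frames("ace"):
        print(frame)
```

A real device would need the full braille table (plus contractions, if Grade 2 braille is wanted), timing control between frames so successive letters are distinguishable, and a driver for the actuator hardware - but the keyboard-to-dot-pattern encoding is just this kind of lookup.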
Good description of decoders, encoders, muxes, and demuxes.