The participant imagined writing each alphabetic letter, which was transcribed and displayed on a screen. (Image Credit: Frank Willett)
A brain implant, developed as part of the BrainGate research collaboration, allowed a man paralyzed from the neck down to communicate by translating his thoughts into text. The brain-computer interface (BCI) uses AI to interpret the neural activity generated during attempted handwriting.
During the experiments, the man, called T5 in the study, only needed to perform one simple task: imagine writing words with a pen on paper. While he did so, electrodes implanted in his motor cortex captured his brain's signal activity. Algorithms running on an external computer interpreted these signals, decoding the imagined pen movements that traced the 26 letters and punctuation marks, and displayed the resulting characters on a screen.
"While handwriting can approach 20 words per minute, we tend to speak around 125 words per minute, and this is another exciting direction that complements handwriting. If combined, these systems could together offer even more options for patients to communicate effectively," said Krishna Shenoy, Ph.D., professor of electrical engineering.
T5 managed to achieve 90 characters per minute (approximately 18 words per minute) with 94% accuracy; with autocorrect applied, accuracy rose to 99%. This is a significant improvement over earlier BCI approaches, such as point-and-click typing on virtual keyboards. The researchers also noted it is nearly equivalent to the typical smartphone typing speed of people in T5's age group, about 115 characters, or 23 words, per minute.
"We've learned that the brain retains its ability to prescribe fine movements a full decade after the body has lost its ability to execute those movements," said Frank Willett, a research scientist in the lab. "And we've learned that complicated intended motions involving changing speeds and curved trajectories, like handwriting, can be interpreted more easily and more rapidly by the artificial-intelligence algorithms we're using than can simpler intended motions like moving a cursor in a straight path at a steady speed."
Each letter has a distinct shape, which means the AI can rapidly tell the characters apart as the user imagines drawing them. This makes decoding much faster than in BCI systems built around simpler, more uniform movements. However, the team stresses that the system is only a proof of concept, tested with a single participant so far, and not yet a clinically viable product.
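The idea that distinct letter shapes produce distinguishable neural patterns can be illustrated with a toy sketch. The snippet below is purely hypothetical and not the team's actual decoder (which uses a recurrent neural network on real electrode recordings): it invents a random "neural signature" per letter, adds noise to simulate recording variability, and decodes by nearest-prototype matching. The channel count and noise level are made-up parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
letters = "abcdefghijklmnopqrstuvwxyz"
n_channels = 64  # hypothetical number of neural features per snapshot

# Assumption: each imagined letter evokes a characteristic activity pattern.
prototypes = {c: rng.normal(size=n_channels) for c in letters}

def decode(observation):
    """Classify one noisy neural snapshot as the nearest letter prototype."""
    return min(letters, key=lambda c: np.linalg.norm(observation - prototypes[c]))

# Simulate decoding the word "hello" from noisy observations of each letter.
word = "hello"
decoded = "".join(
    decode(prototypes[c] + rng.normal(scale=0.3, size=n_channels)) for c in word
)
print(decoded)  # the distinct signatures survive the noise
```

Because the 26 prototype patterns are well separated relative to the noise, the nearest-match rule recovers the intended letters reliably, which mirrors the article's point: the more distinctive each intended movement is, the easier the decoding.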
Next, the researchers plan to train other people to use the interface, refine the system's sensitivity, implement additional editing tools for the user, and add more symbols, such as capital letters.
Have a story tip? Message me at: http://twitter.com/Cabe_Atwell