The new system uses brain signals called ErrPs that make it easier for humans to communicate with robots. At least Baxter looks like he’s having fun on the job. (Photo via MIT)
Robots have come a long way over the years, and even our perception of what makes a robot has changed. Whether they’re helping with military training or just dancing in sync with one another, robots have only gotten smarter. But that doesn’t mean it’s easy to make them do what they’re supposed to do. Usually, they have to be designed in a specific way to understand how humans communicate. MIT wants to make this a bit easier: its researchers have developed a way to control robots using hand gestures and brainwaves.
The new system uses brain signals called “error-related potentials” (ErrPs). These signals occur naturally when people notice a mistake. The team monitors the brain activity of someone watching the robot work. If an ErrP occurs, the robot stops what it’s doing so the user can fix it. The person can then make hand gestures, using an interface that measures muscle activity, to select the correct option for the robot and help it complete its task.
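The supervise-halt-correct loop described above can be sketched in a few lines. Everything here is a toy stand-in: the thresholds, signal shapes, gesture labels, and function names are invented for illustration, since the actual classifiers in the MIT system are far more sophisticated.

```python
# Hypothetical sketch of the supervision loop: invented thresholds and
# gesture decoding, NOT the MIT team's actual classifiers.

def detect_errp(eeg_window, threshold=3.0):
    """Flag an error-related potential if the mean absolute amplitude of
    the EEG window exceeds a simple threshold (toy stand-in for a real
    ErrP classifier)."""
    mean_amp = sum(abs(x) for x in eeg_window) / len(eeg_window)
    return mean_amp > threshold

def classify_gesture(emg_window):
    """Map muscle activity to a corrective gesture: positive mean
    activity -> 'right', otherwise 'left' (toy EMG decoding)."""
    mean = sum(emg_window) / len(emg_window)
    return "right" if mean > 0 else "left"

def supervise(robot_choice, eeg_window, emg_window, targets):
    """If the observer's EEG shows an ErrP, halt and redirect the robot
    using the gesture decoded from EMG; otherwise keep the robot's choice."""
    if not detect_errp(eeg_window):
        return robot_choice          # no error noticed, carry on
    gesture = classify_gesture(emg_window)
    idx = targets.index(robot_choice)
    # Shift the selection one target left or right, clamped to the list.
    idx = min(idx + 1, len(targets) - 1) if gesture == "right" else max(idx - 1, 0)
    return targets[idx]

targets = ["left hole", "center hole", "right hole"]
# Calm EEG: no ErrP, the robot's pick stands.
print(supervise("center hole", [0.1, -0.2, 0.1], [1.0, 1.0], targets))  # center hole
# Spiking EEG: ErrP detected, EMG gesture shifts the target right.
print(supervise("center hole", [5.0, -4.0, 6.0], [1.0, 1.0], targets))  # right hole
```

The key design idea is the same as in the article: the human never issues explicit commands; they simply watch, and the system reacts to the involuntary error signal before asking for a deliberate gesture.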
For one test, they used Baxter, a robot from Rethink Robotics, to move a power drill to one of three possible targets on the body of a mock plane. Using the new system, Baxter went from picking the right option 70 percent of the time to more than 97 percent of the time. Not only were the results good, but the test also showed the system works on people it has never seen before, meaning organizations could use it in real-world settings without needing to train each new user.
Joseph DelPreto, the project’s lead author, points out that the machine can now adapt to the user; normally it’s the other way around. He says that with this new system you don’t need to train the user to think “in a prescribed way.” Still, the system isn’t perfect. To create it, the team combined electroencephalography (EEG) for brain activity with electromyography (EMG) for muscle activity. The problem is that EEG signals aren’t always easy to detect, and EMG signals make it difficult to track motions more specific than left or right. Luckily, combining the two makes the system more robust than either would be on its own.
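The robustness argument boils down to sensor fusion: either channel alone can be too noisy, but together they can still support a confident decision. Here is a minimal, hypothetical illustration of that idea, where each channel reports a confidence score and the system only acts when the fused score clears a threshold. The function name and the 0.5 cutoff are assumptions made up for this sketch.

```python
# Toy illustration of sensor fusion (not the MIT team's method):
# average the EEG and EMG channel confidences and act only when the
# combined score clears a threshold. A weak reading on one channel can
# be rescued by a strong reading on the other.

def fused_decision(errp_confidence, emg_confidence, threshold=0.5):
    """Return True if the averaged confidence of the two channels is
    high enough to trigger a correction."""
    combined = (errp_confidence + emg_confidence) / 2
    return combined > threshold

print(fused_decision(0.4, 0.8))  # True: weak EEG rescued by strong EMG
print(fused_decision(0.4, 0.3))  # False: both channels too uncertain
```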
Because the system can detect a person’s gestures along with their observations about whether something is going wrong, it makes communicating with the robot “more like communicating with another person.”
Since this new system makes it easier to communicate with robots, it could be ideal for elderly users or for workers with language disorders or limited mobility. And it puts us a step closer to robotic systems that feel more natural to us and easily adapt to the way we work.
Have a story tip? Message me at: cabe(at)element14(dot)com
