
The robot can dress a human without seeing the arm during the process. (Image Credit: MIT CSAIL)
We all know robots excel at everyday tasks that humans find difficult or impossible. For instance, they can lift heavy objects or assemble products, such as watches, whose components are too small for humans to see. Building on last year's work on robot-assisted dressing with sleeveless garments, MIT researchers have developed a robot that can help people put on sleeved clothing.
However, the main issue the robot faced was visual obstruction. “The robot cannot see the human arm during the entire dressing process,” says Shen Li, a Ph.D. candidate in the MIT Department of Aeronautics and Astronautics. Without that view, the robot cannot determine exactly how much force to exert as it pulls the cloth from the person’s hand to their shoulder.
To overcome the vision obstruction issue, the researchers developed a “state estimation algorithm” that uses machine learning to infer the elbow’s location and the arm’s posture: whether it extends straight out, bends at the elbow, or points upward, downward, or sideways. The robot’s measurements of the force applied to the cloth are fed to the algorithm, which bounds the elbow’s likely position within a box.
The robot can then use this data to move safely. “If the arm is straight, then the robot will follow a straight line; if the arm is bent, the robot will have to curve around the elbow,” Theodoros Stouraitis, a visiting scientist in the Interactive Robotics Group at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), says. “If the elbow estimation is wrong, the robot could decide on a motion that would create an excessive, and unsafe, force.”
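The trajectory choice Stouraitis describes can be illustrated with a short sketch. Everything here is an assumption for illustration: the function names, the 160° straightness threshold, and the idea of routing through the elbow as a waypoint are not taken from the MIT system.

```python
import numpy as np

def elbow_angle(shoulder, elbow, hand):
    """Interior angle at the elbow, in degrees."""
    u = np.asarray(shoulder, float) - np.asarray(elbow, float)
    v = np.asarray(hand, float) - np.asarray(elbow, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def plan_waypoints(shoulder, elbow, hand, straight_threshold=160.0):
    """Hand-to-shoulder waypoints: straight pull, or a detour via the elbow."""
    if elbow_angle(shoulder, elbow, hand) >= straight_threshold:
        return [hand, shoulder]          # arm is nearly straight: pull directly
    return [hand, elbow, shoulder]       # arm is bent: curve around the elbow

# Arm lying flat along one axis is straight, so the path is direct.
print(plan_waypoints(shoulder=(0, 0), elbow=(0.3, 0), hand=(0.6, 0)))
```

A wrong elbow estimate feeding into this choice is exactly the failure mode Stouraitis warns about: a straight-line pull around a bent arm would load the cloth, and the person, with excessive force.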
The algorithm also incorporates a dynamic model that predicts how the arm will move, then corrects that prediction using the force measured on the cloth at each time step. This predict-and-correct loop ensures that the true elbow position stays within the estimated box.
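The predict-and-correct loop can be sketched as a simple set-based ("box") estimator. This is a minimal illustration, not the paper's algorithm: it assumes a 1-D elbow position along the arm, an invented linear cloth model in which pulling force grows with how far the gripper has moved past the elbow, and made-up constants throughout.

```python
DT = 0.1          # time step (s) -- assumed
V_MAX = 0.3       # assumed maximum elbow speed (m/s)
STIFFNESS = 40.0  # assumed cloth stiffness (N/m)
NOISE = 0.5       # assumed force-sensor noise bound (N)

def predict(box):
    """Grow the box to cover every position the elbow could reach in DT."""
    lo, hi = box
    return (lo - V_MAX * DT, hi + V_MAX * DT)

def correct(box, gripper_pos, force):
    """Shrink the box using the measured pulling force.

    Under the linear-cloth assumption, force ~= STIFFNESS * (gripper_pos -
    elbow_pos), so a force reading bounds where the elbow can be.
    """
    lo, hi = box
    est = gripper_pos - force / STIFFNESS
    slack = NOISE / STIFFNESS
    return (max(lo, est - slack), min(hi, est + slack))

# One cycle, with the elbow truly 0.5 m along the arm:
box = (0.4, 0.6)
box = predict(box)                               # box grows with motion
box = correct(box, gripper_pos=0.7, force=8.0)   # 40 * (0.7 - 0.5) = 8 N
print(box)
```

The prediction step widens the box to stay conservative about where the elbow might be, and each force measurement narrows it back down, mirroring the guarantee that the elbow stays within the estimated box.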
To train the machine learning system, participants in the project wore “Xsens” suits outfitted with sensors that track and log body movements. Once trained, the robot inferred the elbow’s position while putting a jacket on a human, who moved their arm during the task, sometimes on their own and sometimes in response to the robot tugging on the jacket. The researchers are now developing a robot that adjusts its movements as the elbow or arm changes position.
The team also plans to work on personalization, allowing the robot to account for the distinctive ways each person moves. They also want to develop robots capable of handling different fabrics, each of which responds differently to being pulled.
Have a story tip? Message me at: http://twitter.com/Cabe_Atwell