Cornell University’s PR2 robot is being trained to understand natural communication (via Cornell University)
Robots perform their functions using software written for specific tasks. Some can understand rudimentary spoken commands such as ‘rotate arm 30 degrees’ or ‘open and close left hand’, but they are usually incapable of following commands given in ‘natural language’, the unpremeditated language humans produce spontaneously. Instead of commanding the robot to turn 180°, move forward 15 feet, extend its right arm 14 centimeters, clasp the refrigerator door handle, retract its arm 14 centimeters, and retrieve the soda with its other arm, suppose you could simply say ‘get me a pop’?
The problem is the language barrier between robots and humans. Scientists, however, have been researching Natural Language Processing (NLP) since the 1950s. Researchers at Cornell University’s Robotics Learning Lab are developing the ‘Tell Me Dave’ project, which aims to let robots learn natural-language commands. The project uses a PR2 robot from the now-dissolved Willow Garage and builds on Cornell’s previous research into teaching robots to identify the activities of surrounding people through motion and to identify objects by their locations. Those programs essentially refined the robot’s understanding of objects through visual and non-visual data.
The PR2 robot is outfitted with a 3D camera and software from the previous projects to help it identify objects and determine their uses. For example, the robot can identify a bowl sitting on a counter and knows that things such as cereal or ice cream can be poured into or out of it. Based on the verbal command, it would also know that a spoon or fork could be used with it. Therefore, instead of a lengthy program script or simple verbal commands, users could tell the robot to make them some ramen noodles, and after identifying the objects in its surroundings, it could begin cooking the tasty noodles.
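The article doesn’t publish the project’s code, but the object-to-use mapping it describes resembles a simple affordance table. The sketch below is a hypothetical illustration of that idea only; the object names, affordance labels, and functions are assumptions, not the Tell Me Dave implementation.

```python
# Hypothetical sketch of an affordance lookup: each recognized object is
# mapped to the actions it supports, so a verbal command can be grounded
# in what the robot actually sees. All names here are illustrative.
AFFORDANCES = {
    "bowl":      {"fill", "pour", "hold-food"},
    "saucepan":  {"fill", "pour", "heat-contents"},
    "microwave": {"heat-contents"},
    "stove":     {"heat-contents"},
    "spoon":     {"scoop", "stir"},
}

def objects_supporting(action, visible_objects):
    """Return the visible objects that afford the requested action."""
    return [obj for obj in visible_objects
            if action in AFFORDANCES.get(obj, set())]

# Example: objects_supporting("fill", ["bowl", "spoon", "stove"]) -> ["bowl"]
```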
The interesting part is that objects can be placed randomly or taken away, and the robot will adapt its routines to whatever is available to make those noodles. If users say ‘boil some water’, it will look for what is on hand, such as a saucepan and a stove or a bowl and a microwave. Of course, natural human language can be vague in the best of circumstances, so the researchers are developing an algorithm that identifies key words and associates them with objects in the robot’s surroundings.
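As a rough illustration of that keyword-grounding step, the toy sketch below (continuing the affordance example above) matches action words in a command against whatever objects are visible. This is a minimal sketch under assumed names; the actual Tell Me Dave algorithm is a learned, probabilistic model, not a lookup table.

```python
# Hypothetical keyword grounding: map verbs in the command to required
# affordances, then choose visible objects that satisfy them. Reuses
# AFFORDANCES and objects_supporting() from the sketch above.
KEYWORD_TO_AFFORDANCE = {
    "boil": "heat-contents",
    "fill": "fill",
    "pour": "pour",
    "stir": "stir",
}

def ground_command(command, visible_objects):
    """Pick one visible object per keyword-derived affordance, if possible."""
    plan = []
    for word in command.lower().split():
        action = KEYWORD_TO_AFFORDANCE.get(word)
        if action is None:
            continue  # not a known action word
        candidates = objects_supporting(action, visible_objects)
        if not candidates:
            return None  # nothing on hand affords this step
        plan.append((action, candidates[0]))
    return plan

# ground_command("boil some water", ["bowl", "microwave"])
# -> [("heat-contents", "microwave")]
```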
The robot then compares that data with data it learned previously in a virtual environment to get a better understanding of what the user wants. Even so, the robot still can’t grasp every verbal command it’s given and performs the requested tasks correctly only about 64% of the time; however, that is still a good milestone in teaching robots natural-language commands. As part of the ongoing Tell Me Dave learning project, the researchers invite those with programming knowledge to write their own scripts for simulated robots in a virtual kitchen environment. The crowd-sourced programs will then become part of a library of instructions to be incorporated into the PR2 and other robots in the future.
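To give a feel for how a library of crowd-sourced scripts might be consulted, here is a deliberately simple sketch that matches a new command to the closest stored command by word overlap. The library entries and the Jaccard-similarity matching are assumptions for illustration; the project’s real system learns a far richer similarity model.

```python
# Hypothetical sketch of matching a new command against a crowd-sourced
# library of recorded instruction sequences. Word overlap is an assumed
# stand-in for Tell Me Dave's actual learned matching.
LIBRARY = {
    "boil a pot of water":
        ["grab saucepan", "fill with water", "place on stove", "turn on stove"],
    "microwave a bowl of water":
        ["grab bowl", "fill with water", "place in microwave", "start microwave"],
}

def word_overlap(a, b):
    """Jaccard similarity between the word sets of two commands."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def best_known_plan(command):
    """Return the stored instruction sequence whose command matches best."""
    return max(LIBRARY.items(), key=lambda kv: word_overlap(command, kv[0]))[1]

# best_known_plan("boil some water") -> the saucepan/stove sequence
```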
C
See more news at:
http://twitter.com/Cabe_Atwell