DeeChee and its team (via University of Hertfordshire)
Teaching one's child to speak and hearing their first words are some of the proudest moments for parents all over the world, even when that child is a robot. That's exactly what researchers from the University of Hertfordshire set out to do with their iCub robot in an effort to show how language learning evolves in human infants. The team, Dr. Caroline Lyon, Professor Chrystopher Nehaniv and Dr. Joe Saunders, wanted to understand how infants (aged 6 to 14 months) progress from simply babbling syllables to speaking complete words, and how they begin to make sense of language. So they decided to use an iCub robot (more on this in a bit) to acquire that understanding.

To mimic a human infant, the team programmed the robot, named ‘DeeChee’, with a specially written algorithm that lets it babble by stringing together syllables drawn from an inventory of 40,000 English-language syllables. The team then recruited 34 adults to act as ‘parents’ (or teachers) and talk to the robot as though it were a child. Each teacher had 8 minutes to spend talking with DeeChee, after which the robot's memory was saved and erased so the team could measure what percentage of its babble each syllable accounted for. At first, DeeChee babbled every syllable in roughly equal proportion. The team next programmed the robot to listen while the teachers spoke and to respond afterward, and found that it came to produce the syllables the teachers had articulated. They concluded that the robot was sensitive to certain sounds and reacted to them in much the same fashion as human children.
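To make that learning loop concrete, here is a minimal sketch of how a frequency-sensitive babbler like this might work. It is only an illustration: the tiny syllable list, the reinforcement boost, and the substring matching below are assumptions made for demonstration, not details of the actual DeeChee software.

```python
import random
from collections import Counter

# Illustrative syllable inventory; DeeChee's actual list held
# around 40,000 English-language syllables.
SYLLABLES = ["ba", "da", "ma", "gri", "red", "gre", "en", "box"]

class Babbler:
    """Toy model: babble syllables, boosting ones heard from a teacher."""

    def __init__(self, syllables):
        # Start every syllable at the same weight, matching the
        # observation that DeeChee initially babbled each syllable
        # in roughly equal proportion.
        self.weights = Counter({s: 1.0 for s in syllables})

    def listen(self, utterance):
        # Reinforce any known syllable the teacher used; the boost
        # factor here is an assumption, not a published parameter.
        for syl in self.weights:
            if syl in utterance:
                self.weights[syl] += 0.5

    def babble(self, length=5):
        # Sample syllables in proportion to their learned weights.
        syls = list(self.weights)
        w = [self.weights[s] for s in syls]
        return "".join(random.choices(syls, weights=w, k=length))

bot = Babbler(SYLLABLES)
print("before teaching:", bot.babble())
for _ in range(20):
    bot.listen("look at the red box, the green box")
print("after teaching: ", bot.babble())
```

After a few simulated teaching sessions, the sampled babble skews toward the reinforced syllables, which is the qualitative behavior the Hertfordshire team reported.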
What were DeeChee's first coherent words? There were several, including ‘red’, ‘green’ and ‘box’, and just as with human children, they were no less impressive for their simplicity.
The robot used for this research is known as the iCub (Cognitive Universal Body), an open-source product of the RobotCub Consortium, a group of European Union universities that uses it for study across a multitude of disciplines. The robot is designed to look like a human child approximately 3.5 years of age and moves using a series of tendon-driven joints in conjunction with servos and Hall-effect sensors, all controlled by an on-board PC104 computer. The head is equipped with stereo cameras mounted on swivels for eyes, microphones on both sides where the ears would be, and a series of red LEDs that mimic a mouth and eyebrows for facial expressions. The first iCub robot was only able to crawl, but it has since been upgraded with smaller hands and fingers, along with stronger legs and improved joint articulation that permit the robot to walk, although it still requires ground markers to navigate. One of the more intriguing demonstrations came from the Italian Institute of Technology, where a research team designed an algorithm called ARCHER (Augmented Reward Chained Regression) that enabled the iCub to learn archery and shoot an arrow at the target's center (I could think of better uses, but to each their own).
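The published ARCHER method is a reward-weighted, chained regression over the robot's aiming parameters. The sketch below is only a loose illustration of that spirit, a simple reward-weighted correction loop with an invented 2-D aiming model, noise level and reward shape; it is not the IIT implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def shoot(aim):
    """Hypothetical trial: the arrow lands at the aim point plus noise
    (a stand-in for the robot's real arm and bow dynamics)."""
    return aim + rng.normal(scale=0.05, size=2)

aim = np.array([1.0, -0.8])   # start well off the target center (origin)

for trial in range(25):
    hit = shoot(aim)
    # Reward shots that land near the center; the Gaussian form and
    # the step-size rule below are illustrative choices only.
    reward = np.exp(-np.linalg.norm(hit) ** 2)
    # Apply a large correction after a bad shot and only a small
    # nudge after a good one, steering the aim opposite to the miss.
    aim -= (1.0 - reward) * hit

print("final miss distance:", np.linalg.norm(shoot(aim)))
```

Each trial's observed miss is used to correct the next shot, with good shots trusted more, which captures the intuition behind chaining regressions from the best trials so far.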
Cabe