It has been a while since we first learned of Myon, a one-eyed (cyclopean) humanoid built for Humboldt University in Berlin in 2010. It is the main robot used in the multinational ALEAR project (Artificial Language Evolution on Autonomous Robots), which explores how “complex grammatical systems and behaviors can emerge in populations of robotic agents.” The project was recently featured in a BBC Horizon special called The Hunt for AI.
How language emerges and evolves is a fascinating subject, made all the more interesting by experiments in which robots build a shared vocabulary from the ground up. They do it in much the same way our ancient ancestors must have: by naming the actions they perform and the things around them. The experiments play out like a game in which a teacher and an observer interact.
That’s Dr. Luc Steels in the clip, explaining how one of the robots attempts to communicate its chosen word for a specific gesture. The “words” the robots invent begin as random sounds assigned to a specific action, object, or event. That coupling must then be successfully conveyed to a partner, with the observer guessing what the teacher meant. Whenever the observer guesses the word’s meaning correctly, the word enters a shared vocabulary that can then be used to study further complexities like grammar and tense (do this, then that).
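The guessing game described above can be sketched in a few lines of code. This is a deliberately minimal simulation of the general idea, not code from the ALEAR project: the action names, the two-agent setup, and the syllable inventory are all illustrative assumptions. A teacher invents a random sound for each action; an observer guesses the meaning, and only successfully conveyed pairings enter the shared vocabulary.

```python
import random

# Actions the robots might name -- an illustrative set, not from the project.
ACTIONS = ["wave", "point", "nod", "crouch"]

def invent_word():
    """Build a random two-syllable sound, e.g. 'boka'."""
    consonants, vowels = "bdgkmnpt", "aeiou"
    return "".join(random.choice(consonants) + random.choice(vowels)
                   for _ in range(2))

def play_round(teacher_lexicon, shared_lexicon):
    """One teacher/observer exchange; returns True on a successful guess."""
    action = random.choice(ACTIONS)
    # The teacher names the action, inventing a word if it has none yet.
    word = teacher_lexicon.setdefault(action, invent_word())
    # The observer guesses: randomly at first, correctly once the
    # word has made it into the shared vocabulary.
    guess = shared_lexicon.get(word, random.choice(ACTIONS))
    if guess == action:
        shared_lexicon[word] = action  # success: the pairing becomes shared
    return guess == action

teacher, shared = {}, {}
results = [play_round(teacher, shared) for _ in range(200)]
print("shared vocabulary:", shared)
print("success rate over last 50 rounds:", sum(results[-50:]) / 50)
```

Early rounds fail often because the observer is guessing blind; as words accumulate in the shared vocabulary, the success rate climbs toward 1, which mirrors how the robots converge on a common lexicon.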
If the project is a success, not only will robots be able to teach one another new words, but people will be able to teach robots words the same way we teach infants. And the grammatical problems that often stump computers in Turing tests may be solved.
Previously, Dr. Steels used just a pair of cameras, combined with machine learning and computer-vision software, to study the same topic. He later ran more complex language-acquisition experiments with Sony's QRIO at the Sony Computer Science Laboratory (we reported on them in more detail here). The ALEAR project is an extension of this earlier work.