In 2001, the AIM (Artificial Intelligence & Media) Lab at KAIST developed a humanoid robot called Ami (an anagram of the lab's initials, as well as the French word for friend). Ami moves around on a wheeled base, an outgrowth of the mobile robot platforms the lab developed throughout the 1990s.
Ami is a rather large robot, standing 160 cm tall and weighing a hefty 100 kg. His face went through a number of revisions: one version had just an LCD screen for eyes; another had eyes but no mouth; finally, Ami got a mouth that moves when he speaks, though his face otherwise has limited expressive capability.
This proved problematic, since the researchers intended Ami as a testbed for emotion recognition and expression, dialog interaction, and sociability with humans as a form of robot therapy. While this may sound strange, emotion has increasingly been used in interface and robot design, as demonstrated by MIT's Kismet and Leonardo, among many others.
Ami was programmed with speech recognition and rudimentary A.I. that could select appropriate responses based on your conversational tone. For example, if Ami asked, "How are you feeling today?" and you answered, "I'm not feeling well," Ami would try to cheer you up.
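The sources don't describe how Ami's dialog system was actually implemented, but this kind of tone-sensitive response selection can be sketched with simple keyword matching. The cue lists and responses below are hypothetical illustrations, not Ami's real vocabulary:

```python
# Hypothetical sketch of tone-based response selection, in the spirit of
# Ami's "cheer you up" behavior. Not based on KAIST's actual code.

NEGATIVE_CUES = {"not feeling well", "sad", "tired", "terrible"}
POSITIVE_CUES = {"great", "wonderful", "happy"}

def classify_tone(utterance: str) -> str:
    """Very rough keyword-based tone classifier."""
    text = utterance.lower()
    if any(cue in text for cue in NEGATIVE_CUES):
        return "negative"
    if any(cue in text for cue in POSITIVE_CUES):
        return "positive"
    return "neutral"

# Canned responses keyed by detected tone.
RESPONSES = {
    "negative": "I'm sorry to hear that. Let me try to cheer you up!",
    "positive": "That's wonderful! I'm glad you're doing well.",
    "neutral": "I see. Tell me more about your day.",
}

def respond(utterance: str) -> str:
    """Pick the canned response matching the utterance's tone."""
    return RESPONSES[classify_tone(utterance)]

print(respond("I'm not feeling well."))
```

A real system would of course sit behind a speech recognizer and use far richer language understanding, but the basic loop of classifying the user's state and dispatching to a matching behavior is the same.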
To better communicate Ami's emotions, the researchers chose to represent them with 3D graphics on an LCD screen embedded in Ami's chest. The software was intended to compensate for the limited expressiveness of Ami's mechanical face. Ami made several appearances on Korean television and even met prominent political leaders.
KAIST AIM Lab