AIM Lab’s earlier robots Ami and Amiet, created in 2001 and 2002 as test-beds for human-robot interaction, eventually led to a true bipedal humanoid called AMIO in 2006. AMIO has 36 DOFs (neck x2, 2 arms x5, 2 hands x6, 2 legs x6), stands 150cm tall (just shy of 5 feet), and weighs 45kg (99 lbs). It can operate for up to 30 minutes on its built-in lithium-polymer battery and walks at 1km/h. It has 2 CCD cameras for vision, 8 inclination sensors throughout its body, and 4 FSRs (force-sensing resistors) per foot to locate the ZMP (zero moment point).
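For readers curious how four FSRs per foot can locate the ZMP: on a flat sole with roughly vertical contact forces, the ZMP coincides with the center of pressure, i.e. the force-weighted centroid of the sensor positions. The sketch below illustrates this idea only; the sensor layout, readings, and function name are hypothetical, not AMIO's actual design.

```python
# Hypothetical sketch: estimating the ZMP (center of pressure) from four
# force-sensing resistors (FSRs) at the corners of one foot. All positions
# and readings are illustrative, not AMIO's real parameters.

def zmp_from_fsrs(positions, forces):
    """Force-weighted centroid of the sensor positions (x, y in metres)."""
    total = sum(forces)
    if total <= 0:
        raise ValueError("no ground contact: total force is zero")
    x = sum(f * p[0] for p, f in zip(positions, forces)) / total
    y = sum(f * p[1] for p, f in zip(positions, forces)) / total
    return x, y

# Four FSRs at the corners of a 20 cm x 10 cm foot sole (assumed layout).
corners = [(0.0, 0.0), (0.2, 0.0), (0.2, 0.1), (0.0, 0.1)]
readings = [10.0, 10.0, 30.0, 30.0]  # newtons; more load toward one edge

print(zmp_from_fsrs(corners, readings))  # ZMP shifts toward the loaded edge
```

In a walking controller, keeping this point inside the support polygon of the stance foot is what distinguishes dynamically stable steps from tipping over.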
Like its predecessors, AMIO’s speech and vision recognition software allows it to estimate a person’s emotional state, but its fully anthropomorphic body makes it better suited to human-robot interaction. The strength of the software has been demonstrated in several experiments, in which the robots chose an appropriate conversation topic and responded appropriately to human emotions. They could ask what you are angry about and then tell a joke to console you or make you laugh. Additionally, AMIO expresses its own emotional state through a 3D virtual face displayed on an LCD screen on its chest.
AMIO has reproduced human motions from demonstrations using an inexpensive motion-capture system built from wearable sensors. Through a head-mounted display, the teleoperator could see through the robot’s eyes. This work was titled “Developing New Abilities for Humanoid Robots with a Wearable Interface” by Hyun Seung Yang, Il Woong Jeong, Yeong Nam Chae, Gi Il Kwon, and Yong-Ho Seo (KAIST AIM Lab).
The researchers at KAIST’s AIM Lab intend to develop these technologies further to improve human-robot interaction. In addition to expanding the robots’ emotional memory, they plan to upgrade the robots with a wider range of conversational topics, which will be stored on a network.
KAIST AIM Lab