


IRT Humanoid

The University of Tokyo’s IRT (Information and Robot Technology) Research Initiative was started in 2008, and later that year a trio of robots was unveiled to the press: the Assistant Robot, the Dish Washing Robot, and Mamoru (a memory assistant).  The program’s goal is to address the problems facing Japan’s rapidly aging society through robotics and information technology.  The researchers have been fairly quiet about their progress since then, but have published some research papers starring a new robot.  While it may not be ready for prime time, it’s exciting to see what they have cooking.

The IRT Humanoid is a new full-scale bipedal humanoid with 38 degrees of freedom (neck x3, 2 arms x7, waist x1, 2 legs x6, fingers & toes x8).  It is equipped with a gyro sensor at its waist, 6-axis force sensors in its feet, and two additional force sensors in each arm.  It is controlled by a pair of networked computers, one dedicated to the upper body and one to the lower body.  The robot appears to be covered in soft latex foam rubber, with paper covering incomplete areas such as its head, hands, and feet.
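For the curious, here’s a quick sanity check (not the lab’s actual software, just an illustrative Python snippet) showing how the reported joint counts add up to 38 degrees of freedom:

```python
# Illustrative tally of the reported joint layout (not the lab's own code).
DOF_LAYOUT = {
    "neck": 3,
    "arms": 2 * 7,          # two arms, 7 DOF each
    "waist": 1,
    "legs": 2 * 6,          # two legs, 6 DOF each
    "fingers_and_toes": 8,
}

total_dof = sum(DOF_LAYOUT.values())
assert total_dof == 38
print(f"Total degrees of freedom: {total_dof}")
```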

Above: The IRT Humanoid practices its backswing.

It is being taught to mimic human motions through motion capture at the University of Tokyo’s Department of Mechano-Informatics (Nakamura Lab), the birthplace of a smaller pair of humanoids called Mighty and Magnum.  Professor Yoshihiko Nakamura is not alone, having assembled an international team including Dr. Christian Ott (DLR), Prof. Dongheui Lee (TU Munich), and Prof. Dana Kulic (University of Waterloo, Ontario).  In practice, it’s not enough to simply capture a person’s movement and expect the motions to work properly on the robot.  Differences in basic body structure and dynamics have to be accounted for, a problem previously encountered when transferring motion-capture data to computer-generated characters (the solution is referred to as “motion retargeting”).
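To give a rough idea of what retargeting involves, here’s a simplified Python sketch (the joint names, limits, and scaling are made up for illustration, not taken from the team’s papers) in which captured human joint angles are scaled and clamped to a robot’s joint limits before playback:

```python
import numpy as np

# Hypothetical retargeting sketch: map captured human joint-angle trajectories
# onto a robot whose joints have different ranges of motion.
ROBOT_JOINT_LIMITS = {           # radians, illustrative values only
    "shoulder_pitch": (-2.0, 2.0),
    "elbow": (0.0, 2.4),
}

def retarget(human_traj: dict, scale: float = 1.0) -> dict:
    """Scale human joint trajectories and clamp them to the robot's limits."""
    robot_traj = {}
    for joint, angles in human_traj.items():
        lo, hi = ROBOT_JOINT_LIMITS[joint]
        robot_traj[joint] = np.clip(np.asarray(angles) * scale, lo, hi)
    return robot_traj

# Example: a captured arm swing that exceeds the robot's range gets clamped.
human = {
    "shoulder_pitch": np.linspace(-1.5, 2.5, 100),
    "elbow": np.linspace(0.0, 3.0, 100),
}
robot = retarget(human)
```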

Even with these differences, human body movements can be retargeted and performed by the robot in near real-time.  However, mimicry by itself is not as useful as learning motions in a more general sense, so the researchers are developing a library of “motion primitives” which can be used to automatically generate a larger number of similar motions.  Different categories of these motion primitives can be mixed together to generate new motions that haven’t been demonstrated by a person.  For now, the robot is programmed to ignore the instructor’s lower-body movements (to help maintain its balance), and does not track details like individual fingers and toes, but in the future the team hopes to replicate human walking, side stepping, squatting, and interaction with the environment.
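As a loose illustration of how motion primitives might be combined (the primitive names, weights, and trajectory shapes below are invented, not drawn from the team’s work), stored joint trajectories can be blended with weights to produce a motion that was never directly demonstrated:

```python
import numpy as np

def blend_primitives(primitives, weights):
    """Weighted combination of equal-length joint-angle trajectories."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()                       # normalize so weights sum to 1
    stacked = np.stack(primitives)                 # shape: (n_primitives, T, n_joints)
    return np.tensordot(weights, stacked, axes=1)  # shape: (T, n_joints)

# Example: mix a "reach" primitive with a "wave" primitive 70/30.
T, n_joints = 50, 7
reach = np.linspace(0, 1, T)[:, None] * np.ones(n_joints)
wave = np.sin(np.linspace(0, 2 * np.pi, T))[:, None] * np.ones(n_joints)
new_motion = blend_primitives([reach, wave], [0.7, 0.3])
```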

Several robotics labs are working under the umbrella of the IRT Research Initiative, and it’s been a while since we’ve seen any new developments regarding their other robots.  Needless to say, we’re excited to see more.

All images and video are owned and copyright Yoshihiko Nakamura (University of Tokyo), Christian Ott (DLR), Dongheui Lee (TU Munich), and Dana Kulic (University of Waterloo).

[IRT Research Initiative website (JP/EN)] & [Nakamura Lab (JP)]

Media

Video (from the Int’l Journal of Robotics Research):


Images: