


• HOAP-2

Fujitsu Automation’s Humanoid for Open Architecture Platform (HOAP) saw its second-generation model released in August 2003, two years after the original.  In 2004, HOAP-2 received the Technical Innovation Award from the Robotics Society of Japan.

At a height of 50cm (19.6″) and a weight of 7kg (15.4 lbs), it was slightly taller and heavier than its predecessor and had smoother joint movements.  This model added a neck, a waist, and hands with five fingers that could open and close to grasp small objects (such as chess pieces), increasing its total degrees of freedom from 20 to 25 (2 legs × 6, 2 arms × 4, 2 hands × 1, torso × 1, neck × 2).  Two head options were available, with the box-like version containing two USB cameras for stereoscopic vision (15 fps at 320×240).  Its processor was the equivalent of a 700MHz Pentium 3 running the RT Linux OS, and it was programmable in open C/C++.  Like the Sony QRIO, it had four pressure sensors in the sole of each foot to detect its center of gravity.
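The four force sensors per sole suggest a standard center-of-pressure calculation: the robot weights each sensor's position by the vertical force it reads. The sketch below illustrates the idea only; the sensor layout, positions, and readings are made-up values, not HOAP-2 specifications.

```python
import numpy as np

# Hypothetical layout: four pressure sensors at the corners of a
# rectangular sole, positions in metres in the foot frame.
SENSOR_POS = np.array([
    [ 0.06,  0.03],   # front-left
    [ 0.06, -0.03],   # front-right
    [-0.06,  0.03],   # rear-left
    [-0.06, -0.03],   # rear-right
])

def center_of_pressure(forces):
    """Average of sensor positions weighted by vertical force (N)."""
    forces = np.asarray(forces, dtype=float)
    total = forces.sum()
    if total <= 0.0:
        return None  # foot not in contact with the ground
    return (SENSOR_POS * forces[:, None]).sum(axis=0) / total

# Equal readings put the center of pressure at the foot's centre:
print(center_of_pressure([10.0, 10.0, 10.0, 10.0]))  # -> [0. 0.]
```

If the point drifts toward an edge of the support polygon, a balance controller knows the robot is about to tip and can shift the hips or ankles to compensate.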

The HOAP-2 continues to be the subject of a wide range of research topics in laboratories around the world, having been sold to approximately 70 universities.  A humanoid robot’s onboard intelligence typically lags far behind what its body can physically do, so labs have tried to close that gap using complex neural networks, central pattern generators, genetic algorithms, and motion-capture imitation.  Researchers at Osaka University and the HANDAI Frontier Research Center used it as their RoboCup platform.  ICRA 2011 saw presentations starring the robot (see kinesthetic teaching and shuffle translation).
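Of the techniques listed above, a central pattern generator is the easiest to sketch. The core idea is a set of coupled oscillators whose coupling pulls them into a fixed phase relationship, such as the anti-phase swing of two legs. The parameters below (frequency, gain, integration step) are illustrative choices, not values from any HOAP-2 project.

```python
import math

def simulate_cpg(steps=20000, dt=0.001, omega=2 * math.pi, k=5.0):
    """Two phase oscillators coupled to settle into anti-phase."""
    phi = [0.0, 0.3]  # oscillator phases (rad), slightly offset at start
    for _ in range(steps):
        # Each oscillator advances at its natural frequency omega and is
        # pulled toward sitting pi radians away from its partner.
        d0 = omega + k * math.sin(phi[1] - phi[0] - math.pi)
        d1 = omega + k * math.sin(phi[0] - phi[1] - math.pi)
        phi[0] += d0 * dt
        phi[1] += d1 * dt
    # A joint command would be e.g. amplitude * sin(phase); here we just
    # report the phase difference, which converges to pi (anti-phase).
    return (phi[1] - phi[0]) % (2 * math.pi)

print(round(simulate_cpg(), 2))  # -> 3.14
```

Because the rhythm emerges from the oscillators' dynamics rather than a stored trajectory, sensory feedback can be injected into the phase equations to adapt the gait on the fly, which is what makes CPGs attractive for small humanoids.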

One of the most active groups was the Learning Algorithms and Systems Laboratory at the École Polytechnique Fédérale de Lausanne (EPFL).  Researchers there helped build the robot’s simulator (PDF) and added speech recognition functionality.  It gained the ability to draw portraits in a human-like style based on camera images, as well as to write the alphabet.  It was also used in human-robot interaction studies, where it performed gesture recognition and imitation learning (programming by demonstration).  In other experiments, researchers controlled the robot’s arms using Xsens body sensors, and used two Nintendo Wii remotes to play a drum.
