Researchers at Kyushu University decided to modify their resident Fujitsu HOAP-2, renaming it PICO (proactive interface robot). They added an entirely new head, containing a camera, a small LCD screen, a speaker, and a microphone, which increased the robot's height to 64 cm (25″) and its weight to 8.7 kg (19 lbs). They also rounded out its metallic body with attractive plastic shells.
PICO-2 became one of the few whole-body telepresence robots in existence. The robot’s helmet visor can be retracted, allowing the LCD screen to display the user’s face, and its body reproduces the operator’s gestures or even walking and dancing. This research was done in 2005, so they didn’t have the benefit of a Kinect sensor, but the general idea is similar and works almost as well.
First, the user’s body posture is extracted from their silhouette in images taken with a monocular camera. The system then reconstructs the positions of the arms and legs in simulation using a digital avatar. Finally, the avatar’s motions are sent to the robot and reproduced in less than 30 milliseconds, allowing non-verbal communication to happen in (almost) real time.
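The pipeline above can be sketched in a few lines. This is a minimal illustration only: the function names, the simple thresholding, and the centroid stand-in for avatar fitting are all assumptions for clarity, not the authors' actual algorithm (the real system fits a full digital human model to the silhouette).

```python
# Illustrative sketch of the silhouette -> avatar -> robot pipeline.
# Thresholding, centroid "pose", and all names are hypothetical stand-ins.

def extract_silhouette(frame, threshold=128):
    """Binarize a grayscale frame (list of pixel rows) into a 0/1 mask,
    treating dark pixels as the user's silhouette."""
    return [[1 if px < threshold else 0 for px in row] for row in frame]

def estimate_pose(mask):
    """Toy pose estimate: the silhouette centroid stands in for fitting
    a digital avatar to the extracted body shape."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    if not pts:
        return None
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    return {"centroid": (cx, cy), "num_pixels": len(pts)}

def send_to_robot(pose, deadline_ms=30):
    """Stand-in for streaming joint targets to the robot within the
    ~30 ms budget the researchers report."""
    return {"pose": pose, "deadline_ms": deadline_ms}

# One tiny 3x3 frame through the whole pipeline:
frame = [[200,  50, 200],
         [ 50,  50,  50],
         [200,  50, 200]]
cmd = send_to_robot(estimate_pose(extract_silhouette(frame)))
```

The point is the shape of the loop: each camera frame is reduced to a pose and forwarded fast enough that the robot's mimicry feels live.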
Ryo Kurazume and Tsutomu Hasegawa also wanted PICO-2 to perform straight-legged walking. Bipedal robots often walk with their knees bent, lowering the center of gravity to keep it as steady as possible; the bent posture also compensates for the joints' limited range of motion and helps maintain balance. The problem is that this looks rather unnatural to us humans (some have described ASIMO's posture as looking like it needs to go to the bathroom).
Surprisingly, only a few examples of straight-legged walking exist in robots (Chroino by ROBO-GARAGE and WABIAN-2 by Waseda University are two others). Walking with straight legs doesn't just look more natural; it actually improves efficiency, since it requires less torque and consumes less energy. After running simulations, they applied their strategy to the robot and found that it could indeed walk with straight legs, cutting its energy consumption roughly in half.
Watch videos of their experiments and see a few more photos after the break!
Information and images from “Embodied Proactive Human Interface “PICO-2”” and “Natural Motion Generation of Proactive Human Interface PICO-2” by Ryo Kurazume, Tsutomu Hasegawa, Hiroaki Omasa, Seiichi Uchida, and Rin-ichiro Taniguchi from the Graduate School of Information Science and Electrical Engineering, Kyushu University.
Video (Straight-legged walking):