Earlier we took a look at how the HRP-2 Promet is learning to use objects in its environment for support rather than avoiding them. AIST’s Sebastien Lengagne, Abderrahmane Kheddar, and Eiichi Yoshida have now posted a video showing the HRP-2 Promet stepping onto and over a 15 cm (6″) tall platform rather than navigating around it. This sort of path planning is obviously much more efficient, but it could be problematic should the robot attempt to step onto something that can’t support its weight. The current setup requires humans to recreate physical objects as models in a computer simulation, which checks for collisions between the robot, its environment, and the surrounding objects. In the future, 3D depth sensors will likely provide enough information to generate such models automatically, though an object would have to be scanned from multiple angles to obtain a complete description.
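To give a feel for what that simulation-side collision check involves, here’s a minimal sketch in Python. This is not AIST’s actual planner — it’s a hypothetical illustration where each modeled object is reduced to an axis-aligned bounding box, and a candidate foot placement is rejected if its volume intersects an obstacle:

```python
from dataclasses import dataclass

@dataclass
class Box:
    # Axis-aligned box given by min/max corners in meters: (x, y, z)
    lo: tuple
    hi: tuple

def overlaps(a: Box, b: Box) -> bool:
    """True if the two boxes intersect (strict, so surface contact is allowed)."""
    return all(a.lo[i] < b.hi[i] and b.lo[i] < a.hi[i] for i in range(3))

def step_is_safe(foot: Box, obstacles: list) -> bool:
    """Reject a candidate foot placement that passes through any obstacle."""
    return not any(overlaps(foot, obs) for obs in obstacles)

# A 15 cm tall platform, plus one foot volume resting on its top surface
# and one that would pass through the platform itself.
platform = Box((0.0, 0.0, 0.0), (0.5, 0.5, 0.15))
foot_on_top = Box((0.1, 0.1, 0.15), (0.3, 0.2, 0.25))  # on the surface: OK
foot_inside = Box((0.1, 0.1, 0.05), (0.3, 0.2, 0.20))  # intersects: rejected

print(step_is_safe(foot_on_top, [platform]))  # True
print(step_is_safe(foot_inside, [platform]))  # False
```

A real planner like the one in the video checks far richer geometry (full robot meshes, swept motions between poses), but the basic question asked of the hand-built models is the same: does this candidate contact interpenetrate anything it shouldn’t?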
[source: Lengagne2583 @ YouTube]