
The ART Project’s Nuclear Inspection Centaur Robot

After the earthquake last year and the resulting damage to the Fukushima nuclear plant, observers criticized Japan’s lack of preparedness. In particular, many felt that the Japanese robotics sector’s focus on expensive humanoids had squandered time and resources better spent on more specialized robots.  However, this isn’t totally accurate.  The Japanese government, corporations, and universities have been working on robots for just this sort of problem for decades.  Back in the 1980s the Japanese government invested 20 billion JPY (still less than $100 million at the time) into a massive eight-year program to build three types of advanced robots for hazardous environments.

The ART (Advanced Robotics Technology) Project had goals that were too big for any one institution to achieve, so a consortium called ARTRA (Advanced Robotics Technology Research Association) was formed. Financed and controlled by the Agency of Industrial Science and Technology, ARTRA brought two major government organizations, the Mechanical Engineering Laboratory (MEL; now known as AIST) and the Electrotechnical Laboratory (ETL), together with 18 corporations under the same banner, along with the support of academia.

The ART robots were designed for three major areas: nuclear plants, undersea oil rigs, and a third for disaster prevention in refineries.

The nuclear inspection robot would have a sensor head, four legs, two 7-DOF arms, and four-fingered hands with pressure-sensitive finger tips (this configuration led to it being known as the Centaur robot). It would be paired with a smaller, wall-climbing partner that used suction cups to adhere to wall surfaces. This would allow it to “climb up and down stairs, step over piping or other impediments, and relocate itself at high speed.”

It would have to work in 70 degrees Celsius (158 degrees Fahrenheit), 90% humidity, and 100 roentgens of radiation per hour. What started as a 1/3-scale model of the four-legged mechanism eventually became a robot measuring 188 cm (6’2″) tall, 127 cm (4’2″) long, and weighing 700 kg (1,543 lbs).
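The temperature conversion quoted above checks out against the usual formula F = C × 9/5 + 32:

```python
# Sanity check of the article's 70 °C = 158 °F figure.
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

print(c_to_f(70))  # 158.0
```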

Video: Robonaut 2 Put To Work Aboard The ISS

NASA reports that Robonaut 2 began work aboard the International Space Station in mid-March of this year after being given the go-ahead by the crew and ground team.  Its assigned task: to check the air flow coming from vents inside the station. This particular job is normally done by the astronauts once every 90 days to ensure the vents haven’t gotten clogged.  According to NASA the measurements are sometimes difficult to obtain due to the zero gravity, and because an astronaut’s breath can affect the results.

Recently they published a video of the robot autonomously operating a control panel.  It has to recognize the panel’s array of buttons and switches and know how (and when) to interact with each of them.  The robot’s forearms have also been modified aboard the station with added heat sinks to allow it to perform longer.  These are small but important steps towards the realization of a robot that can perform tasks outside the comfort of the station.


[embedded video]

[source: NASA] via [Robots Dreams] via [Engadget]


In what is almost certainly a world first, researchers Chung Changhyun and Motomu Nakashima at the Tokyo Institute of Technology have developed a robot that can faithfully reproduce a swimmer’s whole-body motion while measuring water resistance.  Called the SWUMANOID (Swimming Humanoid), its results are expected to be presented at the Aero Aqua-Biomechanisms Symposium (ISABMEC 2012) in Taiwan this August.

Although swimming is a popular sport, there’s still much to be discovered.  Normally researchers analyze video footage of a real swimmer, but the problem is repeatability.  The SWUMANOID can perform exactly how the researchers want it to, allowing them to repeat tests or make slight adjustments to better understand water resistance and propulsion.  Furthermore, the robot can wear a swimsuit to determine its impact – facilitating the development of performance-enhancing swimwear.

To create the robot, the researchers first performed a 3D body scan of a real person.  A 1/2-scale model was built using 3D-printed parts.  The robot was then outfitted with 20 waterproof motors and programmed with the motions needed to reproduce a realistic crawl, breaststroke, backstroke, butterfly, and even dog paddling and treading water.

Due to its smaller size, the robot is slower than a real person.  It takes two minutes thirty-six seconds to swim a hundred meters.  In the future, the team would like to build a life-sized robot that will have even more degrees of freedom to better model real swimming.
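Working the reported numbers: the 1/2-scale robot covers 100 m in 156 s. As a rough comparison one can apply Froude-number scaling (speed proportional to the square root of body length for geometrically similar swimmers) — a convention common in swimming research, though the lab itself isn't quoted using it here, so treat the second figure as an illustrative assumption:

```python
import math

# SWUMANOID's reported swim: 100 m in 2 min 36 s
distance_m = 100.0
time_s = 2 * 60 + 36            # 156 s
robot_speed = distance_m / time_s

# Froude-number scaling: for a geometrically similar swimmer, speed
# scales with sqrt(body length).  At 1/2 scale, the dynamically
# equivalent full-size speed is sqrt(2) times the robot's speed.
scale = 0.5
equivalent_full_size_speed = robot_speed / math.sqrt(scale)

print(f"robot speed: {robot_speed:.2f} m/s")                     # ~0.64 m/s
print(f"Froude-equivalent full-size speed: "
      f"{equivalent_full_size_speed:.2f} m/s")                   # ~0.91 m/s
```

Even after scaling up, that is well short of an elite crawl, which is consistent with the team's plan to build a life-sized successor.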


[embedded video]

[source: TITECH Nakashima Lab (JP)] via [MyNavi News (JP)]

• ApriPetit

Toshiba’s R&D Center has unveiled a new version in their line of prototype household robots at ROBOMEC 2012 (May 27th~29th). The aptly-named ApriPetit is almost half the size of the previous prototype (see 2008′s ApriPoco), standing just 15 cm (5.9 inches) tall.  It’s so small that you can easily hold it in the palm of your hand, so Toshiba is describing it as a portable robotic interface.  That kind of close interaction has the side effect of facilitating better speech recognition.  It can banter with you using a combination of speech recognition and text-to-speech software.  Although the technology hasn’t been commercialized, the prototypes have found their way into a number of research labs.

At various times university researchers have toyed with adding mobility to the Apri line of robots (even adding serving trays to carry foodstuffs), but this one has been designed as a stationary tabletop robot.  It has three degrees of freedom (torso x1, head x2) allowing it to swivel and nod its head, giving it simple but endearing expressions.

Given that this is a communication robot (one possible use is in home care reminding the elderly to take their medication), eye contact is considered very important.  The ApriPetit’s enormous eyes contain functional stereo cameras, which can find and recognize faces and determine how close they are.  The ApriPoco used a distance sensor to help determine a person’s proximity, but this has not yet been implemented in ApriPetit (perhaps due to its compact form).  Other companies have explored similar devices, such as NEC’s PaPeRo-mini and RayTron’s Chapit.
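How a stereo camera pair yields distance can be sketched with the standard rectified-stereo relation Z = f·B/d. The function and numbers below are illustrative assumptions for a small robot like this, not Toshiba's actual implementation:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from a rectified stereo pair: Z = f * B / d.

    focal_px    : focal length in pixels
    baseline_m  : distance between the two cameras in meters
    disparity_px: horizontal pixel shift of the same feature
                  (e.g. a detected face) between the two images
    """
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the views")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers only: a tiny camera with f = 500 px and a
# 4 cm baseline sees a face with 25 px of disparity at 0.8 m.
print(stereo_depth(500, 0.04, 25))  # 0.8
```

The short baseline a palm-sized robot allows is the usual weakness of this approach: disparity shrinks with distance, so depth estimates are only reliable up close — which happens to suit a tabletop companion.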

Toshiba is considering possible applications including supplying the robot to cloud service providers, allowing various services to be piped to the home through the robot.  It’s great to see that despite some 10 years of research Toshiba hasn’t given up on their “Advanced Personal Robots with Intelligence”.  However, I must admit that I prefer ApriPoco’s overall shape and its cute little arms, which together make it look a bit like a baby turtle.

[source: Robonable (JP)]

ICRA 2012 Plenary Session: Outline of HUBO 2

Note: there are currently 7 HUBO 2s in the USA

At ICRA 2012 Dr. Jun Ho Oh (KAIST, Hubo Lab) gave a revealing presentation on the development of his full-size humanoid robots. The latest, HUBO 2, represents 10 years of research and is now commercially available to anyone with $400,000 to spare (around a dozen units have already been sold to universities in the United States, Singapore, China, and South Korea). It’s great to sneak a peek inside his lab when so many other humanoid projects are so secretive.  The presentation can be viewed in its entirety (about an hour), but we thought we’d break it down into some of the more exciting bits for you here.

Dr. Oh states up front that his work is to build the hardware, not the software, of humanoid robots. This somewhat lopsided approach is something that many Western pundits have criticized, but to me it only seems natural (if not essential) if you plan to build a highly competent platform like HUBO 2.  Software, such as speech recognition, can be quickly added to the robot’s repertoire since (for the time being) it offloads higher-level processes to external computers.  He begins with an overview of DARPA’s recently announced robotics challenge (perhaps suggesting HUBO 2 could participate) before briefly covering its evolution stemming from the KHR-1, KHR-2, Albert HUBO, and HUBO FX-1.

(It should be noted that, contrary to Dr. Oh’s offhand remark, the KHR-1 was not the first full-sized walking humanoid built outside of Japan: China’s Xianxingzhe (2000) preceded it, and the Technical University of Munich’s JOHNNIE was developed concurrently; there may also be others that aren’t coming to mind.)

The locations of the elbow and knee joints are noteworthy, having been offset by a few more degrees than is typical. They look slightly odd as a result, but their location allows the limbs to bend almost as much as their elastic human ancestors. This flexibility means HUBO 2 can bend down and pick up objects from the floor without much difficulty. It is also revealed that the legs have multiple motors per joint for added power.

DARPA’s ARM Program Enters Phase 2

Back in 2010 DARPA announced the ARM (Autonomous Robotic Manipulation) Program, which has the ambitious goal of solving a number of complex grasping and manipulation challenges.  Plastic Pals was one of the first websites to report on this program (read more about ARM’s objectives in our original article here).

The four-year program has entered its second phase, having moved from a single-armed robot to a dual-armed version built by RE2 using components by Barrett (7-DOF WAM arms, force-torque sensors in its wrists, and three-fingered hands with pressure sensitivity).  The ARM Robot (affectionately called “Robbie” despite the popularity of “Oliver” in an online vote) has a face only a mother could love, containing a BumbleBee2 stereo camera, a Prosilica high-resolution camera, an SR4000 Swiss Ranger infrared camera, and microphones.

Its next task will be to change a tire on a small car; whether that means a Mini Cooper or some sort of scale model is unclear.  By the end of the program, DARPA hopes the robot will be capable of executing these sorts of tasks autonomously with humans verbally commanding the robot to do what they want.  In the original program brief, actions included the unpinning and tossing of a grenade, so things are bound to get exciting.  Whether or not DARPA plans to combine these results with those of their other humanoid robotics challenge (which has a similar deadline) is still unknown.

Currently the robot appears to rely on special markers to find and recognize objects, but in the future it won’t have such conveniences.  For now, it seems the ARM robot still lags behind other robots as it reinvents the wheel.  Take, for example, the similarly named ARMAR-III (developed at the University of Karlsruhe, Germany), which is able to find and grasp a wide assortment of household objects inside of a mock kitchen.  It analyzes and solves complex manipulation tasks (e.g. loading and unloading a dishwasher) using its OpenGrasp software toolkit.  And unlike the stationary Robbie with its gangly Barrett components, ARMAR-III is fully mobile and has more human-like proportions and hands.

Watch IEEE Spectrum get up close and personal with Robbie in the following video:

[embedded video]

[source: ARM Project] via [IEEE Spectrum]

• MH-2 Wearable Communication Robot

Our friends at IEEE Spectrum continue to cover ICRA 2012, and they’ve scored another scoop on one of the projects we were most excited to see.  The MH-2 (Mini Humanoid) Wearable Communication Robot was presented by Yuichi Tsumaki, Fumiaki Ono, and Taisuke Tsukuda of Yamagata University’s Telerobotics Lab.

The Telerobotics Lab has played with the idea of wearable humanoid robots for the past decade.  Earlier we looked at their T1 Telecommunicator, which featured a very simple humanoid that sat on your shoulder and could wave its arms and look around.  The idea is that the carrier takes the robot avatar (and by proxy, its operator) with them wherever they go, providing them a form of tele-existence.  Others have explored similar ideas, such as ATR’s Elfoid and MIT’s MeBot, but those are more like a cellphone that you would call than a robot avatar that is constantly on.

The operator of the robot controls its actions through a simple motion-capture set-up (using a Kinect sensor, for example), or through a GUI on a computer.  And the MH-2 is equipped with a stereo camera rig that allows the operator to see the carrier’s surroundings in immersive 3D while carrying on conversations through a microphone and speaker.  Let’s say you want to take a distant or bedridden friend on a tour: they can “jack in” to the robot and experience your surroundings as you show them around.


[embedded video]

As you can see in the video, the MH-2 is capable of much more sophisticated movements than comparable robots thanks to its 20 degrees of freedom, allowing for more natural body language to be conveyed through its diminutive upper body.  Each arm has 7 degrees of freedom (including a nifty 3-DOF wrist), 3 in its neck, 2 in its body, and (rather remarkably for a robot) an extra DOF that causes its chest to look as though it is inhaling and exhaling.
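The DOF budget quoted above does tally to the claimed 20; a trivial accounting, with joint-group names paraphrased from the text:

```python
# Degrees of freedom of the MH-2, as reported
dof = {
    "right arm (incl. 3-DOF wrist)": 7,
    "left arm (incl. 3-DOF wrist)": 7,
    "neck": 3,
    "body": 2,
    "breathing chest": 1,
}
total = sum(dof.values())
print(total)  # 20
```

(The later figure of 22 actuators is consistent with this: a joint can be driven by more than one motor.)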

Similar to Samsung’s miniature humanoid April, the MH-2 offloads the bulk of its 22 actuators through a bundle of wires.  They lead to a backpack worn by the carrier which looks a bit cumbersome, but seems a bit less so than the T1′s configuration.  Currently the robot relies on a human carrier, but it would be equally at home on its own mobile base.  In that case, the operator would take control of navigation and the robot would function as a standard telepresence robot rather than a wearable one.

[source: Tsumaki Telerobotics Lab (JP)] via [IEEE Spectrum]

China’s First Self-Balancing Unicycle Robot

Back in November 2011 Prof. Xiao-Gang Ruan and six graduate students from the Artificial Intelligence and Robotics Research Institute of the Beijing University of Technology demonstrated a cute self-balancing unicycle robot at the Beijing museum.  It was one of a handful of student projects on display at the Creative Bazaar during the sixth annual Beijing International Cultural and Creative Industry Expo.  Many of the projects on display, such as single-passenger electric vehicles, emphasized environmental concerns.  Other projects included a saucer-shaped drone aircraft, and art installations.

Although it is not the only self-balancing unicycle robot in the world (see Murata Girl), it is a first for China.  The robot is capable of balancing at a fixed point, can move forward and backward, and can compensate for external disturbances.  Its developers say the robot can also navigate a balance beam, but this feat was not part of its public demonstration.

For now it’s purely an entertainment robot, but it could be used as a guide, to transport goods, or to perform security patrols.  The look of the robot was produced in collaboration with Spring Design, a company which produces product designs for multinational corporations like Panasonic (see some conceptual renderings after the break).

[sources: Beijing Today (CN)] & [CNWest (CN)] & [Xinhua (CN)]