It’s been a few months since we caught our first glimpse of Diego-san, a new baby robot being developed by UC San Diego, Kokoro Co. Ltd., and Hanson Robotics. I contacted Dr. Javier Movellan at UCSD’s Machine Perception Lab, and he was kind enough to answer my questions about the project. Dr. Movellan has been at the forefront of social robotics for many years, having spent almost 20 years working with Dr. Marian Bartlett and Dr. Gwen Littlewort on lip reading, face recognition, and mood detection. The Machine Perception Lab has developed one of the world’s best real-time facial expression recognizers. He also developed the RUBI robot, which interacted with 18-month-old children, and worked with the SONY QRIO as one of the heads of the RUBI Project.
Plastic Pals: Could you give us an overview of the Diego-san project and its intended goals?
Dr. Javier Movellan: This is a project funded by the National Science Foundation. Its main goal is to try and understand the development of sensory motor intelligence from a computational point of view. It brings together researchers in developmental psychology, machine learning, neuroscience, computer vision and robotics. Basically we are trying to understand the computational problems that a baby’s brain faces when learning to move its own body and use it to interact with the physical and social worlds.
Diego-san was developed to approximate the complexity of a human body, including the use of actuators with dynamics similar to those of human muscles. We are doing motion capture to understand how infants learn to reach, point, and catch objects, and how moms contribute to the learning process. Engineers have traditionally simplified the robot control problem by using stiff actuators with high gear ratios (which decouples the dynamics, so one can ignore centripetal and Coriolis forces), by using actuators with very short time constants, and by reducing the number of degrees of freedom. This is not the approach the brain uses: the number of degrees of freedom is very large, the actuators have long time constants, and they are compliant. We feel there is a lot to be learned by throwing ourselves into the problem of a robot (Diego-san) that has been designed to be very difficult to control with traditional control approaches.
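To make his point about gear ratios concrete, here is a rough back-of-the-envelope sketch (the numbers are purely illustrative, not Diego-san’s actual parameters): the motor rotor inertia reflected through a gearbox scales with the square of the gear ratio, so at high ratios it swamps the configuration-dependent coupling, centripetal, and Coriolis terms, and each joint can be treated as if it were independent.

```python
# Illustrative numbers only -- not Diego-san's real parameters.
rotor_inertia = 1e-4   # kg*m^2, motor rotor inertia
link_term = 0.5        # kg*m^2, a typical configuration-dependent coupling term

results = {}
for N in (1, 10, 100):
    reflected = rotor_inertia * N ** 2   # rotor inertia as seen at the joint
    results[N] = reflected / link_term   # how much it dominates the coupling
    print(f"gear ratio {N:>3}: reflected inertia = {reflected:.4f} kg*m^2 "
          f"({results[N]:.1%} of the coupling term)")
```

At a ratio of 100 the reflected inertia is already twice the coupling term, which is why highly geared robots can get away with simple per-joint controllers, and why a direct, compliant design like Diego-san’s cannot.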
We already know that the robot is around 130cm (4’3″) tall and weighs approximately 30kg (66 lbs), but what are Diego-san’s specifications?
The face is still under construction; the plan is for it to have 27 degrees of freedom. The body has 44 joints, each driven by an agonist and an antagonist actuator, thus allowing for compliance control. The actuators are pneumatic pistons controlled using independent proportional control valves for each chamber of each piston. Each joint has a potentiometer to encode joint angle and two pressure sensors, one per piston chamber. The eyes have synchronized cameras, the ears have a microphone, and there is a loudspeaker in the throat.
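For readers unfamiliar with compliance control, a minimal sketch shows why an antagonistic pair of pistons with per-chamber valves is so useful (this is a simplified linear model with made-up constants, not the lab’s actual controller): the pressure difference across the pair sets the net joint torque, while the pressure sum (co-contraction) sets the stiffness.

```python
# Simplified linear model of one antagonistic pneumatic joint.
# PISTON_AREA and MOMENT_ARM are assumed constants for illustration.

PISTON_AREA = 5e-4   # m^2 (assumed)
MOMENT_ARM = 0.03    # m (assumed)

def joint_torque(p_agonist, p_antagonist):
    """Net torque (N*m), set by the pressure *difference* (Pa)."""
    return (p_agonist - p_antagonist) * PISTON_AREA * MOMENT_ARM

def co_contraction(p_agonist, p_antagonist):
    """Pressure *sum* (Pa): raising it stiffens the joint
    without changing the net torque."""
    return p_agonist + p_antagonist

# Two pressure pairs with the same net torque but different stiffness:
soft = joint_torque(2e5, 1e5)    # low co-contraction -> compliant joint
stiff = joint_torque(5e5, 4e5)   # high co-contraction -> stiff joint
```

This is the same trick our muscles use: co-contracting an agonist–antagonist pair stiffens a limb without moving it, which is exactly the kind of behavior a stiff, highly geared robot cannot reproduce.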
The control hardware is based on standard National Instruments cards and we are developing our own control software.
Sounds somewhat similar to the CB2 robot. How did you come to work with Kokoro Co. Ltd. on the robot? Was this in part motivated by the robots of Dr. Ishiguro (Geminoids)? And what is it like working with Kokoro?
I visited Ishiguro’s lab back in 2004 for several months. I was very impressed by Kokoro’s expertise building pneumatic robots. Working with Kokoro has been an amazing partnership. Their craftsmanship is impressive; the robot mechanics are really a piece of art that some day should go to a museum. They are pushing the limits of this technology. A good example is the hands, which are quite tricky to design using pneumatics. We ran into problems, but they never gave up. They kept outdoing themselves, redesigning the hands from scratch until we found a design that works really well. They were also willing to partner with other companies that have complementary expertise, like Hanson Robotics. So far it has been a great adventure.
How would you compare your robot and its research goals with something like the iCub? Was the iCub ever a potential platform for this research?
We have very similar goals to the iCub program in the EU and the Asada program in Japan. We did indeed consider the iCub as a platform, but a very important issue for us was to have pneumatic actuators, so we could approximate the compliance of the human body and address the issue of compliance control. Diego-san is based on the CB2 robot from the Asada program; you can think of it as the second generation of that robot. The experience with CB2 helped us make changes that improved upon the original design and fit our goals better.
This year there have been a slew of cognitive robotics projects unveiled, including robots such as M3-Neony, and Noby, which are also designed to study brain development. Has there been any “cross-pollination” between your lab and those of Dr. Ishiguro or Dr. Asada (Osaka University) or other universities?
Definitely. Actually a very important point of contact is the International Conference on Development and Learning (ICDL) which typically brings us together to discuss progress on developmental robotics. Last year I had researchers from the Asada lab working with us for several months. I also regularly send students to Ishiguro’s lab.
Those projects, as well as development being done on the iCub, are beginning to include new technologies like sensitive skin. I imagine the sense of touch is as important, if not more so, than other senses. Are there any plans of working with this sort of technology on Diego-san?
Yes, touch is a critical technology. Eventually we will incorporate it in Diego-san but for now we felt that our limited budget was better spent on other issues, like using air pressure sensors so we can better control the robot and realistic facial expressions, so we can better understand the interplay between the development of social and physical skills.
You mentioned Hanson Robotics (see Albert HUBO) is working on the project – in what capacity?
Yes. David Hanson is developing the external appearance of the face and the facial expressions.
Interesting. When can we expect you to publish your findings?
The robot is not finished but we are already starting to publish things. So far we have published work on system identification and control of the robot, computational analysis of social interaction between infants and caregivers, and systems for Diego-san to learn on its own to move its head so as to find objects of interest and to make its own facial expressions.
As is often the case with realistic humanoid robots, the general reaction to the first few photos of the robot was somewhat negative, prompting you to respond; you even suggested you might incorporate input from blog readers. I don’t feel you should change your approach due to a few negative comments, though I might recommend covering the body’s mechanisms so that the body and head are in proportion to one another.
The pictures were taken as part of my visit to Kokoro in Tokyo to inspect progress. They were not pictures of the finished prototype; they did not show the external shell (which is being developed here at UCSD), the final face, or the final hands. However, the very strong negative reaction from the blogs was useful, and we did listen to it. When people write in blogs they feel more free to be like babies and write their first unadulterated impressions. These first impressions are important and very valuable.
(laughs) Ain’t that the truth. Although from my experience it doesn’t seem to matter what a robot looks like since there are always the usual sort of negative comments. Even the cute, toy-like robots like Murata Boy (the bicycling robot) are called “creepy”. So when can we expect to see the finished robot?
We are having a workshop on November 10, which will bring together the group in Miami working on motion capture of infant-mother interaction, our group, Kokoro, and Hanson robotics. If all goes well we are planning to have a press release by then.
Sounds great! That means there’s at least one more major humanoid robot project announcement to look forward to before the end of the year. Thank you very much for your time!
To learn more about the research being done at UCSD’s Machine Perception Lab, and the Institute for Neural Computation, visit their official websites. And if you can set aside an hour and a half, I would highly recommend you watch Dr. Movellan’s fascinating lecture on the RUBI Project, which starred the SONY QRIO intermingling with toddlers.
Video (or watch it at The Science Network):