
Plastic Pals Turns 3 Years Old Today!

Today is the third anniversary of the site, and we’d like to thank you for reading! We hope the site continues to meet your expectations.  Last year we held a robot figure giveaway, which we’d like to do again, though it may have to wait a little while.  We’ll announce it on our Facebook, Twitter, and Tumblr before the end of the year, so keep your eyes peeled!

To date we have added 516 robots to the site (give or take), and there’s plenty more where that came from! There are still dozens of older robots we plan to add, which will help complete the robot timeline.  Some of them, like the SONY SDR-3X and QRIO, haven’t been added yet simply because we have so much media on them to sort through and organize.

Besides a new logo (do you like it?), we’re also thinking of changing our permalink structure.  The problem is that changing it will break the existing URLs, and I’m not sure there’s an easy way to fix that.  It’s one of those things we didn’t know any better about when we started the site, and it has bothered us ever since.  In the future, we may also change the name of the website.  When we started three years ago, it was (and still is) very difficult to find a relevant domain name (Plastic Pals was not our first choice).  Unfortunately, lots of great domains are registered simply with the intent to sell them later.  Got any ideas for a new name?  Send them to us via our contact page!




BBC Horizon Features The ALEAR Project

It has been a while since we first learned of Myon, a cyclopean humanoid built for Humboldt University, Germany, in 2010. It is the main robot used in the multinational ALEAR project (Artificial Language Evolution on Autonomous Robots), which explores how “complex grammatical systems and behaviors can emerge in populations of robotic agents.” The project was recently featured in a BBC Horizon special called The Hunt for AI.

How language emerges and evolves is a fascinating subject, made all the more interesting by experiments in which robots build a shared vocabulary from the ground up. They do it in much the same way our ancient ancestors must have: by naming the actions they perform and the things around them. Experiments play out like a game in which a teacher and an observer interact.
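To give a sense of how such an experiment works, here is a minimal Python sketch of the classic naming game this line of research builds on. It is our own toy illustration, not code from the ALEAR project: agents repeatedly pair up, the teacher names a shared object (inventing a word if it has none), and on failure the observer memorizes the word, while success prunes both vocabularies down to the winning word.

```python
import random

OBJECTS = ["ball", "box", "cup"]  # things in the shared environment

def new_word():
    """Invent a random nonsense word, as an agent with no name for an object would."""
    return "".join(random.choice("aeioubdgklmst") for _ in range(5))

class Agent:
    def __init__(self):
        # each object maps to the set of candidate names this agent has heard or invented
        self.vocab = {obj: set() for obj in OBJECTS}

    def name_for(self, obj):
        if not self.vocab[obj]:
            self.vocab[obj].add(new_word())
        return random.choice(sorted(self.vocab[obj]))

def play_round(teacher, observer):
    obj = random.choice(OBJECTS)      # teacher picks a topic
    word = teacher.name_for(obj)      # and utters its name for it
    if word in observer.vocab[obj]:   # success: both agents align on the word
        teacher.vocab[obj] = {word}
        observer.vocab[obj] = {word}
        return True
    observer.vocab[obj].add(word)     # failure: observer learns the new word
    return False

agents = [Agent() for _ in range(10)]
for _ in range(5000):
    teacher, observer = random.sample(agents, 2)
    play_round(teacher, observer)

# after enough rounds the population typically converges on one name per object
for obj in OBJECTS:
    print(obj, {tuple(sorted(a.vocab[obj])) for a in agents})
```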

Conceptualizing The DARPA Robotics Challenge

The upcoming DARPA Robotics Challenge won’t star only Boston Dynamics’ PETMAN robot. A small selection of teams from around the world will be funded, to the tune of $2M USD over several years, to build their own custom platforms for the challenge. The deadline for proposals was in late May, and the teams are now learning whether or not they made the cut.

Unfortunately, the European team led by Icarus Technology‘s Davide Faconti wasn’t selected.  Having already completed two impressive full-sized humanoid robots, REEM-A and REEM-B, Davide was in a much better position than many teams from the United States to build a working robot for the challenge. I had full confidence in him and wanted to lend a hand if I could.  With his permission, I can now share the conceptual work I did for his team’s proposal.

Eva

This past weekend I watched Eva, a Spanish film that features an interesting depiction of robots in daily life.  It stars Daniel Brühl (probably best known as the lovestruck German sniper in Tarantino’s Inglourious Basterds) as Alex, a roboticist who returns to his hometown after a ten-year absence to finish work on an android project.

This film is beautifully acted by Brühl and the entire cast, and has good (if not great) special effects for the robots.  In particular, Alex is provided a live-in android named Max (wonderfully played by Lluís Homar) and a nifty robot cat that follows him around.  Other robots, some seemingly based on BigDog and droid-like service robots, roam the background while cars zip around with an electrical hum.  Programming genuine people personalities seems a breeze thanks to an ornamental holographic display controlled with an unseen motion-capture device.

Okay, things are about to get a bit spoilery, so be forewarned.  The technology presented feels quite plausible, with one exception: I doubt we will have androids that can pass for human as easily as Max does within the next thirty years.  That said, the way the script inserts some of its sci-fi elements is half the fun.  Max’s emotional intelligence level can be adjusted to suit his master’s mood, since he is not a “free” robot (which is illegal in this world).  Early on, the film establishes an Asimov-like protocol for shutting down an A.I. for good, which involves a simple spoken command.

Videos: NAO Dances, Real King Kizer Karate Chops & More

They may not be the smoothest robot dance moves we’ve seen, but this must have taken a while to put together.  It’s a six-minute choreographed dance routine for NAO based on the viral video “The Evolution of Dance”.  Perhaps it should be titled the Humanoid Hustle?

Video:


[source: TheAmazel @ YouTube] via [ObiJerome @ Twitter]

TELESAR V Avatar Transfers Touch, Vibration, Temperature

The Japan Science and Technology Agency (JST) and Keio University’s Tachi Lab presented their newest telexistence system, TELESAR V, at SIGGRAPH 2012 Emerging Technologies.  The avatar system’s ability to relay tactile sensation has improved since its original reveal in November 2011.  In a world first, a human operator can now tell whether the robot is handling cloth or paper by touch alone.

Of course, the operator gets much more than just tactile feedback as part of the experience: a head-mounted display provides a high-definition wide-angle view from the robot’s perspective, along with stereo sound.  The robot avatar’s body has a total of 54 degrees of freedom (3 in the head, 5 in the torso, 7 in each arm, and 16 in each hand and its fingers), which allow it to accurately mimic the operator’s movements via a motion-capture system.
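The quoted joint counts do add up: 3 + 5 + 2×7 + 2×16 = 54. Below is a toy Python sketch of the mimicry idea using those numbers; the joint names, limits, and retarget function are our own illustrative stand-ins, not the Tachi Lab’s actual interface.

```python
# Published DOF breakdown: head x3, torso x5, 7 per arm, 16 per hand.
DOF = {"head": 3, "torso": 5, "left_arm": 7, "right_arm": 7,
       "left_hand": 16, "right_hand": 16}
assert sum(DOF.values()) == 54

def clamp(value, lo, hi):
    """Keep a commanded angle inside the robot's mechanical limits."""
    return max(lo, min(hi, value))

def retarget(operator_angles, joint_limits):
    """Map motion-captured operator joint angles onto the robot avatar."""
    return {joint: clamp(angle, *joint_limits[joint])
            for joint, angle in operator_angles.items()}

# Illustrative numbers only: one shoulder joint, limits in radians.
print(retarget({"left_arm_1": 2.9}, {"left_arm_1": (-2.6, 2.6)}))  # {'left_arm_1': 2.6}
```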

Mind-Controlled Miniature Mechanical Manservants?

How would you like to control a robot using your thoughts alone?  A research program that brings together thirteen universities and institutes across Europe is using functional magnetic resonance imaging (fMRI) in an attempt to do just that. By detecting changes in blood flow to various parts of the brain, the machine can detect thought patterns, which are then used to control the robot.  It’s all part of the very sci-fi-sounding but very real Virtual Embodiment and Robotic Re-embodiment project.

So far the researchers can only decode enough signals to move a Fujitsu HOAP-3 (a miniature humanoid robot platform) left, right, and forward. The robot is self-balancing, and its walking gait has been programmed in advance, so issuing a command is essentially like hitting a switch to activate one of the three movements.  The system could be adapted to larger humanoids relatively easily.
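In other words, the decoder only has to pick one of three discrete commands. A toy nearest-pattern classifier in Python captures the “hitting a switch” idea; the activation vectors and function names below are purely illustrative, not the project’s actual pipeline.

```python
import math

COMMANDS = ("left", "right", "forward")

# Hypothetical per-command activation patterns, as if learned during calibration.
centroids = {
    "left":    [0.9, 0.1, 0.2],
    "right":   [0.1, 0.9, 0.2],
    "forward": [0.2, 0.2, 0.9],
}

def decode(activation):
    """Return the command whose calibration pattern is closest to the sample."""
    return min(COMMANDS, key=lambda c: math.dist(activation, centroids[c]))

def execute(command):
    # The real HOAP-3 runs a pre-programmed, self-balancing gait; we just log.
    print(f"triggering pre-programmed gait: {command}")

execute(decode([0.85, 0.15, 0.3]))  # triggering pre-programmed gait: left
```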

The signals can be sent over long distances (in this case, from Israel to France) to control the robot with only minor lag. In turn, the robot’s onboard camera sends first-person video to its operator, giving them a telepresence (or avatar-like) experience.

AndyVision: Retail in the Year 2020?

AndyVision wanders through the aisles of a store

Researchers at the Intel Science and Technology Center (Carnegie Mellon University) are working on an inventory-assistance robot called AndyVision.  The robot would assist retailers by keeping track of product shelves, noting when items need to be restocked and when something has been misplaced.  It would also guide both staff and customers to specific items on demand using its map of the store and knowledge of its products.  According to Professor Priya Narasimhan, the robot’s computer vision system is better than using RFID tags, which have to be attached to products by hand and which have trouble with metallic shelving.
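As a rough illustration of the inventory-auditing idea (the vision pipeline itself isn’t public, so detect_items() below is a stand-in), the robot’s core job reduces to diffing what it sees against what the store’s planogram expects:

```python
# What the store expects on each shelf (a "planogram"); names are hypothetical.
planogram = {"shelf_3": {"cereal_a": 6, "cereal_b": 4}}

def detect_items(shelf_id):
    """Stand-in for the computer-vision system's per-shelf item counts."""
    return {"cereal_a": 2, "soup_c": 1}  # hypothetical observation

def audit(shelf_id):
    """Flag items needing restock and items that don't belong on this shelf."""
    seen = detect_items(shelf_id)
    expected = planogram[shelf_id]
    restock = {item: want - seen.get(item, 0)
               for item, want in expected.items() if seen.get(item, 0) < want}
    misplaced = [item for item in seen if item not in expected]
    return restock, misplaced

print(audit("shelf_3"))  # ({'cereal_a': 4, 'cereal_b': 4}, ['soup_c'])
```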

By 2020, the group hopes to transform the retail experience with robots that can not only keep track of inventory, but also fold clothing, restock shelves, and help you carry your bags to the car.  The problem will be convincing stores to drop $20,000–$30,000 (Mitsubishi’s Wakamaru, a comparable robot platform, retailed for around that much) on something that may not always work as advertised.