
Eva

This past weekend I watched Eva, a Spanish film that features an interesting depiction of robots in daily life.  It stars Daniel Brühl (probably best known as the lovestruck German sniper in Tarantino’s Inglourious Basterds) as Alex, a roboticist who returns to his hometown after a ten-year absence to finish work on an android project.

The film is beautifully acted by Brühl and the entire cast, and has good (if not great) special effects for the robots.  In particular, Alex is provided a live-in android named Max (wonderfully played by Lluís Homar) and a nifty robot cat that follows him around.  Other robots, some seemingly based on BigDog and droid-like service robots, roam the background, while cars zip around with an electrical hum.  Programming genuine people personalities seems a breeze thanks to an ornamental holographic display controlled with an unseen motion-capture device.

Ok, things are about to get a bit spoilery, so be forewarned.  The technology presented feels quite plausible, with one exception: I doubt we will have androids that can pass for human as easily as Max does within the next thirty years.  That said, the way the script inserts some of its sci-fi elements is half the fun.  Max’s emotional intelligence level can be adjusted to suit his master’s mood, since he is not a “free” robot (which is illegal in this world).  Early on, the film establishes an Asimov-like protocol for permanently shutting down A.I.s, which involves a simple spoken command.

Videos: NAO Dances, Real King Kizer Karate Chops & More

They may not be the smoothest robot dance moves we’ve seen, but this must have taken a while to put together.  It’s a six-minute-long choreographed dance routine for NAO based on the viral video “The Evolution of Dance”.  Perhaps it should be titled the Humanoid Hustle?
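Routines like this are typically built as timed joint keyframes.  Here’s a minimal sketch using the NAOqi Python SDK (Python 2); the robot’s address and the poses are invented for illustration, not taken from the actual routine:

```python
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"  # assumed address; substitute your robot's own
motion = ALProxy("ALMotion", ROBOT_IP, 9559)
posture = ALProxy("ALRobotPosture", ROBOT_IP, 9559)

posture.goToPosture("StandInit", 0.5)  # start from a stable stance

# Each joint gets a list of target angles (radians) and the times
# (seconds) at which to reach them; NAOqi interpolates in between.
names  = ["LShoulderPitch", "RShoulderPitch", "HeadYaw"]
angles = [[-1.0, 1.4, 1.4],    # left arm swings up, then back down
          [1.4, -1.0, 1.4],    # right arm mirrors it out of phase
          [0.5, -0.5, 0.0]]    # head sweeps side to side
times  = [[1.0, 2.0, 3.0]] * 3

motion.angleInterpolation(names, angles, times, True)  # absolute angles
```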


[source: TheAmazel @ YouTube] via [ObiJerome @ Twitter]

TELESAR V Avatar Transfers Touch, Vibration, Temperature

The Japan Science and Technology Agency (JST) and Keio University’s Tachi Lab presented their newest telexistence system, TELESAR V, at SIGGRAPH 2012 Emerging Technologies.  The avatar system’s ability to relay tactile sensations has improved since its original reveal in November 2011.  In a world first, a human operator can now distinguish whether the robot is handling cloth or paper by touch alone.

Of course, the operator gets much more than just tactile feedback as part of the experience: a head-mounted display provides a high-definition wide-angle view from the robot’s perspective, along with stereo sound.  The robot avatar’s body has a total of 54 degrees of freedom (3 in the head, 5 in the trunk, 7 in each arm, and 16 in each hand and its fingers), which allow it to accurately mimic the operator’s movements via a motion-capture system.
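The degree-of-freedom tally works out as shown below, and the mimicry itself amounts to streaming the operator’s captured joint angles to the matching joints on the avatar.  A simplified sketch, with hypothetical interfaces standing in for the real (unpublished) APIs:

```python
# DOF tally for TELESAR V, per the figures above:
# head (3) + trunk (5) + two arms (7 each) + two hands (16 each)
total_dof = 3 + 5 + 2 * 7 + 2 * 16
assert total_dof == 54

def retarget_step(mocap, robot, joint_names):
    """One control tick: mirror the operator's pose onto the avatar.

    `mocap` and `robot` are hypothetical stand-ins for the real
    motion-capture and robot interfaces, which are not public.
    """
    angles = mocap.read_joint_angles(joint_names)  # operator's current pose
    robot.set_joint_targets(joint_names, angles)   # command the avatar
```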

Mind-Controlled Miniature Mechanical Manservants?

How would you like to control a robot using your thoughts alone?  A research program that brings together thirteen universities and institutes across Europe is using functional magnetic resonance imaging (fMRI) in an attempt to do just that.  By detecting changes in blood flow to various parts of the brain, the machine can detect thought patterns that are then used to control the robot.  It’s all part of the very sci-fi-sounding but very real Virtual Embodiment and Robotic Re-embodiment project.

So far the researchers are only decoding enough of the signal to move a Fujitsu HOAP-3 (a miniature humanoid robot platform) left, right, and forward.  The robot is self-balancing, and its walking gait has been programmed in advance, so triggering one of the three movements is essentially like flipping a switch.  The system could be adapted to larger humanoids relatively easily.
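In other words, the brain-computer interface boils down to a three-way classifier gating preset motions.  A rough sketch of that control loop; the region-of-interest decoding and the robot interface here are illustrative guesses, not the project’s actual pipeline:

```python
import numpy as np

COMMANDS = ["left", "right", "forward"]

def decode_command(volume, roi_masks):
    """Pick the command whose associated brain region is most active.

    `volume` is one fMRI scan as a numpy array; `roi_masks` holds one
    boolean mask per command, marking which voxels to average.
    """
    activations = [volume[mask].mean() for mask in roi_masks]
    return COMMANDS[int(np.argmax(activations))]

def control_step(volume, roi_masks, robot):
    # Each decoded command just triggers a preprogrammed gait on the
    # self-balancing HOAP-3, which is why it resembles flipping a switch.
    robot.trigger_gait(decode_command(volume, roi_masks))  # hypothetical API
```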

The signals can be sent over long distances (in this case, from Israel to France) to control the robot with only minor lag.  In turn, the robot’s onboard camera sends first-person video back to its operator, giving them a telepresence (or avatar-like) experience.

AndyVision: Retail in the Year 2020?

AndyVision wanders through the aisles of a store

Researchers at the Intel Science and Technology Center (Carnegie Mellon University) are working on an inventory-assistance robot called AndyVision.  The robot would assist retailers by keeping track of product shelves, noting when items need to be restocked and when something has been misplaced.  It would also guide both staff and customers to specific items on demand using its map of the store and knowledge of its products.  According to Professor Priya Narasimhan, the robot’s computer vision system is better than using RFID tags, which have to be attached to products by hand and which have trouble with metallic shelving.
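Conceptually, each inventory pass compares what the camera recognizes on a shelf against what the store’s planogram says should be there.  A toy sketch of that matching step; the recognizer and data layout are assumptions, not AndyVision’s published internals:

```python
def audit_shelf(shelf_image, shelf_id, planogram, recognizer):
    """Compare recognized products against what the shelf should hold.

    `planogram` maps shelf IDs to the set of expected product IDs;
    `recognizer` is a hypothetical vision model returning product IDs.
    """
    expected = planogram[shelf_id]
    seen = set(recognizer.detect(shelf_image))

    restock = expected - seen    # expected but missing: flag for restocking
    misplaced = seen - expected  # present but unexpected: misshelved item
    return restock, misplaced
```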

By 2020, the group hopes to transform the retail experience with robots that can not only keep track of inventory, but also fold clothing, restock shelves, and help you carry your bags to the car.  The problem will be convincing stores to drop $20,000 to $30,000 (roughly what Mitsubishi’s Wakamaru, a comparable robot platform, retailed for) on something that may not always function as advertised.

Beijing Institute of Technology Unveils BHR-4 & BHR-5

A pair of BHR-5s can rally up to 200 times without error

Late last year, researchers at Zhejiang University in China unveiled a pair of ping-pong playing robots.  Last week, rival researchers at the Beijing Institute of Technology fought back with the fourth and fifth generations of their humanoid robots at the “National Hi-Tech Industrial Development Zone” exhibition.

The BHR-4 is a departure from the others, being a realistic android that wears human clothing.  Modeled on one of the researchers, it has an animatronic face capable of expressing a variety of emotions on demand, including surprise, fear, and happiness.  Thanks to all the moving parts in its face (eyes, eyelids, eyebrows, mouth, and cheeks), the robot has a total of 43 degrees of freedom.  It stands 170 cm (5’7″) tall and weighs 65 kg (143 lbs).

And unlike the Geminoids built in Japan, this android has a fully actuated body, allowing it to stand up and perform tai chi exercises.  It also held simple conversations with attendees thanks to speech recognition and speech synthesis.  When asked, “What do you like to eat?” it replied, “We robots do not need to eat.”

U of Arizona Researchers Build Bipedal Robot

Dr. M. Anthony Lewis, Director of the Robotics and Neural Systems Lab at the University of Arizona, and PhD student Theresa J. Klein have been working on a biarticulate muscle leg model.  In a paper published in 2008 (available at the lab’s website), they describe how motors pull on stiff, tendon-like Kevlar straps to reproduce the action of key muscle groups.
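The defining property of a biarticulate (two-joint) muscle is that a single tendon produces torque at both joints it crosses, coupling the hip and knee.  A minimal sketch of that relationship, with moment arms invented purely for illustration:

```python
# Torque from a tendon-driven "muscle" is its tension times its moment
# arm at each joint it crosses.  All values here are made up.
MOMENT_ARMS = {  # meters, at (hip, knee); 0.0 means the joint isn't crossed
    "hip_flexor":            (0.04, 0.00),
    "knee_extensor":         (0.00, 0.03),
    "biarticular_hamstring": (-0.05, -0.03),  # crosses both joints
}

def joint_torques(tensions):
    """Sum each muscle's contribution at the hip and knee (N·m)."""
    hip = sum(t * MOMENT_ARMS[m][0] for m, t in tensions.items())
    knee = sum(t * MOMENT_ARMS[m][1] for m, t in tensions.items())
    return hip, knee

# Tensioning only the hamstring strap acts on both joints at once:
print(joint_torques({"biarticular_hamstring": 100.0}))  # (-5.0, -3.0)
```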

Their new biped robot features an improved leg design that models even more muscles.  And it’s already walking (though it relies on a baby-walker-like support for balance).  It stands 55 cm (22″) tall with its legs fully extended and weighs approximately 4.5 kg (10 lbs).

EveR-4 Appears at Expo 2012 Yeosu Korea

The Korea Institute of Industrial Technology’s (KITECH) Robotics Fusion Research Group has unveiled the fourth generation in its line of gynoids, known as EveR-4, at the robotics pavilion of Expo 2012 Yeosu Korea.  Dr. Dong-Wook Lee and a colleague have been developing the female androids since 2005, and Lee suggests this version is capable of more realistic expressions thanks to a new artificial tongue.

EveR-4 stands 180 cm (5’11″) tall and was designed to serve as a receptionist, giving speeches and interacting with people.  It attempts to replicate the complex assortment of muscles in the human head with no fewer than 30 motors, which Dr. Lee says is a world record.  He admits that there isn’t much demand for androids at the moment, but says that could change in the future as theme parks around the world adopt Disney’s animatronic approach.
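As a rough illustration of how dozens of facial motors become “expressions”: each expression can be stored as a set of motor targets and blended over time.  The motor names and values below are invented for illustration, not KITECH’s actual control scheme:

```python
# Each expression is a stored pose: a target position (0.0-1.0) for
# every facial motor.  Names and values are hypothetical.
EXPRESSIONS = {
    "neutral":  {"brow_l": 0.5, "brow_r": 0.5, "eyelid_l": 0.8,
                 "eyelid_r": 0.8, "mouth_corner_l": 0.5, "mouth_corner_r": 0.5},
    "surprise": {"brow_l": 1.0, "brow_r": 1.0, "eyelid_l": 1.0,
                 "eyelid_r": 1.0, "mouth_corner_l": 0.4, "mouth_corner_r": 0.4},
}

def blend(pose_a, pose_b, alpha):
    """Linearly interpolate between two poses (alpha in [0, 1])."""
    return {m: (1 - alpha) * pose_a[m] + alpha * pose_b[m] for m in pose_a}

# Ramp from neutral to surprise over ten control ticks.
for i in range(11):
    targets = blend(EXPRESSIONS["neutral"], EXPRESSIONS["surprise"], i / 10.0)
    # send `targets` to the motor controllers here (hardware-specific)
```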

The gynoid may also find work as a theater “actress” through synchronized voice performance, facial expressions, lip sync, and body gestures.  Previous models (and those built in Japan) are used to study not only human-robot interaction but also the convergence of technology with the arts and humanities.  However, like most other androids, it cannot stand up or walk under its own power, and its spoken lines would likely be performed by a human, since artificial speech still leaves something to be desired.  Currently, only the HRP-4C developed by AIST uses Vocaloid software to produce a synthesized singing voice.

A small selection of photos follows after the break.

[sources: Hankyung, Daum (KR)]