
TELESAR V Avatar Transfers Touch, Vibration, Temperature

The Japan Science and Technology Agency (JST) and Keio University’s Tachi Lab presented their newest telexistence system, TELESAR V, at SIGGRAPH 2012 Emerging Technologies.  The avatar system’s ability to relay tactile sensation has improved since its original reveal in November 2011.  In a world first, a human operator can now distinguish by touch alone whether the robot is handling cloth or paper.

Of course, the operator gets much more than tactile feedback: a head-mounted display provides a high-definition, wide-angle view from the robot’s perspective, along with stereo sound.  The robot avatar’s body has a total of 54 degrees of freedom (head x3, trunk x5, two arms x7 each, two hands and their fingers x16 each), allowing it to accurately mimic the operator’s movements as captured by a motion-capture system.
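For the curious, here’s a rough Python sketch of what flattening the operator’s captured pose into a 54-value command vector could look like, following the joint breakdown quoted above.  The group names, zero-padding, and clamping range are illustrative guesses, not Tachi Lab’s actual software.

```python
# Illustrative only: map per-group motion-capture angles onto a 54-DOF
# command vector matching the breakdown quoted above. Group names, padding,
# and the clamping range are assumptions, not TELESAR V's real interface.

DOF_LAYOUT = {
    "head": 3,
    "trunk": 5,
    "left_arm": 7,
    "right_arm": 7,
    "left_hand": 16,   # fingers included
    "right_hand": 16,
}

assert sum(DOF_LAYOUT.values()) == 54  # matches the total given above

def mimic(operator_angles: dict) -> list:
    """Flatten per-group joint angles (radians) into one command vector,
    padding missing groups with zeros and clamping to a made-up safe range."""
    command = []
    for group, size in DOF_LAYOUT.items():
        angles = operator_angles.get(group, [])
        angles = (list(angles) + [0.0] * size)[:size]   # pad/truncate to size
        command.extend(max(-1.57, min(1.57, a)) for a in angles)
    return command

# Example: only the head pose is available this frame.
print(len(mimic({"head": [0.1, -0.2, 0.0]})))  # -> 54
```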




Mind-Controlled Miniature Mechanical Manservants?

How would you like to control a robot using your thoughts alone?  A research program that brings together thirteen universities and institutes across Europe is using functional magnetic resonance imaging (fMRI) in an attempt to do just that.  By detecting changes in blood flow to various parts of the brain, the scanner can pick out thought patterns that are then used to control the robot.  It’s all part of the very sci-fi-sounding but very real Virtual Embodiment and Robotic Re-embodiment project.

So far the researchers can only decode enough signals to move a Fujitsu HOAP-3 (a miniature humanoid robot platform) left, right, and forward.  The robot is self-balancing and its walking gait is programmed in advance, so issuing a command is essentially like hitting a switch to trigger one of the three movements.  The system could be adapted to larger humanoids relatively easily.
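To give a sense of how simple the resulting control problem is, here’s a toy Python sketch: the decoder outputs one of three class indices, and each index just triggers a pre-programmed motion.  The class numbering and motion names are placeholders, not the project’s actual code.

```python
# Toy sketch: three decodable thought patterns map to three canned motions,
# essentially "hitting a switch" as described above. Class indices and motion
# names are placeholders.

from enum import Enum
from typing import Optional

class Motion(Enum):
    TURN_LEFT = "turn_left"
    TURN_RIGHT = "turn_right"
    WALK_FORWARD = "walk_forward"

CLASS_TO_MOTION = {0: Motion.TURN_LEFT, 1: Motion.TURN_RIGHT, 2: Motion.WALK_FORWARD}

def dispatch(decoded_class: int) -> Optional[str]:
    """Return the name of the pre-programmed motion to trigger, if any."""
    motion = CLASS_TO_MOTION.get(decoded_class)
    return motion.value if motion else None

print(dispatch(2))  # -> walk_forward
```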

The signals can be sent over long distances (in this case, from Israel to France) to control the robot with only minor lag.  In turn, the robot’s onboard camera sends first-person video back to its operator, giving them a telepresence (or avatar-like) experience.

AndyVision: Retail in the Year 2020?

AndyVision wanders through the aisles of a store

Researchers at the Intel Science and Technology Center (Carnegie Mellon University) are working on an inventory-assistance robot called AndyVision.  The robot would assist retailers by keeping track of product shelves, noting when items need to be restocked and when something has been misplaced.  It would also guide both staff and customers to specific items on demand using its map of the store and knowledge of its products.  According to Professor Priya Narasimhan, the robot’s computer vision system is better than using RFID tags, which have to be attached to products by hand and which have trouble with metallic shelving.
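As a back-of-the-envelope illustration of the bookkeeping such a robot would do, the Python sketch below compares what the vision system reports on a shelf against the expected layout and flags items to restock and items that don’t belong there.  It’s purely illustrative and has nothing to do with AndyVision’s actual pipeline.

```python
# Illustrative only: compare detected shelf contents against the expected
# layout (a planogram) to flag restocks and misplaced items. This is not
# AndyVision's actual algorithm.

def audit_shelf(expected: dict, detected: dict) -> dict:
    """expected/detected map item name -> count on this shelf."""
    restock = {item: want - detected.get(item, 0)
               for item, want in expected.items()
               if detected.get(item, 0) < want}
    misplaced = [item for item in detected if item not in expected]
    return {"restock": restock, "misplaced": misplaced}

# Example: the shelf should hold 6 boxes of cereal, the robot saw 2,
# plus a stray soda can that belongs elsewhere.
print(audit_shelf({"cereal": 6}, {"cereal": 2, "soda": 1}))
# -> {'restock': {'cereal': 4}, 'misplaced': ['soda']}
```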

By 2020, the group hopes to transform the retail experience with robots that can not only keep track of inventory, but also fold clothing, restock shelves, and help you carry your bags to the car.  The problem will be convincing stores to drop $20,000 to $30,000 (roughly what Mitsubishi’s Wakamaru, a comparable robot platform, retailed for) on something that may not always function as advertised.

Beijing Institute of Technology Unveils BHR-4 & BHR-5

A pair of BHR-5s can rally up to 200 times without error

Late last year, researchers at Zhejiang University in China unveiled a pair of ping-pong-playing robots (see here).  Last week, rival researchers at the Beijing Institute of Technology fought back with the fourth and fifth generations of their humanoid robots at the “National Hi-Tech Industrial Development Zone” exhibition.

The BHR-4 is a departure from the others: a realistic android that wears human clothing.  Modeled on one of the researchers, it has an animatronic face capable of expressing a variety of emotions on demand, including surprise, fear, and happiness.  Thanks to all the moving parts in its face (eyes, eyelids, eyebrows, mouth, and cheeks), the robot has a total of 43 degrees of freedom.  It stands 170 cm (5’7″) tall and weighs 65 kg (143 lbs).

And unlike the Geminoids built in Japan, this android has a fully-actuated body, allowing it to stand up and perform tai chi exercises.  It also participated in simple conversation with attendees thanks to speech recognition and speech synthesis.  When asked, “What do you like to eat?” it replied, “We robots do not need to eat.”

U of Arizona Researchers Build Bipedal Robot

Dr. M. Anthony Lewis, Director of the Robotics and Neural Systems Lab at the University of Arizona, and Theresa J. Klein (PhD student) have been working on a biarticulate muscle leg model.  In a paper published in 2008 (available at the lab’s website), they describe how motors pull on stiff, tendon-like Kevlar straps to reproduce the action of key muscle groups.
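The basic idea behind a biarticulate actuator is that a single tendon-like strap spans two joints, so one motor’s pull produces torque at both of them.  Here’s a minimal Python sketch of that relationship (torque = moment arm x tension); the moment-arm values are made up and are not the lab’s parameters.

```python
# Minimal sketch of a biarticular "muscle": one strap under tension spans two
# joints, contributing torque at each (torque = moment arm * tension).
# Moment arms below are invented, not taken from the Lewis/Klein paper.

def biarticular_torques(tension_N: float,
                        moment_arm_hip_m: float = 0.03,
                        moment_arm_knee_m: float = 0.02):
    """Torque (N*m) contributed at the hip and knee by a single strap."""
    return tension_N * moment_arm_hip_m, tension_N * moment_arm_knee_m

hip_tau, knee_tau = biarticular_torques(tension_N=100.0)
print(f"hip: {hip_tau:.1f} N*m, knee: {knee_tau:.1f} N*m")  # hip: 3.0, knee: 2.0
```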

Their new biped robot features an improved leg design that models even more muscles, and it’s already walking (though it relies on a baby-walker-like support for balance).  It stands 55 cm (22″) tall with the legs fully extended and weighs approximately 4.5 kg (10 lbs).

EveR-4 Appears at Expo 2012 Yeosu Korea

The Korea Institute of Industrial Technology’s (KITECH) Robotics Fusion Research Group has unveiled the fourth generation in its line of gynoids, known as EveR-4, at the robotics pavilion of Expo 2012 Yeosu Korea.  Dr. Dong-Wook Lee (39) and a colleague have been developing the female androids since 2005, and they suggest this version is capable of more realistic expressions thanks to a new artificial tongue.

EveR-4 stands 180 cm (5’11”) tall and was designed to serve as a receptionist, giving speeches and interacting with people.  It attempts to replicate the complex assortment of muscles in the human head with no fewer than 30 motors, which Dr. Lee says is a world record.  He admits that there isn’t much demand for androids at the moment, but says that could change as theme parks around the world adopt Disney’s automated approach.
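For context, a common way to drive an expressive face with that many motors is to store each expression as a vector of motor targets and interpolate between them on demand.  The Python sketch below assumes that approach; the motor ordering and preset values are invented, and KITECH’s actual control scheme may well differ.

```python
# Assumed approach, not KITECH's code: each expression is a vector of targets
# for the 30 face motors, and transitions are simple linear blends.

NUM_FACE_MOTORS = 30  # the count reported for EveR-4

EXPRESSIONS = {
    "neutral":  [0.0] * NUM_FACE_MOTORS,
    "smile":    [0.6 if i < 6 else 0.1 for i in range(NUM_FACE_MOTORS)],
    "surprise": [0.2 if i < 6 else 0.8 for i in range(NUM_FACE_MOTORS)],
}

def blend(a: str, b: str, t: float) -> list:
    """Linearly interpolate motor targets from expression a to b (0 <= t <= 1)."""
    return [(1 - t) * x + t * y for x, y in zip(EXPRESSIONS[a], EXPRESSIONS[b])]

half_smile = blend("neutral", "smile", 0.5)   # midway between the two presets
```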

The gynoid may also find work as a theater “actress” through synchronized voice performance, facial expressions, lip sync, and body gestures.  Previous models (and those built in Japan) are used to study not only human-robot interaction but also the convergence of technology with the arts and humanities.  However, like most other androids, it cannot stand up or walk under its own power, and its spoken lines would likely be performed by a human, since artificial speech still leaves something to be desired.  Currently, only the HRP-4C developed by AIST uses Vocaloid software to produce a synthesized singing voice.

A small selection of photos follows after the break.

[sources: Hankyung, Daum (KR)]

Photos: Robots from CIROS 2012

The China (Shanghai) International Robot Exhibition 2012 (CIROS 2012) got underway yesterday, and some photos from the event have been trickling onto the web.  Although most of the robots on the 20,000-square-meter show floor are industrial in nature, there are a few human-friendly examples on display.  Guangzhou CNC Equipment Co., Ltd. (GSK)’s industrial robot arm, for example, can be seen drawing pictures of pandas.  The so-called civilian area of the expo is proving much more popular, thanks to robots like Aldebaran Robotics’ NAO, MiniRobot’s Metal Fighter hobby kits, a Mars rover by Shanghai Jiaotong University, and Grandar Robotics’ Home Education Robot (photo below).

Unis is showing off the ILU-ROBO, a PaPeRo rip-off that reacts differently depending on where you touch it, thanks to eight sensors.  If you pat it hard on the head, the robot pleads, “Don’t bully me!”, and when touched elsewhere it explains matter-of-factly, “No need to scratch my itch”.
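The behavior described above is simple enough to sketch: the touched zone (and how firm the touch registers) picks one of the canned replies.  The zone names and pressure threshold below are guesses, not Unis’s firmware; only the two replies come from the demo.

```python
# Toy sketch of the touch behavior described above. Zone names and the
# pressure threshold are placeholders; only the two replies are quoted
# from the demo.

TOUCH_ZONES = ["head", "chin", "back", "belly",
               "left_arm", "right_arm", "left_side", "right_side"]  # eight zones, names assumed

def react(zone: str, pressure: float, hard_threshold: float = 0.7) -> str:
    """A hard pat on the head draws the complaint; anything else gets the
    matter-of-fact reply."""
    if zone == "head" and pressure >= hard_threshold:
        return "Don't bully me!"
    return "No need to scratch my itch"

print(react("head", 0.9))   # -> Don't bully me!
print(react("back", 0.3))   # -> No need to scratch my itch
```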

The robot (which is available in many bright colors) is intended for children 12 and younger and has a variety of functions including singing, dancing, storytelling, and English lessons.  According to the company the robot has sold approximately 10,000 units since it went on sale last year and is relatively cheap (at only 3,000 yuan [$475 USD]) thanks to the scale of its production.  It also helps that NEC did the design work and paid for the molds…

RT Corp Unveils RIC Ninja Master at Google I/O 2012

An interactive fighting game between human & robot

RT Corporation showcased a taller version of its tablet-powered “Robot Inside Character” at Google I/O 2012.  The RIC Ninja Master stands 120 cm (3 ft 11 in) tall including the tablet, which is 30 cm (about 1 ft) taller than the company’s flagship model, the RIC-90.  Visitors to the interactive demo booth can play a fighting game with the full-sized robot, which the company claims is a world first.

Two people control the robot: one operates the arms using an Xtion Pro Live motion sensor, while the other makes it walk using a game controller.  The Android-based tablet (which serves as the robot’s head and face) relies on the RT-ADK (a general-purpose I/O board specialized for the Android OS) and V-sido software to translate the players’ commands into actual movements.
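Conceptually, the control scheme boils down to merging two input streams into a single command: arm joint angles from the depth camera’s skeleton tracking, and a walk direction from the gamepad.  Here’s a hedged Python sketch of that merge step; it does not represent the real RT-ADK or V-sido interfaces, and the 14-joint arm count is just an assumption.

```python
# Hedged sketch of merging the two operators' inputs into one command.
# The RobotCommand fields and the 14-joint arm count are assumptions; the
# real RT-ADK / V-sido interfaces are not shown here.

from dataclasses import dataclass, field

@dataclass
class RobotCommand:
    arm_joint_angles: list = field(default_factory=lambda: [0.0] * 14)
    walk_direction: str = "stop"   # e.g. "forward", "back", "left", "right", "stop"

def merge_inputs(skeleton_arm_angles: list, gamepad_direction: str) -> RobotCommand:
    """Combine the arm operator's tracked pose with the walking operator's input."""
    return RobotCommand(arm_joint_angles=list(skeleton_arm_angles),
                        walk_direction=gamepad_direction)

cmd = merge_inputs([0.1] * 14, "forward")
print(cmd.walk_direction)   # -> forward
```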

A sensor on the robot’s body determines when it has taken a hit, and it wears padded gloves to avoid hurting people.  Perhaps the gloves contain a sensor to detect when it has scored a hit, but if it is as slow as earlier models, I doubt it is agile enough to land a punch.  Fortunately, RT Corp has published a video of its demonstration, so we can see how it really works.

Video: RT Corp’s demonstration of the RIC Ninja Master at Google I/O 2012 (YouTube embed).

I’m guessing the human participants went a little easy on it, because it looks like it wouldn’t take much to knock it over with one punch…

[source: RT-net (EN)] via [MyNavi (JP)]