The TELESAR robots developed at Keio University’s Tachi Lab (see versions 1 and 2) are fairly unusual even by telepresence robot standards. Unlike its predecessors, this version uses the lab’s sophisticated cylindrical display system, called TWISTER, to envelop the robot operator in a three-dimensional video screen. The video displayed inside TWISTER comes from a camera system called VORTEX mounted on top of the robot, which provides the operator with a live 360-degree video feed of the robot’s surroundings in the remote environment. Short of a head-mounted display, this is probably the most immersive telepresence interface developed to date.
The operator controls the robot’s arm through a motion-capture setup, while the hand is controlled using a data glove. This allows the operator to perform hand gestures, shake hands, or interact with objects. The operator’s other hand drives the robot around using a joystick.
To make things even weirder, the robot is covered with retro-reflective material that displays a projected image visible only through a special lens. So, to see the person controlling the robot, you have to wear a head-mounted projector with glasses, which projects a live video feed of the operator onto the robot. They’ve even worked it out so that as you move around the robot, your changing view of the operator remains accurate. To make sense of all this, it’s probably easiest to just watch the video from IROS 2011.
As cool as the retro-reflective material and video projection are (they allow multiple people to see the robot operator from different angles simultaneously), it seems as though they’ll have to adopt a proper 3D display for the robot’s head as the project develops. It’s simply too cumbersome to have to wear a head-mounted projector with glasses if their aim is to provide smooth, natural interaction with bystanders. This particular video is already outdated, as Tachi Lab presented the TELESAR5 at the Haptic Media symposium held October 7–11, 2011.