CEATEC JAPAN 2010 got underway today, and the first footage of the robots on display is beginning to filter out. Astute readers may remember that at last year’s CEATEC the HRP-4C dressed up as Hatsune Miku and was supposed to appear in a full-body dance routine called the “Dancing Robot Project”. Well, it appears they’re still ironing out the details, because the HRP-4C’s latest performance is again limited to its upper body. That said, I think the head and mouth movements are a little more convincing this year.
In its “3-D Theater”, Nissan is playing a conceptual animation showing what we can expect from the coming electric vehicle revolution. Part of the video shows a solar “tree”, which is also being demonstrated with the help of Nissan’s EPORO robots.
Here is what one can expect to see in the 3-D Theater (the video is narrated in English). It begins with smartphone alerts that let you know when your car is charged, followed by a charging station powered entirely by solar panels. The video then suggests that by 2020, cars will park themselves in special parking spots outfitted with non-contact power transmission. On a rainy day, parked cars would feed electricity back into the building’s power grid to keep it running smoothly, while cars close to empty would siphon energy from the same grid. By 2030, roads would have non-contact power transmission lanes that recharge cars wirelessly while driving (this reminded me of futuristic racing games). Solar panels, solar tree forests, and wind turbines would provide zero-emission electricity.
It would be cool if we could achieve even 50% of what this video presents. Given the ever-increasing gains made by solar panel technology and autonomous vehicles, I think the future shown in the video is entirely feasible. The real challenge will be integrating this technology into the existing infrastructure in a timely manner.
Fujitsu is showing its therapeutic robot bear. The bear has a camera in its nose that detects your facial expressions and gestures and responds with one of some 300 actions. The robot’s ears fold, its eyes blink, and its mouth opens and closes. Its head can tilt, nod, and turn, and its arms and legs wiggle. It also has 13 skin sensors, grip sensors in each hand, a tilt sensor, a microphone, and a speaker.