MIT’s Personal Robotics Group & Senseable City Lab are working with Audi on a project called The Sociable Car, which aims to expand the human-car relationship. Much like Nissan’s robotic agent Pivo-kun and Pioneer’s Driving Partner Robot, AIDA (Affective Intelligent Driving Agent) sits in the dashboard of your car and provides information about your surroundings. It has a high-resolution display that communicates through expressions and symbols. The robot gathers data from sensors located throughout the vehicle, such as its GPS location, and uses that data to learn your driving habits.
Unlike traditional navigation systems, which tend to provide information based solely on the desired destination, AIDA would act more like a driving companion. For example, if you tend to stop by the grocery store every Thursday after work, the robot could suggest a time-saving route. It also uses speech recognition and speech synthesis for natural language communication with the driver and passengers. The idea is to make the driving experience safer, more efficient, and more fun by keeping an eye on your behavioral and emotional state. If you get drowsy while driving, the robot might sound an alarm to wake you up. If you seem to be driving with a tinge of road rage, it might try to calm you down.
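To get a feel for the habit-learning idea, here is a minimal sketch (not AIDA's actual algorithm, whose details MIT hasn't published): log GPS-stamped trips keyed by day of the week, then suggest the destination you most often visit on that day. The `HabitModel` class and its threshold are purely illustrative.

```python
from collections import Counter
from datetime import datetime

class HabitModel:
    """Hypothetical habit learner: counts visits per (weekday, destination)."""

    def __init__(self):
        self.visits = Counter()  # (weekday, destination) -> visit count

    def record_trip(self, when: datetime, destination: str):
        # In a real system the destination would come from GPS clustering;
        # here we just take a labeled place name.
        self.visits[(when.weekday(), destination)] += 1

    def suggest_stop(self, when: datetime, min_visits: int = 3):
        # Suggest the destination most often visited on this weekday,
        # but only once the habit looks established.
        candidates = {
            dest: n for (day, dest), n in self.visits.items()
            if day == when.weekday() and n >= min_visits
        }
        if not candidates:
            return None
        return max(candidates, key=candidates.get)

model = HabitModel()
for week in range(4):  # four Thursday-evening grocery runs
    model.record_trip(datetime(2009, 10, 1 + 7 * week, 18, 30), "grocery store")

print(model.suggest_stop(datetime(2009, 10, 29, 18, 0)))  # -> grocery store
```

A navigation layer could then fold the suggested stop into the route before the driver even asks, which is the "companion" behavior described above.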
The current prototype is designed to sit in the dashboard of your car, but I think it might obstruct your view or become distracting in its current form, which consists of a blinky head attached to a long neck. Hopefully AIDA helps avoid more accidents than it causes! And if you thought people loved their cars before, just wait until their cars can hold a conversation with them. Videos after the break.
MIT Senseable City Lab | MIT Personal Robotics Group