MIT, Xitome Design, and UMass Amherst are collaborating to develop a robot that combines novel mobility, “moderate dexterity, and human-centric communication and interaction abilities.” They refer to this class of robots as “MDS,” for Mobile/Dexterous/Social.
The purpose of this platform is to support research and education goals in human-robot interaction, teaming, and social learning. Thus, it will be designed with holistic consideration for interacting with humans, from its expressive face down to its small footprint.
The face has 15 degrees of freedom, with actuators that control its gaze, eyebrows, eyelids, and mandible. The computer renderings on the MIT site depict facial expressions that are very convincing in communicating “emotional” sentiment or, more likely, other non-verbal cues. The expressions are far more communicative than semicolons and parentheses. :)
The head has perceptual sensors such as an Active IR CCD camera, four microphones for sound localization, and a speaker for speech synthesis.
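The post doesn’t say how the four microphones are used for localization, but a common approach with a microphone array is time-difference-of-arrival (TDOA) estimation: cross-correlating pairs of signals to find the delay between them. A minimal two-microphone sketch (my assumption, not the MDS implementation):

```python
import numpy as np

def tdoa_delay(mic_a, mic_b, sample_rate):
    """Estimate how many seconds mic_a's signal lags mic_b's,
    using the peak of their cross-correlation."""
    corr = np.correlate(mic_a, mic_b, mode="full")
    lag = int(np.argmax(corr)) - (len(mic_b) - 1)  # best-aligning lag, in samples
    return lag / sample_rate

# Toy usage: the same click reaches mic_a 5 samples after mic_b.
rate = 16000.0
click = np.zeros(64); click[10] = 1.0
delayed = np.zeros(64); delayed[15] = 1.0
print(tdoa_delay(delayed, click, rate))  # 0.0003125 s
```

With four microphones there are several such pairwise delays, and together (given the array geometry and the speed of sound) they constrain the direction the sound came from.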
The torso sits on a mobile balancing platform like a miniaturized Segway. It’s based on the uBot5 mobile manipulator developed by the Laboratory for Perceptual Robotics at UMass Amherst (directed by Rod Grupen). Because its footprint is roughly that of a small child, many of these robots will be able to work together within the lab and, presumably, interact within crowds unobtrusively.
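A balancing base of this kind is essentially an inverted pendulum: the wheels must accelerate under the body whenever it starts to tip. The uBot5’s actual controller isn’t described here, but a toy simulation of the idea, with assumed gains and a linearized pendulum model, looks like this:

```python
# Sketch of two-wheeled balancing (assumed PD gains, not the uBot5's controller):
# a feedback loop accelerates the wheels in proportion to body tilt and tilt rate,
# simulated against a linearized inverted-pendulum model.

DT = 0.002              # control period, seconds
KP, KD = 60.0, 8.0      # hand-tuned proportional/derivative gains (assumed)
G_OVER_L = 9.81 / 0.5   # gravity / effective pendulum length

tilt, tilt_rate = 0.1, 0.0  # start 0.1 rad off vertical
for _ in range(5000):       # 10 seconds of simulated time
    accel = KP * tilt + KD * tilt_rate      # wheel acceleration opposing the lean
    tilt_accel = G_OVER_L * tilt - accel    # gravity tips it; base motion rights it
    tilt_rate += tilt_accel * DT
    tilt += tilt_rate * DT

print(abs(tilt) < 0.01)  # True: the loop has driven the tilt back near vertical
```

The interesting design constraint for a social robot is that this loop never stops: the platform holds its balance dynamically even while “standing still,” which is what lets the whole machine stay small enough to share space with people.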
Dr. Rodney Brooks wrote a paper in 1991 called Intelligence Without Reason that describes “situatedness” as a requirement for intelligence to emerge in a robot. He defined it as:
[being] situated in the world–they do not deal with abstract descriptions but with the here and now of the world directly influencing the behavior of the system. –Rodney Brooks, Intelligence Without Reason
Transferring knowledge from humans to robots through social interaction is called Socially Situated Robot Learning (SSRL), and it is one of the goals of this MDS project.
This project is funded in part by an ONR DURIP Award, ‘Mobile, Dexterous, Social Robots to Support Complex Human-Robot Teamwork in Uncertain Environments’, Award Number N00014-06-0516. It is also funded in part by a Microsoft Research grant.
MIT says completion is targeted for fall 2007. We’ll be watching.