Contacts: Laurel Riek, Cambridge Computer Lab
Mentors: Dr. Nicky Athanassopoulou, Institute for Manufacturing & Simon C. R. Lewis
In the future, robots may become as ubiquitous as mobile phones. Such robots are expected to serve as home health aides and companions, assist with housework, and provide entertainment and education to their users. Before these uses can be realized, however, people must be able to interact with robots in ways that feel comfortable and natural to them.
Researchers at the Cambridge Computer Lab are developing robots capable of sensing, understanding, and generating the full range of human communication, with a particular focus on non-verbal behavior. The main goal of this research is to build machines that are easier to use and interact with.
The lab has several humanoid and zoomorphic robots, including Charles, a highly realistic humanoid robot head. The team has developed a system that allows Charles to reproduce human facial movements: it records facial movements on video, analyzes the footage to track 22 key points on the face, and then translates these into motor movements on the robot, giving Charles very lifelike motion.
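The landmark-to-motor stage of such a pipeline can be sketched as a simple mapping from key-point displacements to clamped motor positions. This is only an illustrative sketch under assumed conventions; the function name, per-landmark gains, motor range, and the choice of which indices correspond to which facial features are all hypothetical, not the lab's actual implementation.

```python
# Hypothetical sketch of the landmark-to-motor step described above.
# All names, gains, and index assignments are illustrative assumptions.

NUM_LANDMARKS = 22  # key points tracked on the face, as in the text


def landmarks_to_motor_commands(neutral, current, gains, limits=(0.0, 1.0)):
    """Map tracked facial landmark displacements to motor positions.

    neutral, current: lists of (x, y) coordinates, one pair per landmark.
    gains: per-landmark scale factors converting displacement to motor travel.
    limits: clamp range for each motor command.
    """
    lo, hi = limits
    commands = []
    for (nx, ny), (cx, cy), g in zip(neutral, current, gains):
        # Euclidean displacement of this key point from the neutral face
        disp = ((cx - nx) ** 2 + (cy - ny) ** 2) ** 0.5
        commands.append(max(lo, min(hi, g * disp)))
    return commands


if __name__ == "__main__":
    neutral = [(0.0, 0.0)] * NUM_LANDMARKS
    # Simulate a smile: two (hypothetical) mouth-corner landmarks move
    current = [(0.0, 0.0)] * 20 + [(3.0, 4.0), (-3.0, 4.0)]
    gains = [0.1] * NUM_LANDMARKS
    cmds = landmarks_to_motor_commands(neutral, current, gains)
    print(cmds[20], cmds[21])  # displacement 5.0 * gain 0.1 -> 0.5 0.5
```

A real system would run this per video frame and smooth the commands over time so the robot's servos are not driven by tracking jitter.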
The goal of the i-Teams project is to identify realistic short-term uses for an expressive robot like Charles, focusing initially on its potential in medical training. One promising angle is to use the robot as a highly realistic patient simulator, able to recreate a variety of conditions and behaviors and to support real-time interaction; such a simulator could be used to train student clinicians. Although clinical students currently meet sample patients and use (inexpressive) medical mannequins throughout their training, they cannot see every condition before they qualify. The challenge for the i-Team is to identify the areas where the robot can bring the greatest value, by discussing the system with medical professionals, lecturers, and students.