New Haven, Conn. Brian Scassellati, associate professor of computer science at Yale, has received an A. Richard Newton Breakthrough Research Award from Microsoft to design programs that will allow robots to understand prosody, the rhythms, patterns, and intonation of speech, so that they can interact more effectively with humans.
As technology embraces so-called social robots, research is expanding in the field of human-robot interaction (HRI). To explore some of the challenges in realizing the potential of HRI, Microsoft Research launched the Robots Among Us initiative last October with the bold declaration: "The robots are coming!"
Scassellati will share $500,000 with seven other investigators worldwide who received awards for work on HRI. His grant will fund development of prosody-recognition software, which will allow untrained users to provide feedback to a robot in human-robot interactions.
"It's not what you say, it's how you say it," said Scassellati. "Vocal prosody is the information contained in your tone of voice that conveys affect. It is a critical part of human-human communication that we hope to translate to human-robot interactions."
Robots must be able to understand these indirect cues if they are going to have independent social interactions with humans, said Scassellati, who is studying how humans develop these skills for clues to how to recreate that ability in robots.
"Infants pay attention to the tone of your voice; your dog does it too; the words don't matter. It has to do with the pitch, the cadence, the tone," said Scassellati. Infants only a few days old can differentiate happy, consoling, and concerned voices regardless of the language being spoken. "This is an incredibly useful tool for knowing when we are doing something right or doing something wrong. It would be useful for robots as well."
"Few technologies currently exist to support this kind of robot recognition," said Avi Silberschatz, professor and chair of computer science at Yale. "From a computer science perspective, this work is a way to give robots social feedback. A robot or computer that can recognize human social behaviors will have the ability to alter its own behavior," he added.
"Prosody is language-independent, which is exactly what we in robotics want," said Scassellati. "We know that adults and dogs and infants all do this, although we don't know how the process works. But we can mathematically describe the process and program computers to do it. Partnering with Microsoft now allows us to build practical tools that anyone can use."
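The article does not describe Scassellati's software itself, but the kind of mathematical description he mentions typically begins with frame-level prosodic features such as pitch and loudness. The sketch below is a minimal, hypothetical illustration (all function names are my own, and it uses only NumPy): it splits a signal into short frames, measures each frame's energy, and estimates pitch by autocorrelation, the kind of features a prosody classifier might consume.

```python
import numpy as np

def frame_signal(x, frame_len, hop):
    """Split a 1-D signal into overlapping frames."""
    n_frames = 1 + (len(x) - frame_len) // hop
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n_frames)])

def rms_energy(frames):
    """Per-frame loudness (root-mean-square energy)."""
    return np.sqrt(np.mean(frames ** 2, axis=1))

def pitch_autocorr(frame, sr, fmin=80.0, fmax=400.0):
    """Estimate a frame's fundamental frequency via its autocorrelation peak,
    searching only lags that correspond to plausible speech pitch."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1 :]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + np.argmax(ac[lo:hi])
    return sr / lag

# Synthetic "utterance": a 200 Hz tone whose volume rises, standing in
# for one second of speech sampled at 16 kHz.
sr = 16000
t = np.arange(sr) / sr
signal = np.linspace(0.1, 1.0, sr) * np.sin(2 * np.pi * 200.0 * t)

frames = frame_signal(signal, frame_len=1024, hop=512)
energies = rms_energy(frames)
pitches = np.array([pitch_autocorr(f, sr) for f in frames])
```

On this synthetic input, the pitch estimates cluster around 200 Hz and the energy contour rises across the utterance; a real system would feed such contours (pitch, energy, and their dynamics over time) into a classifier trained to label affect such as approval or prohibition.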
The Microsoft HRI initiative focuses attention on the general shift from robots as tools to robots as social actors. It treats human-robot interaction as the next step for computing devices already deployed in the modern human environment, including PCs, smartphones, and the World Wide Web; robotic vacuum cleaners, for example, have already become a common household tool.
"We are excited to partner with Yale University so robot developers will be better able to build robots that interact with humans in real-world environments," said Sailesh Chutani, senior director, Microsoft Research. "Professor Scassellati's research aligns with Microsoft Research's commitment to innovative research that has the potential to solve some of today's most challenging societal concerns."
This initiative is the latest in a series of robotics-related investments that Microsoft is making, including the Institute for Personal Robots in Education and Microsoft Robotics Studio. The Robotics Studio is a software development kit the company has made available since 2006 at no cost for academic and non-commercial use. Scassellati's project will use and extend this technology.
The A. Richard Newton Breakthrough Research Award is a tribute to the vision and accomplishments of A. Richard Newton, professor and dean of the College of Engineering at the University of California, Berkeley, who died in January 2007. Further information on the Robots Among Us program and the request for proposals is available at http://www.microsoft.com/presspass/features/2008/may08/05-16HRI.mspx
Contact: Janet Rettig Emanuel