An integrated approach to emotional speech and gesture synthesis in humanoid robots

  • Authors: Philipp Robbel, Mohammed E. Hoque, Cynthia Breazeal
  • Affiliation: MIT Media Laboratory, Cambridge, MA (all authors)
  • Venue: Proceedings of the International Workshop on Affective-Aware Virtual Agents and Social Robots
  • Year: 2009

Abstract

This paper describes an integrated approach to recognizing and generating affect on a humanoid robot as it interacts with a human user. We describe a method for detecting basic affect signals in the user's speech input and for generating appropriately chosen responses on our robot platform. Responses are selected both for their content and for the emotional quality of the robot's voice. Additionally, we synthesize gestures and facial expressions on the robot that reinforce its conveyed emotional state. The guiding principle of our work is that adding the ability to detect and display emotion to physical agents enables their effective use in novel application areas such as child and elderly care, healthcare, education, and beyond.
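
The abstract outlines a pipeline: detect affect in the user's speech, select a response matched in content and vocal emotion, and pair it with a gesture and facial expression that amplify that emotion. The following Python sketch illustrates such a flow under loud assumptions: the prosody thresholds, the response table, and every function and field name here are hypothetical illustrations, not the authors' implementation.

    # Illustrative sketch only; thresholds and names are invented, not from the paper.
    from dataclasses import dataclass
    from enum import Enum


    class Affect(Enum):
        POSITIVE = "positive"
        NEGATIVE = "negative"
        NEUTRAL = "neutral"


    @dataclass
    class RobotResponse:
        text: str              # spoken content of the reply
        voice_emotion: Affect  # emotional quality of the synthesized voice
        gesture: str           # body gesture that magnifies the emotion
        expression: str        # facial expression shown alongside speech


    def classify_affect(pitch_mean_hz: float, energy_rms: float) -> Affect:
        """Toy prosody-based affect detector (thresholds are made up)."""
        if pitch_mean_hz > 220 and energy_rms > 0.6:
            return Affect.POSITIVE
        if pitch_mean_hz < 140 and energy_rms < 0.3:
            return Affect.NEGATIVE
        return Affect.NEUTRAL


    # Each detected user affect maps to one multimodal robot response.
    RESPONSES = {
        Affect.POSITIVE: RobotResponse(
            "That's wonderful to hear!", Affect.POSITIVE, "open_arms", "smile"),
        Affect.NEGATIVE: RobotResponse(
            "I'm sorry, that sounds hard.", Affect.NEGATIVE, "lean_in", "concern"),
        Affect.NEUTRAL: RobotResponse(
            "Tell me more about that.", Affect.NEUTRAL, "head_tilt", "attentive"),
    }


    def respond(pitch_mean_hz: float, energy_rms: float) -> RobotResponse:
        """Map detected user affect to a matched speech/gesture/expression triple."""
        return RESPONSES[classify_affect(pitch_mean_hz, energy_rms)]


    if __name__ == "__main__":
        r = respond(pitch_mean_hz=250.0, energy_rms=0.8)
        print(r.text, r.voice_emotion.value, r.gesture, r.expression)

Running the module prints the selected utterance, voice emotion, gesture, and facial expression for one set of made-up prosodic features; a real system would replace the threshold classifier with a trained model over speech features.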