Listening to sad music while seeing a happy robot face

  • Authors:
  • Jiaming Zhang; Amanda J. C. Sharkey

  • Affiliations:
  • Neurocomputing and Robotics Group, Department of Computer Science, University of Sheffield, Sheffield, UK (both authors)

  • Venue:
  • ICSR'11: Proceedings of the Third International Conference on Social Robotics
  • Year:
  • 2011

Abstract

Researchers have shown that it is possible to develop robots that can produce recognizable emotional facial expressions [1, 2]. However, although human emotional expressions are known to be influenced by the surrounding context [7], there has been little research into the effect of context on the recognition of robot emotional expressions. The experiment reported here demonstrates that classical music can affect judgments of a robot's emotional facial expressions, with different judgments made depending on whether the music was emotionally congruent or incongruent with the robot's expressions. A robot head produced sequences of expressions designed to convey positive or negative emotions. The expressions were more likely to be recognized as intended when they were accompanied by music of a similar valence. Interestingly, the robot's face also influenced judgments about the classical music. Design implications for believable emotional robots are drawn.