How to Make a Robot Smile? Perception of Emotional Expressions from Digitally-Extracted Facial Landmark Configurations

  • Authors:
  • Caixia Liu; Jaap Ham; Eric Postma; Cees Midden; Bart Joosten; Martijn Goudbeek

  • Affiliations:
  • Human-Technology Interaction Group, Department of Industrial Engineering and Innovation Sciences, Eindhoven University of Technology, Eindhoven, The Netherlands (Liu, Ham, Midden)
  • Tilburg Center for Cognition and Communication, Tilburg University, Tilburg, The Netherlands (Liu, Postma, Joosten, Goudbeek)

  • Venue:
  • ICSR'12: Proceedings of the 4th International Conference on Social Robotics
  • Year:
  • 2012


Abstract

To design robots or embodied conversational agents that can accurately display facial expressions indicating an emotional state, we need both the technology to produce those facial expressions and research into how humans socially perceive such artificial faces. Our starting point is human perception of core facial information: moving dots representing the facial landmarks, i.e., the locations and movements of the crucial parts of a face. Earlier research suggested that participants can identify facial expressions relatively accurately when all they can see of a real human face is moving white painted dots marking the facial landmarks (although less accurately than when viewing full faces). In the current study we investigated the accuracy of recognizing emotions expressed by comparable facial landmarks (relative to the accuracy of recognizing emotions expressed by full faces), but now used face-tracking software to extract the facial landmarks. In line with earlier findings, the results suggest that participants could accurately identify emotions expressed by the facial landmarks alone (though less accurately than those expressed by full faces). These results thereby provide a starting point for further research on the fundamental characteristics of technologies (AI methods) that produce facial emotional expressions, and on their evaluation by human users.
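
The stimulus-generation step the abstract describes, replacing a full face with moving dots at digitally extracted landmark positions, can be sketched roughly as follows. This is not the authors' pipeline: the paper does not name its face tracker, so the sketch assumes dlib's standard 68-point shape predictor (with its pretrained model file available locally) and a hypothetical input clip expression_clip.mp4.

    # Minimal sketch: extract facial landmarks per video frame and render
    # them as moving white dots on a black background, mimicking the
    # point-light-style stimuli described in the abstract.
    import cv2
    import dlib
    import numpy as np

    detector = dlib.get_frontal_face_detector()
    # Pretrained 68-point model (assumed local path; distributed separately).
    predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

    def landmarks_to_dots(frame_bgr):
        """Return a black frame showing only white dots at detected landmarks."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        dots = np.zeros_like(frame_bgr)
        for face in detector(gray):
            shape = predictor(gray, face)
            for i in range(shape.num_parts):
                p = shape.part(i)
                cv2.circle(dots, (p.x, p.y), radius=3,
                           color=(255, 255, 255), thickness=-1)
        return dots

    # Usage: convert a clip of a full-face expression into a dot stimulus.
    cap = cv2.VideoCapture("expression_clip.mp4")  # hypothetical input file
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("landmark dots", landmarks_to_dots(frame))
        if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()

Rendering only the tracked dot configuration, rather than overlaying dots on the face, is what isolates landmark motion as the sole cue available to participants.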