Expressing Complex Mental States Through Facial Expressions

  • Authors:
  • Xueni Pan; Marco Gillies; Tevfik Metin Sezgin; Celine Loscos

  • Affiliations:
  • Department of Computer Science, University College London, London, UK (Pan, Gillies, Loscos); Department of Computer Science, Cambridge University, Cambridge, UK (Sezgin); Departament d'Informàtica i Matemàtica Aplicada, Universitat de Girona, Spain (Loscos)

  • Venue:
  • ACII '07 Proceedings of the 2nd international conference on Affective Computing and Intelligent Interaction
  • Year:
  • 2007



Abstract

A face is capable of producing about twenty thousand different facial expressions [2]. Many researchers working on virtual characters have restricted themselves to a limited set of emotional facial expressions, the so-called basic emotions, which are universally recognized. These basic emotions have been well studied since 1969 and employed in many applications [3]. However, real-life communication usually involves more complex mental states: states like "convinced", "persuaded" and "bored" are difficult to describe adequately in terms of basic emotions. Since our daily face-to-face interaction is accompanied by such complex mental states, an empathic animation system should support them. Compared to basic emotions, complex mental states are harder to model because they require knowledge of temporal changes in facial displays and head movements, as opposed to a static snapshot of the facial expression. We address this by building animation models for complex mental states from video clips of professional actors displaying them.
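
The distinction the abstract draws, a static snapshot versus a timed sequence of facial displays and head movements, can be illustrated with a minimal sketch. The data structures, action-unit names, and keyframe values below are hypothetical illustrations of the idea, not the authors' actual model.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class FacePose:
    """A static facial configuration: action-unit activations plus head rotation."""
    action_units: Dict[str, float]  # e.g. {"AU1": 0.6, "AU12": 0.3}, values in [0, 1]
    head_pitch: float               # head rotation in degrees
    head_yaw: float

@dataclass
class Keyframe:
    time: float                     # seconds from the start of the clip
    pose: FacePose

# A basic emotion can be approximated by a single static snapshot...
HAPPY_SNAPSHOT = FacePose({"AU6": 0.8, "AU12": 0.9}, head_pitch=0.0, head_yaw=0.0)

# ...whereas a complex mental state such as "convinced" would be modelled as a
# timed pose sequence, e.g. traced from an actor's video clip (values invented).
CONVINCED_CLIP: List[Keyframe] = [
    Keyframe(0.0, FacePose({"AU1": 0.2, "AU12": 0.1}, head_pitch=0.0, head_yaw=0.0)),
    Keyframe(0.4, FacePose({"AU1": 0.5, "AU12": 0.3}, head_pitch=-5.0, head_yaw=0.0)),  # nod begins
    Keyframe(0.9, FacePose({"AU1": 0.4, "AU12": 0.5}, head_pitch=3.0, head_yaw=0.0)),   # nod returns, smile grows
]

def sample(clip: List[Keyframe], t: float) -> FacePose:
    """Linearly interpolate the clip at time t to drive a character rig."""
    if t <= clip[0].time:
        return clip[0].pose
    for a, b in zip(clip, clip[1:]):
        if a.time <= t <= b.time:
            w = (t - a.time) / (b.time - a.time)
            aus = {k: (1 - w) * a.pose.action_units.get(k, 0.0)
                      + w * b.pose.action_units.get(k, 0.0)
                   for k in set(a.pose.action_units) | set(b.pose.action_units)}
            return FacePose(aus,
                            (1 - w) * a.pose.head_pitch + w * b.pose.head_pitch,
                            (1 - w) * a.pose.head_yaw + w * b.pose.head_yaw)
    return clip[-1].pose

print(sample(CONVINCED_CLIP, 0.6).head_pitch)  # head mid-nod at t = 0.6 s
```

The point of the sketch is that the snapshot carries no timing information at all, while the clip-based representation makes the temporal dynamics (here, a head nod unfolding over 0.9 s) an explicit part of the model, which is what the paper argues complex mental states require.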