Emotional facial expression classification for multimodal user interfaces

  • Authors:
  • Eva Cerezo; Isabelle Hupont

  • Affiliations:
  • Departamento de Informática e Ingeniería de Sistemas, Universidad de Zaragoza, Zaragoza, Spain (both authors)

  • Venue:
  • AMDO'06 Proceedings of the 4th international conference on Articulated Motion and Deformable Objects
  • Year:
  • 2006

Abstract

We present a simple, computationally feasible method for the automatic emotional classification of facial expressions. We propose using 10 characteristic points (a subset of the MPEG-4 feature points) to extract relevant emotional information: essentially five distances, the presence of wrinkles, and the mouth shape. The method defines and detects the six basic emotions (plus the neutral one) in terms of this information and has been fine-tuned on a database of 399 images. At present the method is applied to static images; its application to sequences is now being developed. Extracting such information about the user is of great interest for the development of new multimodal user interfaces.
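
The abstract describes a rule-based mapping from a few facial measurements (distances, wrinkles, mouth shape) to the six basic emotions plus neutral. The sketch below illustrates what such a threshold-rule classifier could look like; the feature names, normalization, thresholds, and rules are illustrative assumptions, not the authors' actual values or rules.

```python
# Hypothetical sketch of a distance-based emotion classifier in the
# spirit of the abstract. All thresholds and rules are assumptions
# for illustration only; distances are assumed normalized so that a
# neutral face yields values near 1.0.

from dataclasses import dataclass

@dataclass
class FaceFeatures:
    """Emotional information extracted from 10 MPEG-4-style feature points."""
    eyebrow_to_eye: float        # D1: eyebrow-to-eye distance
    eye_opening: float           # D2: vertical eye opening
    mouth_opening: float         # D3: vertical mouth opening
    mouth_width: float           # D4: horizontal mouth width
    mouth_corner_height: float   # D5: mouth-corner height vs. mouth center
    has_forehead_wrinkles: bool  # presence of forehead wrinkles

def classify(f: FaceFeatures) -> str:
    """Map the extracted features to one of the six basic emotions
    (or neutral) via simple, ordered threshold rules."""
    if f.has_forehead_wrinkles and f.eye_opening > 1.2 and f.mouth_opening > 1.3:
        return "surprise"   # raised brows (wrinkles), wide eyes, open mouth
    if f.mouth_corner_height > 1.1 and f.mouth_width > 1.1:
        return "joy"        # raised mouth corners, widened mouth
    if f.mouth_corner_height < 0.9 and f.eyebrow_to_eye < 0.95:
        return "sadness"    # lowered mouth corners, lowered brows
    if f.eyebrow_to_eye < 0.9 and f.eye_opening > 1.1:
        return "anger"      # lowered brows with widened eyes
    if f.eyebrow_to_eye > 1.05 and f.eye_opening > 1.15 and f.mouth_width < 0.95:
        return "fear"       # raised brows, wide eyes, narrowed mouth
    if f.mouth_width < 0.9 and f.eyebrow_to_eye < 0.95 and not f.has_forehead_wrinkles:
        return "disgust"    # narrowed mouth, slightly lowered brows
    return "neutral"        # no rule fired
```

Because every rule is a cheap comparison on five distances and one boolean, a classifier of this shape runs per-frame with negligible cost, which matches the paper's emphasis on computational feasibility for interactive interfaces.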