Automatic natural expression recognition using head movement and skin color features

  • Authors:
  • Hamed Monkaresi; Rafael A. Calvo; M. S. Hussain

  • Affiliations:
  • University of Sydney, Australia (all authors)

  • Venue:
  • Proceedings of the International Working Conference on Advanced Visual Interfaces
  • Year:
  • 2012


Abstract

Significant progress has been made in automatic facial expression recognition, yet most state-of-the-art approaches achieve substantially better reliability on acted expressions than on natural ones. User interfaces that rely on facial expressions to understand users' affective states need to be most accurate during naturalistic interactions. This paper presents a study in which head movement features are used to recognize naturalistic expressions of affect. The International Affective Picture System (IAPS) collection was used as stimuli for triggering different affective states. Machine learning techniques are applied to classify users' expressions based on their head position and skin color changes. The proposed approach shows reasonable accuracy in detecting three levels of valence and arousal with a user-dependent model during naturalistic human-computer interaction.
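To illustrate the kind of user-dependent classification the abstract describes, the sketch below trains a simple k-nearest-neighbour classifier that maps per-window head-movement features to one of three valence levels. This is a minimal illustration, not the authors' implementation: the feature names (head velocity, head displacement), the synthetic training data, and the choice of k-NN are all assumptions made for the example.

```python
import math
import random

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs for ONE user
    (a user-dependent model). Returns the majority label among the
    k training samples nearest to the query window."""
    neighbours = sorted(train, key=lambda t: euclidean(t[0], query))[:k]
    labels = [label for _, label in neighbours]
    return max(set(labels), key=labels.count)

# Synthetic per-user training windows: [head_velocity, head_displacement]
# grouped around three hypothetical cluster centres, one per valence level.
random.seed(0)
train = []
for label, centre in [("low", 0.1), ("medium", 0.5), ("high", 0.9)]:
    for _ in range(20):
        features = [random.gauss(centre, 0.05), random.gauss(centre, 0.05)]
        train.append((features, label))

# Classify a new window of head-movement features for the same user.
print(knn_predict(train, [0.12, 0.10]))
```

In a real pipeline the feature vectors would come from head tracking and skin-color analysis of video frames, and a separate model would be trained per user, since the abstract reports results for a user-dependent model.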