Facial feature tracking for emotional dynamic analysis

  • Authors:
  • Thibaud Senechal, Vincent Rapp, Lionel Prevost

  • Affiliations:
  • ISIR, CNRS UMR 7222, Univ. Pierre et Marie Curie, Paris (T. Senechal, V. Rapp); LAMIA, Univ. of the French West Indies & Guyana (L. Prevost)

  • Venue:
  • ACIVS'11 Proceedings of the 13th international conference on Advanced concepts for intelligent vision systems
  • Year:
  • 2011

Abstract

This article presents a feature-based framework to automatically track 18 facial landmarks for emotion recognition and emotional dynamic analysis. Using multi-kernel learning in a new way, we combine two methods: the first matches facial feature points between consecutive images, and the second relies on offline learning of facial landmark appearance. Point matching yields jitter-free tracking, while the offline-learned appearance model prevents the tracker from drifting. We train the tracking system on the Cohn-Kanade database and analyze the dynamics of emotions and Action Units on sequences from the MMI database. We accurately detect the temporal segments of facial expressions and report experimental results.
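The combination described in the abstract can be illustrated with a minimal sketch: each candidate landmark patch is scored by a weighted sum of two kernels, one measuring similarity to the patch from the previous frame (the matching term) and one measuring similarity to an offline-learned appearance template (the anti-drift term). The kernel choice, feature representation, and weights `beta` here are hypothetical stand-ins; in the paper the kernel weights are learned via multi-kernel learning.

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    # RBF (Gaussian) kernel between two feature vectors
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def track_landmark(candidates, prev_patch, appearance_template, beta=(0.6, 0.4)):
    # Score each candidate patch with a weighted sum of two kernels:
    #   1) similarity to the patch around the landmark in the previous
    #      frame (matching term -> jitter-free tracking)
    #   2) similarity to an offline-learned appearance template
    #      (appearance term -> prevents drift)
    # `beta` is a placeholder for the kernel weights learned by MKL.
    scores = [
        beta[0] * rbf_kernel(p, prev_patch)
        + beta[1] * rbf_kernel(p, appearance_template)
        for p in candidates
    ]
    # Return the index of the best-scoring candidate position
    return scores.index(max(scores))
```

A candidate that agrees with both the previous frame and the learned appearance model dominates the score, which is the intended behavior of combining the two cues.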