Facial emotion recognition with expression energy

  • Authors:
  • Albert C. Cruz, Bir Bhanu, Ninad Thakoor

  • Affiliation:
  • University of California, Riverside, Riverside, CA, USA

  • Venue:
  • Proceedings of the 14th ACM international conference on Multimodal interaction
  • Year:
  • 2012

Abstract

Facial emotion recognition, the inference of an emotion from apparent facial expressions, is a task on which algorithms typically perform poorly in unconstrained settings. A property of the AVEC2012 data set is that individuals in the testing data do not appear in the training data. In this situation, conventional approaches suffer because models learned from the training data cannot properly discriminate unseen testing samples; information beyond the feature vectors is required for successful detection of emotions. We propose two similarity metrics that address these problems: neutral similarity, which measures the intensity of an expression, and temporal similarity, which measures changes in an expression over time. These similarities are taken to be the energy of facial expressions, measured with a SIFT-based warping process. Our method improves correlation by 35.5% over the baseline approach on the frame-level sub-challenge.
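The paper itself does not give formulas in the abstract, but the two metrics can be illustrated with a minimal sketch. Assume a SIFT-based warping step has already produced matched keypoint coordinates between a neutral reference frame and each expression frame; the function names and the choice of mean keypoint displacement as the "energy" are illustrative assumptions, not the authors' exact formulation.

```python
import math

def expression_energy(neutral_pts, expr_pts):
    """Hypothetical neutral similarity: mean displacement magnitude of
    keypoints matched (e.g., via SIFT) between a neutral frame and an
    expression frame. Larger energy = more intense expression."""
    assert len(neutral_pts) == len(expr_pts) > 0
    return sum(math.dist(p, q) for p, q in zip(neutral_pts, expr_pts)) / len(neutral_pts)

def temporal_similarity(energies):
    """Hypothetical temporal similarity: frame-to-frame change in
    expression energy over a video sequence."""
    return [b - a for a, b in zip(energies, energies[1:])]

# Toy example: two matched keypoints displaced by 3 and 4 pixels.
e = expression_energy([(0, 0), (1, 0)], [(0, 3), (1, 4)])   # 3.5
deltas = temporal_similarity([0.0, 3.5, 1.5])               # [3.5, -2.0]
```

A real pipeline would obtain the keypoint correspondences from a SIFT matcher and warp the expression frame toward the neutral face before measuring displacements.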