Facial asymmetry quantification for expression invariant human identification

  • Authors:
  • Yanxi Liu, Karen L. Schmidt, Jeffrey F. Cohn, Sinjini Mitra

  • Affiliations:
  • The Robotics Institute, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA
  • Department of Psychology, University of Pittsburgh, Pittsburgh, PA
  • The Robotics Institute, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA, and Department of Psychology, University of Pittsburgh, Pittsburgh, PA
  • Statistics Department, Carnegie Mellon University, Pittsburgh, PA

  • Venue:
  • Computer Vision and Image Understanding - Special issue on Face recognition
  • Year:
  • 2003

Abstract

We investigate facial asymmetry as a biometric under expression variation. For the first time, we define two types of quantified facial asymmetry measures that are easily computable from facial images and videos. Our findings show that the asymmetry measures of automatically selected facial regions capture individual differences that are relatively stable under facial expression variation. More importantly, a synergy is achieved by combining facial asymmetry information with conventional EigenFace and FisherFace methods. We have assessed the generality of these findings across two publicly available face databases: using a random subset of 110 subjects from the FERET database, a 38% classification error reduction rate is obtained, and error reduction rates of 45-100% are achieved on 55 subjects from the Cohn-Kanade AU-Coded Facial Expression Database. These results suggest that facial asymmetry provides discriminative information that has been missing from automatic human identification methods.
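The abstract does not spell out the asymmetry measures themselves, but a reflection-based density difference is one natural way to quantify facial asymmetry from a normalized image. Below is a minimal sketch in Python/NumPy under that assumption; the face is assumed pre-aligned so its midline sits on the image's central column, and the names `density_difference` and `asymmetry_features` together with the per-row summary are illustrative choices, not the authors' exact definitions.

```python
import numpy as np

def density_difference(face: np.ndarray) -> np.ndarray:
    """Image minus its mirror about the vertical midline.

    Assumes the face is already normalized so that the facial midline
    coincides with the central image column (alignment is assumed here;
    the paper works from normalized face images and videos).
    """
    f = face.astype(np.float64)
    return f - np.fliplr(f)

def asymmetry_features(face: np.ndarray) -> np.ndarray:
    """One scalar per image row: mean absolute density difference.

    A summary like this can be concatenated with EigenFace or
    FisherFace coefficients to form a combined feature vector.
    """
    return np.abs(density_difference(face)).mean(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    face = rng.random((128, 128))   # stand-in for a normalized face image
    feats = asymmetry_features(face)
    print(feats.shape)              # (128,): one asymmetry value per row
```

Combining such features with an appearance-based method can be as simple as appending them to the PCA (EigenFace) or LDA (FisherFace) coefficient vector before classification, which is one plausible reading of the synergy the abstract reports.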