Face recognition subject to variations in facial expression, illumination and pose using correlation filters

  • Authors:
  • Martin David Levine; Yingfeng Yu

  • Affiliations:
  • McGill University, Electrical and Computer Engineering, Center for Intelligent Machines, Montreal, Que., Canada (both authors)

  • Venue:
  • Computer Vision and Image Understanding
  • Year:
  • 2006


Abstract

In this paper, we have selected several recent advanced correlation filters: the minimum average correlation energy (MACE) filter, the unconstrained MACE (UMACE) filter, the phase-only unconstrained MACE (POUMACE) filter, the distance-classifier correlation filter (DCCF) [B.V.K. Vijaya Kumar, D. Casasent, A. Mahalanobis, Distance-classifier correlation filters for multiclass target recognition, Appl. Opt. 35 (1996) 3127-3133] and the minimax distance transform correlation filter (MDTC), and used them to test recognition performance in situations involving variations in facial expression, illumination conditions and head pose. The paper presents the first application of correlation filter classifiers to facial images subject to head pose variations. It also demonstrates that illumination invariance can be obtained without using any training images for this purpose. A comparison of MDTC with traditional discriminant learning methods (e.g., KPCA [B. Schölkopf, A. Smola, K.R. Müller, Nonlinear component analysis as a kernel eigenvalue problem, Neural Comput. 10 (1999) 1299-1319], IPCA [16], GDA [G. Baudat, F. Anouar, Generalized discriminant analysis using a kernel approach, Neural Comput. 12 (2000) 2385-2404], R-KDA [J. Lu, K. Plataniotis, A. Venetsanopoulos, Regularization studies of linear discriminant analysis in small sample size scenarios with application to face recognition, Pattern Recogn. Lett. 26 (2) (2005) 181-191]) is also presented. The paper shows that correlation filter classifiers, a relatively unheralded model-based approach, offer greater robustness and accuracy than traditional appearance-based methods (such as PCA). Overall, the POUMACE filter proved the best choice for facial matching. It achieved 100% accuracy on the publicly available CMU facial expression database and the Yale frontal face illumination database, and slightly less in the head pose experiments.
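To make the class of methods named in the abstract concrete, the following is a minimal illustrative sketch (not the authors' implementation) of one member of the MACE family, the UMACE filter, which in the frequency domain takes the closed form h = D^{-1} m, where m is the mean training spectrum and D the diagonal average power spectrum. Matching is scored with the peak-to-sidelobe ratio (PSR) of the correlation plane; the image size, regularization constant, and PSR window used here are assumptions for the example only.

```python
# Illustrative UMACE + PSR sketch; parameters and data are placeholders.
import numpy as np

def umace_filter(train_images, eps=1e-6):
    """UMACE filter in the frequency domain: h = m / D (element-wise),
    m = mean training spectrum, D = average power spectrum."""
    spectra = np.stack([np.fft.fft2(img.astype(float)) for img in train_images])
    m = spectra.mean(axis=0)                   # mean Fourier-domain face
    D = (np.abs(spectra) ** 2).mean(axis=0)    # average power spectrum (diagonal of D)
    return m / (D + eps)                       # eps regularizes near-zero frequencies

def psr(test_image, h, exclude=5, window=32):
    """Correlate a test face with the filter and return the peak-to-sidelobe ratio."""
    corr = np.fft.ifft2(np.fft.fft2(test_image.astype(float)) * np.conj(h))
    corr = np.abs(np.fft.fftshift(corr))
    r0, c0 = np.unravel_index(np.argmax(corr), corr.shape)
    patch = corr[max(r0 - window, 0):r0 + window, max(c0 - window, 0):c0 + window]
    # Mask out a small square around the peak; the rest is the "sidelobe" region.
    mask = np.ones_like(patch, dtype=bool)
    pr, pc = r0 - max(r0 - window, 0), c0 - max(c0 - window, 0)
    mask[max(pr - exclude, 0):pr + exclude, max(pc - exclude, 0):pc + exclude] = False
    sidelobe = patch[mask]
    return (corr[r0, c0] - sidelobe.mean()) / (sidelobe.std() + 1e-12)

# Usage with synthetic stand-ins for registered face images:
# a larger PSR indicates the test face more likely belongs to the filter's class.
rng = np.random.default_rng(0)
train = [rng.random((64, 64)) for _ in range(3)]
h = umace_filter(train)
print(psr(train[0], h))
```

The design choice this sketch highlights is the one the paper exploits: the filter is built entirely in the frequency domain from the training images' spectra, and recognition reduces to a sharp correlation peak, which is what gives the MACE family its shift tolerance and, in the phase-only variant, much of its illumination robustness.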