Identity Representability of Facial Expressions: An Evaluation Using Feature Pixel Distributions

  • Authors:
  • Qi Li; Chandra Kambhamettu

  • Affiliations:
  • Western Kentucky University, USA; University of Delaware, USA

  • Venue:
  • ICMLA '06 Proceedings of the 5th International Conference on Machine Learning and Applications
  • Year:
  • 2006


Abstract

Most previous work in face recognition has focused on how to represent appearance instances. Little attention, however, has been given to the problem of how to select "good" instances for a gallery, which may be called the facial identity representation problem. This paper evaluates the identity representability of facial expressions. The identity representability of an expression is measured by the recognition accuracy achieved when its samples are used as the gallery data. We use feature pixel distributions to represent appearance instances. A feature pixel distribution of an image is based on the number of occurrences of detected feature pixels (corners) in regular grids of the image plane. We propose imbalance-oriented redundancy reduction for feature pixel detection. Our experimental evaluation indicates that certain facial expressions, such as the neutral expression, have stronger identity representability than other expressions across various feature pixel distributions.
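The core representation described above, counting detected feature pixels in regular grid cells of the image plane, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes corner coordinates have already been detected by some feature pixel detector, and the grid size and normalization choices are hypothetical.

```python
import numpy as np

def feature_pixel_distribution(corners, image_shape, grid=(4, 4)):
    """Build a feature pixel distribution for one image: count detected
    feature pixels (corners) falling in each cell of a regular grid over
    the image plane, then normalize the counts into a distribution.

    corners: iterable of (row, col) coordinates of detected feature pixels
    image_shape: (height, width) of the image plane
    grid: number of grid cells along each axis (illustrative choice)
    """
    h, w = image_shape
    counts = np.zeros(grid, dtype=float)
    for r, c in corners:
        # Map the pixel coordinate to its grid cell; clamp to the last
        # cell so points on the far border are still counted.
        i = min(int(r * grid[0] / h), grid[0] - 1)
        j = min(int(c * grid[1] / w), grid[1] - 1)
        counts[i, j] += 1
    total = counts.sum()
    return counts / total if total > 0 else counts

# Toy example: four corners, one per quadrant of a 100x100 image,
# yield a uniform distribution over a 2x2 grid.
corners = [(10, 10), (10, 90), (90, 10), (90, 90)]
dist = feature_pixel_distribution(corners, (100, 100), grid=(2, 2))
```

Two such distributions could then be compared with any histogram distance to score gallery-versus-probe similarity; the paper's detector and matching details are not reproduced here.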