An evaluation of identity representability of facial expressions using feature distributions

  • Authors:
  • Qi Li; Chandra Kambhamettu; Jieping Ye

  • Affiliations:
  • Department of Computer Science, Western Kentucky University, Kentucky, USA; Department of Computer and Information Sciences, University of Delaware, USA; Department of Computer Science and Engineering, Arizona State University, USA

  • Venue:
  • Neurocomputing
  • Year:
  • 2008

Abstract

Most previous work in face recognition focused on how to represent appearance instances. Little attention, however, was given to the problem of how to select "good" instances for a gallery, which may be called the facial identity representation problem. This paper evaluates the identity representability of facial expressions. The identity representability of an expression is measured by the recognition accuracy achieved when its samples are used as the gallery data. We represent appearance instances by feature distributions: a feature distribution of an image is based on the number of occurrences of detected interest points within the regular grid cells of the image plane. We also present a new imbalance-oriented candidate selection algorithm for interest point detection. Our experimental evaluation indicates that certain facial expressions, such as the neutral expression, have stronger identity representability than other expressions across various feature distributions. Finally, an application of the evaluation results to improving linear discriminant analysis demonstrates the practical value of the evaluation.
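The feature distribution described above (counting detected interest points in the regular grid cells of the image plane) can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the grid size, the point format, and the normalization step are all assumptions.

```python
import numpy as np

def feature_distribution(points, image_shape, grid=(4, 4)):
    """Count interest points falling in each cell of a regular grid over
    the image plane, then normalize the counts into a distribution.
    Illustrative sketch only; grid size and normalization are assumed,
    not taken from the paper."""
    h, w = image_shape
    rows, cols = grid
    counts = np.zeros(grid, dtype=float)
    for x, y in points:
        # Map pixel coordinates to a grid cell, clamping boundary points.
        r = min(int(y * rows / h), rows - 1)
        c = min(int(x * cols / w), cols - 1)
        counts[r, c] += 1
    total = counts.sum()
    return counts / total if total > 0 else counts

# Example: three hypothetical interest points in a 100x100 image, 2x2 grid.
dist = feature_distribution([(10, 10), (90, 90), (80, 20)], (100, 100), grid=(2, 2))
```

Two images can then be compared through any distance between their distributions (e.g. histogram intersection or Euclidean distance on the flattened grids).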