A novel fusion-based method for expression-invariant gender classification

  • Authors:
  • Li Lu; Pengfei Shi

  • Affiliations:
  • Institute of Image Processing and Pattern Recognition, Shanghai Jiaotong University, China

  • Venue:
  • ICASSP '09 Proceedings of the 2009 IEEE International Conference on Acoustics, Speech and Signal Processing
  • Year:
  • 2009


Abstract

In this paper, we propose a novel fusion-based gender classification method that is able to compensate for facial expression even when the training samples contain only neutral expressions. We perform an experimental investigation to evaluate the significance of different facial regions for the task of gender classification. The three most significant regions are used in our fusion-based method. Classification is performed with support vector machines on features extracted using two-dimensional principal component analysis (2DPCA). Experiments show that our fusion-based method compensates for facial expressions and achieves a highest correct classification rate of 95.33%.
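The pipeline the abstract describes (per-region 2DPCA features, one SVM per facial region, fused decisions) can be sketched as follows. This is a minimal illustration, not the authors' code: the region data is synthetic, and the component count and fusion rule (summing SVM decision values over the three regions) are assumptions made for the example.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def two_d_pca(images, n_components):
    """Fit 2DPCA: return the mean image and the top eigenvectors
    of the image (column) covariance matrix."""
    mean = images.mean(axis=0)
    centered = images - mean                       # (n, h, w)
    # G = E[(A - mean)^T (A - mean)], shape (w, w)
    G = np.einsum('nij,nik->jk', centered, centered) / len(images)
    _, eigvecs = np.linalg.eigh(G)                 # eigenvalues ascending
    return mean, eigvecs[:, ::-1][:, :n_components]

def project(images, mean, V):
    """Project each image onto the 2DPCA axes, then flatten per sample."""
    return ((images - mean) @ V).reshape(len(images), -1)

# Hypothetical stand-ins for the three cropped facial regions; 40 samples
# per class of 16x16 patches whose means differ between the two classes.
def make_region(offset):
    a = rng.normal(0.0, 1.0, (40, 16, 16))
    b = rng.normal(offset, 1.0, (40, 16, 16))
    return np.concatenate([a, b]), np.array([0] * 40 + [1] * 40)

regions = [make_region(0.8) for _ in range(3)]
labels = regions[0][1]

# One 2DPCA + linear SVM per region; fuse by summing SVM decision values.
scores = np.zeros(len(labels))
for imgs, y in regions:
    mean, V = two_d_pca(imgs, n_components=4)
    feats = project(imgs, mean, V)
    clf = SVC(kernel='linear').fit(feats, y)
    scores += clf.decision_function(feats)

fused = (scores > 0).astype(int)                   # fused binary decision
accuracy = (fused == labels).mean()
print(f"fused training accuracy: {accuracy:.2f}")
```

Summing decision values is one simple fusion scheme; majority voting over the three per-region SVMs would be an equally plausible alternative.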