An improved NN training scheme using two-stage LDA features for face recognition

  • Authors:
  • Behzad Bozorgtabar; Roland Goecke

  • Affiliations:
  • Human-Centred Computing Lab, University of Canberra, Australia; Human-Centred Computing Lab, University of Canberra, Australia, and Research School of Computer Science, Australian National University, Canberra, Australia

  • Venue:
  • ICONIP'12: Proceedings of the 19th International Conference on Neural Information Processing - Volume Part V
  • Year:
  • 2012

Abstract

This paper presents a new approach to face recognition based on Two-Stage Linear Discriminant Analysis (Two-Stage LDA) and Conjugate Gradient Algorithms (CGAs). A Two-Stage LDA technique is proposed that utilises the null space of the sample covariance matrix as well as the range space of the between-class scatter matrix to extract discriminant information. Classic Back Propagation (BP) is a widely used Neural Network (NN) training algorithm in many detectors and classifiers. However, it is too slow for many practical problems, and its performance is unsatisfactory in many application areas, including face recognition. To overcome these problems, we investigate the utility of four CGAs (Fletcher-Reeves, Polak-Ribière, Powell-Beale, and scaled CG) in combination with Two-Stage LDA features. To further improve the accuracy, a modified AdaBoost.M1 approach is employed, which combines the results of several NN classifiers into a single strong classifier. Experiments are performed on the ORL, FERET and AR face databases. The results show that all of the proposed methods lead to higher recognition rates and shorter training times than classic BP.
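As a rough illustration of the conjugate gradient idea underlying these training algorithms, the sketch below (not from the paper) runs the classical linear CG method on a small quadratic problem. The coefficient beta is the Fletcher-Reeves ratio of successive squared gradient norms; in the nonlinear variant used for NN training, a line search replaces the closed-form step size shown here.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b (A symmetric positive definite) by conjugate gradients.

    Equivalent to minimising f(x) = 0.5 x^T A x - b^T x; the residual
    r = b - A x is the negative gradient of f. Illustrative sketch only.
    """
    n = len(b)
    x = np.zeros(n)
    if max_iter is None:
        max_iter = n            # exact convergence in at most n steps
    r = b - A @ x               # negative gradient at x
    d = r.copy()                # first direction: steepest descent
    rr = r @ r
    for _ in range(max_iter):
        if np.sqrt(rr) < tol:
            break
        Ad = A @ d
        alpha = rr / (d @ Ad)   # exact line search on the quadratic
        x = x + alpha * d
        r = r - alpha * Ad
        rr_new = r @ r
        beta = rr_new / rr      # Fletcher-Reeves coefficient
        d = r + beta * d        # new A-conjugate search direction
        rr = rr_new
    return x

# Small SPD system; CG converges in at most n = 2 iterations here
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = conjugate_gradient(A, b)
```

In NN training, the same direction-update rule is applied to the weight gradient, and variants such as Polak-Ribière or Powell-Beale differ mainly in how beta is computed and when the search direction is restarted to steepest descent.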