This paper presents a new approach to face recognition based on Two-Stage Linear Discriminant Analysis (Two-Stage LDA) and Conjugate Gradient Algorithms (CGAs). The proposed Two-Stage LDA technique extracts discriminant information from both the null space of the sample covariance matrix and the range space of the between-class scatter matrix. Classic Back-Propagation (BP) is a widely used Neural Network (NN) training algorithm in many detectors and classifiers; however, it is too slow for many practical problems, and its performance is unsatisfactory in many application areas, including face recognition. To overcome these problems, four CGAs (Fletcher-Reeves CGA, Polak-Ribière CGA, Powell-Beale CGA, and scaled CGA) are investigated here in combination with Two-Stage LDA features. To further improve accuracy, a modified AdaBoost.M1 approach is employed, which combines the outputs of several NN classifiers into a single strong classifier. Experiments are performed on the ORL, FERET and AR face databases. The results show that all of the proposed methods achieve higher recognition rates and shorter training times than classic BP.
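To illustrate the core idea of replacing plain gradient-descent back-propagation with conjugate gradient training, the following is a minimal sketch, not the paper's implementation: a one-hidden-layer network fitted with SciPy's Polak-Ribière conjugate gradient optimizer (`method='CG'`). The two-class 2-D toy data stands in for the Two-Stage LDA features the paper extracts from face images; all sizes and names here are illustrative assumptions.

```python
# Sketch: train a tiny one-hidden-layer network with a conjugate gradient
# optimizer (SciPy's Polak-Ribiere CG) rather than plain back-propagation
# with gradient descent. Toy data; stands in for Two-Stage LDA features.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Two well-separated classes of 2-D points (illustrative "face features").
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(1.0, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

n_in, n_hid = 2, 8  # input and hidden layer sizes (arbitrary choices)

def unpack(w):
    """Split the flat parameter vector into layer weights and biases."""
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    b1 = w[n_in * n_hid:n_in * n_hid + n_hid]
    W2 = w[n_in * n_hid + n_hid:-1]
    b2 = w[-1]
    return W1, b1, W2, b2

def loss(w):
    """Binary cross-entropy of the network on the training set."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                  # hidden layer activations
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output probability
    eps = 1e-9
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Conjugate gradient minimisation of the training loss.
w0 = rng.normal(0.0, 0.1, n_in * n_hid + n_hid + n_hid + 1)
res = minimize(loss, w0, method='CG')  # Polak-Ribiere conjugate gradient

W1, b1, W2, b2 = unpack(res.x)
h = np.tanh(X @ W1 + b1)
pred = (1.0 / (1.0 + np.exp(-(h @ W2 + b2))) > 0.5).astype(int)
print("training accuracy:", np.mean(pred == y))
```

The paper's other CG variants (Fletcher-Reeves, Powell-Beale, scaled CG) differ only in how the search direction is updated between line searches; the overall training loop has the same shape as above.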