LDA is widely used in the pattern recognition field. Unfortunately, LDA often confronts the small sample size (S3) problem, which causes the within-class scatter matrix to become singular. In practice, PCA is commonly applied for dimensionality reduction to overcome this problem. This paper shows that when the S3 problem occurs, the PCA step not only resolves the singularity but can also be used to induce a fast algorithm for optimizing the Fisher criterion. Specifically, computing the eigenvectors of the within-class scatter matrix after dimensionality reduction yields the optimal projection for the Fisher criterion.
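The PCA-then-LDA pipeline the abstract describes can be sketched as follows. This is a minimal illustration under common assumptions (keep at most n − c principal components so the within-class scatter matrix is nonsingular, then maximize the Fisher criterion in the reduced space); the function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def pca_lda(X, y, n_components):
    """Sketch of PCA followed by LDA (Fisherface-style pipeline).

    X: (n_samples, n_features) data matrix; y: integer class labels.
    Illustrative helper, not the paper's exact algorithm.
    """
    classes = np.unique(y)
    n, c = X.shape[0], len(classes)

    # --- PCA step: project onto at most n - c components so that the
    # within-class scatter matrix in the reduced space is nonsingular.
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # stable when n << d
    k = min(n - c, Vt.shape[0])
    W_pca = Vt[:k].T                      # (d, k) PCA projection
    Z = Xc @ W_pca                        # samples in the PCA subspace

    # --- LDA step: build scatter matrices and maximize the Fisher criterion.
    Sw = np.zeros((k, k))                 # within-class scatter
    Sb = np.zeros((k, k))                 # between-class scatter
    mu_all = Z.mean(axis=0)
    for cls in classes:
        Zc = Z[y == cls]
        mu = Zc.mean(axis=0)
        Sw += (Zc - mu).T @ (Zc - mu)
        diff = (mu - mu_all)[:, None]
        Sb += len(Zc) * (diff @ diff.T)

    # Generalized eigenproblem Sb w = lambda Sw w, solved via Sw^{-1} Sb.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(-evals.real)[:n_components]
    W_lda = evecs[:, order].real          # (k, n_components)

    return W_pca @ W_lda                  # overall projection (d, n_components)
```

Note that the LDA step here operates entirely in the k-dimensional PCA subspace, which is what makes the eigendecomposition cheap when the original feature dimension is large.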