Over the years, many Linear Discriminant Analysis (LDA) algorithms have been proposed for the study of high-dimensional data in a large variety of problems. An intrinsic limitation of classical LDA is the so-called small sample size (SSS) problem: it fails when the scatter matrices are singular. Many LDA extensions have been proposed to overcome the SSS problem; however, none of them solves it completely, in the sense of retaining all the discriminative features at a low computational cost. By applying LDA after whitening the data, we propose Whitened LDA (WLDA), which finds the most discriminant features without suffering from the SSS problem. WLDA solves only ordinary eigenvalue problems instead of generalized eigenvalue problems, which keeps its computational cost low. Experimental results are reported on the two popular Yale and ORL face databases, with comparisons against Linear Discriminant Analysis (LDA), Direct LDA (DLDA), Null space LDA (NLDA), and several recently developed matrix-based subspace analysis approaches. The results show that our method consistently performs best.
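The idea described in the abstract can be sketched as follows. This is an illustrative NumPy implementation, not the authors' code: the data are whitened with respect to the total scatter (restricted to its non-null directions, which is what avoids the SSS singularity), and the discriminant directions are then the leading eigenvectors of the between-class scatter in the whitened space — an ordinary, not generalized, eigenvalue problem. The function name and tolerance are assumptions for illustration.

```python
import numpy as np

def whitened_lda(X, y, n_components):
    """Sketch of whitening followed by LDA.
    X: (n_samples, n_features) data matrix; y: class labels."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    Xc = X - mean

    # Whiten with respect to the total scatter S_t = Xc^T Xc.
    # SVD Xc = U diag(s) V^T gives S_t = V diag(s^2) V^T, so
    # V diag(1/s) whitens, restricted to non-degenerate directions
    # (this restriction sidesteps the singularity behind the SSS problem).
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    keep = s > 1e-10 * s.max()                # assumed tolerance
    W_white = Vt[keep].T / s[keep]            # (n_features, r)

    # Between-class scatter in the whitened space; its leading
    # eigenvectors come from an ordinary eigenvalue problem.
    r = W_white.shape[1]
    Sb = np.zeros((r, r))
    for c in classes:
        mc = (X[y == c].mean(axis=0) - mean) @ W_white
        Sb += (y == c).sum() * np.outer(mc, mc)
    evals, evecs = np.linalg.eigh(Sb)
    order = np.argsort(evals)[::-1][:n_components]
    return W_white @ evecs[:, order]          # overall projection matrix
```

Because the whitening step equalizes the total scatter, the between-class eigenvectors can be ranked directly by their eigenvalues, with no matrix inversion of a possibly singular within-class scatter.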