Cancer classification by gradient LDA technique using microarray gene expression data
Data & Knowledge Engineering
Conventional linear discriminant analysis (LDA) seeks an orientation that projects high-dimensional feature vectors of different classes onto a more manageable low-dimensional space in the most discriminative way for classification. LDA computes this orientation by eigenvalue decomposition (EVD), a computation that is adversely affected by the small sample size problem. In this paper we present a new direct LDA method, called gradient LDA, that computes the orientation by gradient descent and is especially suited to the small sample size problem. The method also avoids discarding the null spaces of the within-class and between-class scatter matrices, which may contain discriminative information useful for classification.
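The idea of replacing the EVD step with gradient-based optimization can be sketched as follows: build the within-class and between-class scatter matrices, then maximize the Fisher criterion J(w) = (wᵀS_b w) / (wᵀS_w w) by gradient ascent on w instead of solving a generalized eigenvalue problem. This is a minimal illustrative sketch, not the authors' exact algorithm; the function name, learning rate, and one-dimensional projection are assumptions for illustration.

```python
import numpy as np

def gradient_lda(X, y, lr=0.01, n_iter=2000, seed=0):
    """Find a 1-D discriminant direction w by gradient ascent on the
    Fisher criterion J(w) = (w' Sb w) / (w' Sw w), instead of the
    eigenvalue decomposition used by conventional LDA (sketch)."""
    classes = np.unique(y)
    d = X.shape[1]
    overall_mean = X.mean(axis=0)
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - overall_mean).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        sww = w @ Sw @ w
        J = (w @ Sb @ w) / sww
        # Gradient of the Rayleigh-quotient-style criterion J(w)
        grad = 2.0 * (Sb @ w - J * (Sw @ w)) / sww
        w = w + lr * grad
        w /= np.linalg.norm(w)  # keep w on the unit sphere
    return w
```

Note that because only matrix-vector products with S_w are used, the method never inverts S_w, which is what makes an approach like this attractive when S_w is singular in the small sample size setting.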