Supervised Fisher linear discriminant analysis (LDA) is a classical dimensionality reduction approach. LDA assumes that each class follows a Gaussian density and may suffer from the singularity problem when handling high-dimensional data. In this work, we consider more general class densities and show, via a geometrical interpretation, that optimizing the LDA criterion does not always achieve maximum class discrimination. By defining new marginal inter-class and intra-class scatters, we develop a pairwise-criteria-based optimized LDA technique called robust linearly optimized discriminant analysis (LODA), together with a multimodal extension of LODA. Two effective solution schemes are proposed for extracting the informative features, and a kernelized extension of our methods is also detailed. Compared with LDA, LODA has four significant advantages. First, LODA requires no assumption on the intra-class distributions. Second, LODA characterizes inter-class separability with a marginal criterion. Third, LODA avoids the singularity problem and is robust to outliers. Fourth, the projection matrix delivered by LODA is orthogonal. These properties make LODA more general and more suitable for discriminant analysis than LDA. Results on the investigated cases demonstrate that our methods are highly competitive with, and even outperform, several widely used state-of-the-art techniques.
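The singularity problem mentioned above is easy to reproduce: in classical Fisher LDA, the within-class scatter matrix S_w has rank at most n - c (n samples, c classes), so it is singular whenever the feature dimension d exceeds that bound and S_w^{-1} S_b cannot be formed directly. The following is a minimal sketch of this effect using standard LDA scatter matrices on synthetic data; it illustrates the classical problem only and is not an implementation of the paper's LODA method.

```python
import numpy as np

# Sketch of the LDA singularity problem (not the paper's LODA method):
# with d = 50 features and only 20 samples, the within-class scatter
# matrix S_w is rank-deficient, so S_w cannot be inverted.
rng = np.random.default_rng(0)
d, n_per_class = 50, 10
X1 = rng.normal(0.0, 1.0, (n_per_class, d))   # class 1 samples
X2 = rng.normal(1.0, 1.0, (n_per_class, d))   # class 2 samples

def within_class_scatter(classes):
    """S_w = sum over classes of the centered outer-product matrices."""
    Sw = np.zeros((classes[0].shape[1], classes[0].shape[1]))
    for X in classes:
        Xc = X - X.mean(axis=0)               # center each class
        Sw += Xc.T @ Xc
    return Sw

Sw = within_class_scatter([X1, X2])
# Each class contributes rank at most n_per_class - 1 = 9,
# so rank(S_w) <= 18 < d = 50: S_w is singular.
print(Sw.shape, np.linalg.matrix_rank(Sw))    # (50, 50) 18
```

This is exactly the high-dimensional regime (e.g. face images) where direct LDA fails and where methods such as LODA, which avoid inverting S_w, remain applicable.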