Generalized discriminant analysis (GDA) extends classical linear discriminant analysis (LDA) from the linear domain to a nonlinear domain via the kernel trick. However, the solutions produced by the original GDA algorithm may suffer from the degenerate eigenvalue problem (several eigenvectors sharing the same eigenvalue), which makes them suboptimal in terms of discriminant ability. In this letter, we propose a modified algorithm for GDA (MGDA) to solve this problem. The MGDA method removes the degeneracy of GDA and finds the optimal discriminant solutions, which maximize the between-class scatter in the subspace spanned by the degenerate eigenvectors of GDA. Theoretical analysis and experimental results on the ORL face database show that the MGDA method achieves better performance than the GDA method.
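The refinement step described above, re-rotating a degenerate eigenspace so that its basis vectors are ordered by the between-class scatter they capture, can be sketched in numpy. This is an illustrative sketch only: the function name and the matrix shapes are our assumptions, not the paper's notation, and a full MGDA implementation would first run kernel GDA and group eigenvectors by (near-)equal eigenvalues.

```python
import numpy as np

def refine_degenerate_subspace(V, Sb):
    """Re-rotate a degenerate eigenspace to maximize between-class scatter.

    V  : (d, k) orthonormal basis of one degenerate eigenspace of GDA
         (columns share a single eigenvalue, so any rotation of them is
         also a valid GDA solution)
    Sb : (d, d) symmetric between-class scatter matrix

    Returns a (d, k) orthonormal basis whose columns are ordered by
    decreasing between-class scatter captured, as MGDA seeks.
    """
    # Restrict the between-class scatter to the degenerate subspace.
    Sb_proj = V.T @ Sb @ V                 # (k, k), symmetric
    # Its eigenvectors give the optimal rotation within the subspace.
    evals, R = np.linalg.eigh(Sb_proj)
    order = np.argsort(evals)[::-1]        # largest scatter first
    return V @ R[:, order]
```

Because the rotation `R` is orthogonal, the returned columns remain orthonormal and span the same subspace; only the choice of basis within the degenerate eigenspace changes, which is exactly the freedom MGDA exploits.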