Linear discriminant analysis (LDA) is a classical approach to dimensionality reduction. However, LDA requires one of the scatter matrices to be nonsingular, and it does not easily capture nonlinearly clustered structure. To overcome these problems, this paper presents several generalizations of kernel fuzzy discriminant analysis (KFDA): KFDA based on the generalized singular value decomposition (KFDA/GSVD), pseudo-inverse KFDA (PIKFDA), and range-space KFDA (RSKFDA). These KFDA-based algorithms adopt kernel methods to accommodate nonlinearly separable cases. Because KFDA-based algorithms do not account for the different contributions of each pair of classes to the discrimination, weighting schemes are incorporated into the KFDA extensions; the resulting methods are called weighted generalized KFDA algorithms. Experiments on three real-world data sets evaluate the effectiveness of the proposed algorithms and the effect of the weights on classification accuracy. The results show that the weighting schemes have a significant effect.
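To make the pseudo-inverse idea concrete, the following is a minimal, hypothetical sketch of pseudo-inverse kernel discriminant analysis in NumPy. It is not the paper's implementation: the function names, the RBF kernel choice, and the `gamma` parameter are illustrative assumptions. The key point it demonstrates is that replacing the inverse of the within-class kernel scatter matrix with the Moore-Penrose pseudo-inverse keeps the computation well defined even when that matrix is singular (the undersampled case), and the fuzzy-membership and weighting aspects of the paper's algorithms are omitted for brevity.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def pseudo_inverse_kda(X, y, gamma=1.0, n_components=1):
    """Illustrative pseudo-inverse kernel discriminant analysis (assumed sketch).

    Builds the between-class (M) and within-class (N) scatter matrices in the
    kernel-induced feature space, then solves the discriminant eigenproblem
    with pinv(N) @ M, so a singular N does not break the computation.
    Returns the expansion coefficients A and the training kernel matrix K.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    m_all = K.mean(axis=1)                      # overall kernel mean
    M = np.zeros((n, n))                        # between-class scatter
    N = np.zeros((n, n))                        # within-class scatter
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                          # columns for class c
        diff = (Kc.mean(axis=1) - m_all)[:, None]
        M += len(idx) * diff @ diff.T
        Hc = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ Hc @ Kc.T                     # centered within-class term
    # Pseudo-inverse in place of a plain inverse handles singular N
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(N) @ M)
    order = np.argsort(-eigvals.real)
    A = eigvecs[:, order[:n_components]].real
    return A, K

# Usage: project training data onto the discriminant direction(s) via Z = K @ A
```

A weighted variant in the spirit of the paper would scale each pairwise between-class term by a weight that depends on the distance between the two class means, rather than summing them uniformly as above.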