Many linear discriminant analysis (LDA) and kernel Fisher discriminant analysis (KFD) methods rest on the restrictive assumption that the data are homoscedastic, i.e., that all classes share the same covariance structure. In this paper, we propose a new KFD method, heteroscedastic kernel weighted discriminant analysis (HKWDA), which has several appealing characteristics. First, like all kernel methods, it handles nonlinearity in a principled and efficient manner. Second, by incorporating into the discriminant criterion a weighting function that captures heteroscedastic class distributions, it works under more realistic conditions and can further improve classification accuracy in many real-world applications. Moreover, it deals effectively with the small sample size problem. Face recognition experiments comparing HKWDA with several linear and nonlinear dimensionality reduction methods show that HKWDA consistently gives the best results.
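The abstract does not give the exact form of the HKWDA criterion, so the sketch below only illustrates the general recipe it describes: a kernel Fisher discriminant whose between-class scatter is a weighted sum over class pairs, with a ridge term on the within-class scatter to cope with the small sample size problem. The RBF kernel, the Loog-Duin-style pairwise weighting, and all parameter names (gamma, reg, n_components) are assumptions for illustration, not the authors' formulation.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.special import erf

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix between rows of X and rows of Y (assumed kernel choice).
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def weighted_kfd_fit(X, y, gamma=1.0, reg=1e-3, n_components=None):
    """Illustrative weighted kernel Fisher discriminant (not the published HKWDA).

    Between-class scatter in kernel coordinates is a weighted sum of pairwise
    class-mean differences; the weight down-weights already well-separated pairs.
    A ridge term on the within-class scatter handles the small sample size problem.
    """
    classes = np.unique(y)
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)

    # Kernel-space class means, each expressed as an n-vector of kernel coefficients.
    means = {c: K[:, y == c].mean(axis=1) for c in classes}
    priors = {c: np.mean(y == c) for c in classes}

    # Within-class scatter N = sum_c K_c (I - 1/n_c) K_c^T, regularized with a ridge term.
    N = reg * np.eye(n)
    for c in classes:
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T

    # Weighted between-class scatter over all class pairs
    # (Loog-Duin style weight based on a Mahalanobis-like pair distance; an assumption).
    B = np.zeros((n, n))
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            d = means[ci] - means[cj]
            dist = np.sqrt(d @ np.linalg.solve(N, d) + 1e-12)
            w = erf(dist / (2.0 * np.sqrt(2.0))) / (2.0 * dist**2)
            B += priors[ci] * priors[cj] * w * np.outer(d, d)

    # Generalized eigenproblem B a = lambda N a; keep the leading directions.
    evals, evecs = eigh(B, N)
    order = np.argsort(evals)[::-1]
    k = n_components or (len(classes) - 1)
    A = evecs[:, order[:k]]
    return (X, A, gamma)

def weighted_kfd_transform(model, Xnew):
    # Project new samples: kernel against training points, then apply the learned coefficients.
    Xtrain, A, gamma = model
    return rbf_kernel(Xnew, Xtrain, gamma) @ A
```

In use, one would fit the model on training faces, project both training and test images with weighted_kfd_transform, and classify in the reduced space (e.g., with a nearest-neighbor rule); the ridge parameter reg is what keeps the within-class scatter invertible when there are fewer samples than kernel dimensions.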