It is well known that linear discriminant analysis (LDA) suffers from two fundamental limitations. First, it cannot be applied when the within-class scatter matrix is singular, as occurs in the undersampled (small sample size) problem. Second, owing to its linear nature, it cannot capture nonlinearly clustered structure in the data. In this paper, a new kernel-based nonlinear discriminant analysis algorithm using the minimum squared error criterion (KDA-MSE) is proposed to address both problems. After mapping the original data into a higher-dimensional feature space via a kernel function, the MSE criterion is used as the discriminant rule and the corresponding dimension-reducing transformation is derived. Since the MSE solution does not require the scatter matrices to be nonsingular, the proposed KDA-MSE algorithm is applicable to the undersampled problem. Moreover, the new KDA-MSE algorithm handles multiclass problems, whereas existing MSE-based kernel discriminant methods are limited to two-class data. Extensive experiments on object recognition and face recognition across three benchmark databases demonstrate that our algorithm is competitive with other kernel-based discriminant techniques in terms of recognition accuracy.
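To illustrate the general idea, a common MSE-style kernel discriminant regresses regularized kernel-expansion coefficients onto class-indicator targets; the ridge term keeps the linear system solvable even when the data are undersampled, and the indicator encoding extends directly to multiclass data. The sketch below is an assumption-laden illustration of that family of methods, not the paper's exact KDA-MSE derivation: the RBF kernel, the function names, and the regularization parameter are all illustrative choices.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def kda_mse_fit(X, y, gamma=1.0, reg=1e-3):
    """Fit an MSE-style kernel discriminant (illustrative sketch):
    solve (K + reg*I) A = T, where T holds class-indicator targets.
    No scatter-matrix inversion is needed, so singular within-class
    scatter (the undersampled case) poses no difficulty."""
    classes = np.unique(y)
    n = X.shape[0]
    # One indicator column per class -> multiclass handled directly.
    T = (y[:, None] == classes[None, :]).astype(float)
    K = rbf_kernel(X, X, gamma)
    A = np.linalg.solve(K + reg * np.eye(n), T)
    return A, classes

def kda_mse_predict(X_train, A, classes, X_test, gamma=1.0):
    # Project test points through the kernel map and pick the
    # class whose indicator score is largest.
    scores = rbf_kernel(X_test, X_train, gamma) @ A
    return classes[np.argmax(scores, axis=1)]

# Usage on synthetic three-class data:
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 0.3, (20, 2)),
               rng.normal([3, 0], 0.3, (20, 2)),
               rng.normal([0, 3], 0.3, (20, 2))])
y = np.repeat([0, 1, 2], 20)
A, classes = kda_mse_fit(X, y)
pred = kda_mse_predict(X, A, classes, X)
```

On well-separated clusters like these, the training predictions match the labels almost perfectly, which mirrors the abstract's claim that an MSE criterion yields a usable multiclass discriminant without requiring nonsingular scatter matrices.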