Kernel Fisher discriminant analysis (KFDA) extracts a nonlinear feature from a sample by evaluating as many kernel functions as there are training samples, so its computational cost grows linearly with the size of the training set. In this paper we propose a more efficient approach to nonlinear feature extraction, fast KFDA (FKFDA). FKFDA consists of two parts. First, we select a subset of the training samples based on two criteria derived from approximating kernel principal component analysis (AKPCA) in the kernel feature space. Then, treating the selected training samples as nodes, we formulate FKFDA to improve the efficiency of nonlinear feature extraction. In FKFDA, the discriminant vectors are expressed as linear combinations of the nodes in the kernel feature space, so extracting a feature from a sample requires evaluating only as many kernel functions as there are nodes. The proposed FKFDA therefore offers a much faster feature extraction procedure than naive kernel-based methods. Experimental results on face recognition and benchmark classification datasets suggest that FKFDA produces well-separated, discriminative features.
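The cost argument above can be made concrete with a minimal sketch. The snippet below is illustrative only: it uses an RBF kernel and random coefficients in place of the actual KFDA/FKFDA discriminant vectors and the AKPCA node-selection step (here the node subset is just taken as the first rows of the training set). It shows the one point the abstract makes: projecting a new sample in the naive formulation costs one kernel evaluation per training sample, while the node-based formulation costs one per node.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Pairwise RBF kernel values between rows of X and rows of Y."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 5))   # full training set (1000 samples)
nodes = X_train[:50]                   # hypothetical node subset (in FKFDA, chosen via AKPCA)
x = rng.normal(size=(1, 5))            # a new sample to project

# Naive KFDA: the discriminant vector has a coefficient for EVERY training sample,
# so projecting x needs 1000 kernel evaluations.
alpha_full = rng.normal(size=(1000, 1))     # stand-in for learned coefficients
feat_naive = rbf_kernel(x, X_train) @ alpha_full

# FKFDA: the discriminant vector is a linear combination of the 50 nodes only,
# so projecting x needs just 50 kernel evaluations.
alpha_nodes = rng.normal(size=(50, 1))      # stand-in for learned node coefficients
feat_fast = rbf_kernel(x, nodes) @ alpha_nodes
```

In both cases the projection is a single kernel-vector/coefficient dot product; the speedup comes entirely from the number of kernel evaluations dropping from the training-set size to the (much smaller) node count.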