On nonlinear dimensionality reduction for face recognition
Image and Vision Computing
Principal component analysis (PCA) has long been a simple, efficient technique for dimensionality reduction. Recently, however, many nonlinear methods, such as locally linear embedding and curvilinear component analysis, have been proposed to handle increasingly complex nonlinear data. In this paper, we investigate and compare linear PCA with various nonlinear methods for face recognition. Experiments on real-world face databases show that linear and nonlinear methods yield similar performance, and the differences in classification rate are too small to conclude that either approach is consistently superior. We also derive a nonlinearity measure that quantifies the degree of nonlinearity of a data set in the reduced subspace; it can be used to indicate whether linear or nonlinear dimensionality reduction is likely to be effective.
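As a minimal sketch of the linear baseline discussed above, the following NumPy snippet projects data onto its top-k principal components via SVD of the centered data matrix. The `residual_fraction` helper is a hypothetical illustration only: it reports the fraction of variance a k-dimensional linear subspace fails to capture, which is one simple proxy for data nonlinearity, not the measure derived in the paper. The function names, toy data, and parameter choices are assumptions for illustration.

```python
import numpy as np

def pca_reduce(X, k):
    """Center X and project it onto its top-k principal components."""
    mean = X.mean(axis=0)
    Xc = X - mean
    # Rows of Vt are the principal directions, ordered by explained variance.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                  # (k, n_features)
    return Xc @ components.T, components, mean

def residual_fraction(X, k):
    """Fraction of total variance NOT captured by the top-k linear subspace.

    An illustrative proxy for how nonlinear the data are; this is NOT the
    nonlinearity measure derived in the paper.
    """
    Z, components, mean = pca_reduce(X, k)
    X_hat = Z @ components + mean        # best rank-k linear reconstruction
    err = np.sum((X - X_hat) ** 2)
    total = np.sum((X - X.mean(axis=0)) ** 2)
    return err / total

# Toy stand-in for vectorized face images: 200 samples, 64 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))
Z, components, mean = pca_reduce(X, k=10)
print(Z.shape)                           # (200, 10)
print(0.0 <= residual_fraction(X, k=10) <= 1.0)
```

On a genuinely linear (or near-linear) data set the residual fraction drops quickly as k grows, whereas data lying on a curved manifold retains a large residual, which is the intuition behind comparing linear PCA against nonlinear embeddings.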