We present a technique for low-dimensional representation of facial images that achieves graceful degradation of recognition performance. We have observed that if the data are well clustered into classes, features extracted by a topologically continuous transformation of the data are appropriate for recognition when low-dimensional features must be used. Based on this idea, our technique is composed of two consecutive transformations of the input data. The first transformation is concerned with the best separation of the input data into classes, and the second preserves the distance relationships between data points before and after the transformation as closely as possible. We employ FLD (Fisher's Linear Discriminant) for the first transformation and classical MDS (Multi-Dimensional Scaling) for the second. We also present a nonlinear extension of MDS via the 'kernel trick'. We have evaluated the recognition performance of our algorithms: FLD combined with MDS, and FLD combined with kernel MDS. Experimental results on the FERET facial image database show that recognition performance degrades gracefully when low-dimensional features are used.
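The two-stage pipeline described above (class separation first, then distance-preserving dimensionality reduction) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses scikit-learn's LDA in place of the paper's FLD on FERET images, random clustered vectors as stand-ins for facial feature vectors, and a textbook classical-MDS step (double-centering the squared-distance matrix and keeping the top eigenvectors).

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Toy stand-in for facial feature vectors: 5 classes, 20 samples each,
# 50-dimensional (the paper uses FERET images; these are synthetic).
rng = np.random.default_rng(0)
n_classes, per_class, dim = 5, 20, 50
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(per_class, dim))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), per_class)

# Stage 1: FLD/LDA separates the classes, projecting to at most
# (n_classes - 1) dimensions by maximizing between-class scatter
# relative to within-class scatter.
Z = LinearDiscriminantAnalysis(n_components=n_classes - 1).fit_transform(X, y)

# Stage 2: classical MDS on the FLD features. Double-center the squared
# pairwise-distance matrix and keep the top-k eigenvectors, so pairwise
# distances are preserved as closely as possible in k dimensions.
def classical_mds(Z, k):
    D2 = np.square(np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1))
    n = len(Z)
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ D2 @ J                     # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                  # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]             # take the k largest
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

low = classical_mds(Z, k=2)
print(low.shape)  # (100, 2)
```

The kernel-MDS variant mentioned in the abstract would replace the Euclidean distances in stage 2 with distances induced by a kernel function, which this sketch does not show.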