In this paper we introduce a new embedding technique that finds the linear projection best mapping labeled data samples into a space where the performance of a Nearest Neighbor classifier is maximized. We consider a large set of one-dimensional projections and combine them into a projection matrix, which is not restricted to be orthogonal. The embedding is posed as a classifier-selection task that uses the AdaBoost algorithm to find an optimal set of discriminant projections. The main advantage of the algorithm is that the final projection matrix makes no global assumptions about the data distribution and is built by minimizing the classification error on the training set. In addition, the resulting features can be ranked according to a set of coefficients computed during the algorithm. We test the performance of our embedding on two pattern recognition tasks: a gender recognition problem and the classification of handwritten digits.
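The idea of boosting over one-dimensional projections and stacking the selected directions into a (non-orthogonal) projection matrix can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact method: here each weak learner is a weighted mean-difference direction with a threshold stump (the paper instead evaluates projections through a Nearest Neighbor criterion), and all function names are hypothetical. The AdaBoost coefficients `alphas` play the role of the ranking coefficients mentioned in the abstract.

```python
import numpy as np

def weak_stump(X, y, w):
    """One weak learner: a 1-D projection (weighted class-mean difference)
    followed by a threshold decision. Labels y must be in {-1, +1}."""
    pos, neg = y == 1, y == -1
    mu_pos = np.average(X[pos], axis=0, weights=w[pos])
    mu_neg = np.average(X[neg], axis=0, weights=w[neg])
    v = mu_pos - mu_neg
    v /= np.linalg.norm(v) + 1e-12          # unit-norm 1-D projection
    z = X @ v
    # threshold midway between the weighted projected class means
    theta = 0.5 * (np.average(z[pos], weights=w[pos]) +
                   np.average(z[neg], weights=w[neg]))
    pred = np.where(z >= theta, 1, -1)
    err = w[pred != y].sum()
    s = 1
    if err > 0.5:                           # flip polarity if worse than chance
        s, err = -1, 1.0 - err
    return v, theta, s, err

def boosted_embedding(X, y, n_rounds=20):
    """AdaBoost loop: each round selects one discriminant direction;
    the rows of W form the final (non-orthogonal) projection matrix."""
    n = len(y)
    w = np.full(n, 1.0 / n)                 # uniform initial sample weights
    W, params, alphas = [], [], []
    for _ in range(n_rounds):
        v, theta, s, err = weak_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        h = s * np.where(X @ v >= theta, 1, -1)
        w *= np.exp(-alpha * y * h)         # re-weight misclassified samples
        w /= w.sum()
        W.append(v)
        params.append((theta, s))
        alphas.append(alpha)
    return np.array(W), params, np.array(alphas)

def predict(X, W, params, alphas):
    """Weighted vote of the selected 1-D projection classifiers."""
    score = np.zeros(len(X))
    for v, (theta, s), a in zip(W, params, alphas):
        score += a * s * np.where(X @ v >= theta, 1, -1)
    return np.where(score >= 0, 1, -1)
```

After training, `X @ W.T` gives the embedded samples, and sorting the features by `alphas` yields the ranking described above; a Nearest Neighbor classifier would then operate in that projected space.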