The small-sample problem is acute in many application domains, and it may be partially addressed by feature selection or dimensionality reduction. In the context of distance learning, we describe a method for feature selection that uses equivalence constraints between pairs of data points and is based on L1-regularized optimization. This feature selection step is then incorporated into an existing nonparametric distance-learning method based on boosting of constrained generative models. The resulting algorithm performs dynamic feature selection: in each boosting iteration, features are selected anew according to the weighted training data. We tested the algorithm on the classification of facial images, using two public-domain databases. In extensive experiments, our method substantially outperformed a number of competing methods, including the original boosting-based distance-learning method and two commonly used Mahalanobis metrics.
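To make the idea concrete, here is a minimal sketch of L1-based feature selection from equivalence constraints. It is an illustration of the general technique, not the paper's actual algorithm: each constrained pair is represented by its per-feature squared difference, labeled +1 ("same class") or -1 ("different class"), and an L1-regularized linear model fit by proximal gradient (ISTA) zeroes out uninformative features. The synthetic data, hyperparameters, and function names are all assumptions for illustration; in a DistBoost-style setting this selection would be re-run on the reweighted constraints in each boosting round.

```python
# Hedged sketch: feature selection from equivalence constraints via an
# L1-regularized linear model (ISTA with soft-thresholding).
# The dataset and all parameter values below are illustrative only.
import random

random.seed(0)

D = 5                         # number of features; only feature 0 is informative

def sample(cls):
    x = [random.gauss(0.0, 1.0) for _ in range(D)]
    x[0] += 4.0 * cls         # class separation lives in feature 0 only
    return x

# Equivalence constraints: pairs labeled "same class" (+1) or "different" (-1).
pairs = []
for _ in range(200):
    a, b = random.choice([0, 1]), random.choice([0, 1])
    pairs.append((sample(a), sample(b), 1.0 if a == b else -1.0))

# Represent each pair by its per-feature squared difference; an informative
# feature differs little on "same" pairs and a lot on "different" pairs.
X = [[(xa[j] - xb[j]) ** 2 for j in range(D)] for xa, xb, _ in pairs]
y = [lbl for _, _, lbl in pairs]

# Center each column so uninformative features carry no constant offset.
n = len(X)
means = [sum(row[j] for row in X) / n for j in range(D)]
X = [[row[j] - means[j] for j in range(D)] for row in X]

def soft_threshold(v, t):
    """Proximal operator of the L1 norm."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def lasso_ista(X, y, lam=1.0, lr=0.005, iters=1000):
    """L1-regularized least squares via proximal gradient descent (ISTA)."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(iters):
        grad = [0.0] * d      # gradient of 0.5 * mean squared error
        for xi, yi in zip(X, y):
            r = sum(wj * xj for wj, xj in zip(w, xi)) - yi
            for j in range(d):
                grad[j] += r * xi[j] / n
        w = [soft_threshold(w[j] - lr * grad[j], lr * lam) for j in range(d)]
    return w

w = lasso_ista(X, y)
selected = [j for j in range(D) if abs(w[j]) > 1e-8]
print("selected features:", selected)
```

The L1 penalty drives the weights of the noise features exactly to zero, so only the informative feature survives; its weight is negative because large squared differences predict the "different class" label.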