Metric learning is the task of learning from training data a distance metric that captures the important relationships among the samples. An appropriate distance metric is of considerable importance for building accurate classifiers. In this paper, we propose a novel supervised metric learning method, nearest hit-misses component analysis. In our method, the margin is first defined with respect to the nearest hits (nearest neighbors from the same class) and the nearest misses (nearest neighbors from a different class); the distance metric is then trained by maximizing this margin while minimizing the distance between each sample and its nearest hits. We further introduce a regularization term to alleviate overfitting. Moreover, the proposed method can perform metric learning and dimensionality reduction simultaneously. Comparative experiments with state-of-the-art metric learning methods on various real-world data sets demonstrate the effectiveness of the proposed method.
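To make the margin definition concrete, the following is a minimal sketch of computing, for each sample, its nearest hit and nearest miss under a linear metric M = LᵀL, and the resulting hit-miss margins. The function name and the choice of L are illustrative assumptions; the paper's actual objective, optimizer, and regularizer are not reproduced here.

```python
import numpy as np

def nearest_hit_miss_margins(X, y, L=None):
    """Return, for each sample, the margin d(nearest miss) - d(nearest hit)
    computed under the metric M = L^T L.  Sketch only: the paper's full
    objective also minimizes the distance to the nearest hits and adds a
    regularization term, which are not implemented here."""
    if L is None:
        L = np.eye(X.shape[1])       # identity metric (plain Euclidean)
    Z = X @ L.T                      # project samples into the learned space
    n = len(X)
    margins = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(Z - Z[i], axis=1)
        d[i] = np.inf                # exclude the sample itself
        same = (y == y[i])
        d_hit = d[same].min()        # nearest neighbor from the same class
        d_miss = d[~same].min()      # nearest neighbor from a different class
        margins[i] = d_miss - d_hit
    return margins
```

Choosing a rectangular L (fewer rows than input dimensions) projects the data into a lower-dimensional space, which is how a linear metric of this form can perform metric learning and dimensionality reduction simultaneously.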