Dimensionality reduction is a much-studied task in machine learning in which high-dimensional data is mapped, possibly via a non-linear transformation, onto a low-dimensional manifold. The resulting embeddings, however, may fail to capture features of interest. One solution is to learn a distance metric which prefers embeddings that capture the salient features. We propose a novel approach to learning a metric from side information to guide the embedding process. Our approach admits the use of two kinds of side information. The first kind is class-equivalence information, where some limited number of pairwise "same/different class" statements are known. The second form of side information is a limited set of distances between pairs of points in the target metric space. We demonstrate the effectiveness of the method by producing embeddings that capture features of interest.
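The class-equivalence form of side information can be illustrated with a minimal sketch. The snippet below is not the paper's method; it is a toy example, assuming a diagonal Mahalanobis metric learned from a handful of "same class" pairs by down-weighting directions that vary within a class. All data and variable names are hypothetical.

```python
import numpy as np

# Hypothetical toy data: only the first feature separates the two classes;
# the second is high-variance noise that dominates Euclidean distance.
rng = np.random.default_rng(0)
class_a = rng.normal(loc=[0.0, 0.0], scale=[0.1, 1.0], size=(20, 2))
class_b = rng.normal(loc=[1.0, 0.0], scale=[0.1, 1.0], size=(20, 2))

# Side information: difference vectors of known "same class" pairs.
same_diffs = np.vstack([
    class_a[:10] - class_a[10:],
    class_b[:10] - class_b[10:],
])

# Learn a diagonal metric: weight each feature inversely to its variance
# over same-class pairs, so within-class directions contribute little.
w = 1.0 / (same_diffs.var(axis=0) + 1e-8)

def learned_dist(x, y, w):
    """Distance under the learned diagonal Mahalanobis metric."""
    d = x - y
    return np.sqrt(np.sum(w * d * d))

# Average distances: same-class pairs vs. different-class pairs.
d_same = np.mean([learned_dist(class_a[i], class_a[i + 10], w)
                  for i in range(10)])
d_diff = np.mean([learned_dist(class_a[i], class_b[i], w)
                  for i in range(10)])
```

Under the learned metric, `d_same` comes out smaller than `d_diff`, because the metric suppresses the noisy second feature; an embedding guided by such a metric would therefore preserve the class structure rather than the noise.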