Nonlinear component analysis as a kernel eigenvalue problem
Neural Computation
Clustering with Instance-level Constraints
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Think globally, fit locally: unsupervised learning of low dimensional manifolds
The Journal of Machine Learning Research
Integrating constraints and metric learning in semi-supervised clustering
ICML '04 Proceedings of the Twenty-First International Conference on Machine Learning
Boosting margin based distance functions for clustering
ICML '04 Proceedings of the Twenty-First International Conference on Machine Learning
Locally linear metric adaptation for semi-supervised clustering
ICML '04 Proceedings of the Twenty-First International Conference on Machine Learning
Learning a Mahalanobis Metric from Equivalence Constraints
The Journal of Machine Learning Research
Semi-supervised metric learning using pairwise constraints
IJCAI'09 Proceedings of the 21st International Joint Conference on Artificial Intelligence
Robust distance metric learning with auxiliary knowledge
IJCAI'09 Proceedings of the 21st International Joint Conference on Artificial Intelligence
Joint learning of labels and distance metric
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics - Special issue on game theory
In recent years, metric learning in the semi-supervised setting has attracted considerable research interest. One family of semi-supervised metric learning methods exploits supervisory information in the form of pairwise similarity or dissimilarity constraints. However, most methods proposed so far are either limited to learning a linear metric or unable to scale well with the data set size. In this paper, we propose a nonlinear metric learning method based on the kernel approach. By applying a low-rank approximation to the kernel matrix, our method can handle significantly larger data sets. Moreover, the low-rank approximation scheme leads naturally to out-of-sample generalization. Experiments on both artificial and real-world data show very promising results.
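The abstract's key scalability idea is replacing the full n-by-n kernel matrix with a low-rank factorization, which also gives new points an explicit feature map (out-of-sample extension). The paper's exact scheme is not reproduced here; the following is a minimal sketch of one standard low-rank technique, the Nyström approximation with an RBF kernel, where the function names, the landmark-selection strategy, and all parameter values are illustrative assumptions, not the authors' method.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Gaussian (RBF) kernel between rows of X (n x d) and Y (m x d).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, landmarks, gamma=0.1):
    # Low-rank factor Z (n x m) with Z @ Z.T approximating the full
    # kernel matrix K, built from m landmark points (Nystrom method).
    C = rbf_kernel(X, landmarks, gamma)            # n x m cross-kernel
    W = rbf_kernel(landmarks, landmarks, gamma)    # m x m landmark kernel
    # Pseudo-inverse square root of W via its eigendecomposition.
    vals, vecs = np.linalg.eigh(W)
    vals = np.clip(vals, 1e-12, None)
    W_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return C @ W_inv_sqrt

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
# Illustrative landmark choice: a random subset of the training points.
landmarks = X[rng.choice(200, size=20, replace=False)]
Z = nystrom_features(X, landmarks)                 # 200 x 20 factor
K_approx = Z @ Z.T                                 # rank-20 surrogate for K

# Out-of-sample extension: unseen points map through the same landmarks,
# so any metric learned on Z applies to them directly.
X_new = rng.standard_normal((10, 5))
Z_new = nystrom_features(X_new, landmarks)         # 10 x 20 features
```

The practical point is that downstream metric learning can work with the m-dimensional rows of `Z` (here m = 20) instead of the full 200 x 200 kernel matrix, and `Z_new` extends the learned metric to points outside the training set without recomputing anything over all n samples.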