Locally linear metric adaptation for semi-supervised clustering
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Learning a Mahalanobis Metric from Equivalence Constraints
The Journal of Machine Learning Research
Learning a Mahalanobis distance metric for data clustering and classification
Pattern Recognition
Semi-supervised metric learning using pairwise constraints
IJCAI'09 Proceedings of the 21st international joint conference on Artificial intelligence
Semi-supervised clustering with metric learning: An adaptive kernel method
Pattern Recognition
Kernel-based metric learning for semi-supervised clustering
Neurocomputing
Distance metric learning guided adaptive subspace semi-supervised clustering
Frontiers of Computer Science in China
Guided Locally Linear Embedding
Pattern Recognition Letters
Learning low-rank kernel matrices for constrained clustering
Neurocomputing
Probabilistic non-linear distance metric learning for constrained clustering
Proceedings of the 4th MultiClust Workshop on Multiple Clusterings, Multi-view Data, and Multi-source Knowledge-driven Clustering
Relevant component analysis (RCA) is a recently proposed metric learning method for semi-supervised learning applications. It is simple and efficient, and has been applied successfully with impressive results. However, RCA can exploit supervisory information only in the form of positive equivalence constraints. In this paper, we propose an extension to RCA that incorporates both positive and negative equivalence constraints. Experimental results show that the extended RCA algorithm is effective.
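To make the abstract concrete, the standard RCA step it builds on learns a Mahalanobis matrix from positive equivalence constraints alone: points known to belong together are grouped into "chunklets", their pooled within-chunklet covariance is estimated, and its inverse serves as the metric. The sketch below (a minimal NumPy illustration, not the paper's extended algorithm; the function name `rca_metric` is our own) shows this baseline:

```python
import numpy as np

def rca_metric(X, chunklets):
    """Basic RCA: learn a Mahalanobis matrix from positive constraints.

    X         : (n, d) data matrix.
    chunklets : list of index lists; each chunklet contains points
                known (via positive equivalence constraints) to share a class.
    Returns the Mahalanobis matrix M = C^{-1}, where C is the pooled
    within-chunklet covariance.
    """
    diffs = []
    for idx in chunklets:
        pts = X[np.asarray(idx)]
        diffs.append(pts - pts.mean(axis=0))  # center each chunklet
    D = np.vstack(diffs)
    C = D.T @ D / len(D)                      # within-chunklet covariance
    return np.linalg.inv(C)                   # assumes C is nonsingular

# Two chunklets: {x0, x1} and {x2, x3} are each known to be same-class.
X = np.array([[0.0, 0.0], [0.1, 0.0],
              [5.0, 5.0], [5.0, 5.2]])
M = rca_metric(X, [[0, 1], [2, 3]])
```

Distances are then computed as (x - y)^T M (x - y), which shrinks directions of high within-chunklet variability. Note that negative constraints (pairs known to be in *different* classes) play no role here; accommodating them is precisely the extension the paper proposes.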