We address the problem of metric learning for multi-view data. Although many metric learning algorithms have been proposed, most focus on the single-view setting, and only a few handle multi-view data. In this paper, motivated by the co-training framework, we propose an algorithm-independent framework, named co-metric, for learning Mahalanobis metrics in multi-view settings. In its implementation, an off-the-shelf single-view metric learning algorithm is used to learn a metric in each view from a few labeled examples. The most confidently labeled examples chosen from the unlabeled set are then used to guide the metric learning in the next iteration. This procedure is repeated until a stopping criterion is met. The framework can accommodate most existing metric learning algorithms, whether they rely on side information or on example labels, and it naturally handles semi-supervised settings with more than two views. Our comparative experiments demonstrate its competitiveness and effectiveness.
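The loop sketched in the abstract can be illustrated in code. The sketch below is an assumption-laden reconstruction, not the authors' implementation: the functions `learn_metric` (a crude inverse within-class-scatter Mahalanobis learner, standing in for any off-the-shelf single-view metric learner) and `knn_confidence` (kNN vote agreement as the confidence measure) are hypothetical names chosen here for illustration. Each round, every view labels unlabeled examples under its current metric, and its most confident picks augment the labeled sets of the other views before all metrics are re-learned.

```python
import numpy as np

def learn_metric(X, y):
    # Stand-in single-view learner: Mahalanobis matrix as the inverse
    # within-class scatter (regularized). Any off-the-shelf metric
    # learning algorithm could be substituted here.
    d = X.shape[1]
    S = np.eye(d) * 1e-3
    for c in np.unique(y):
        Xc = X[y == c] - X[y == c].mean(axis=0)
        S += Xc.T @ Xc
    return np.linalg.inv(S)

def knn_confidence(M, X_lab, y_lab, X_unl, k=3):
    # Label each unlabeled point by kNN under metric M; confidence is
    # the fraction of the k neighbours agreeing with the majority vote.
    diff = X_unl[:, None, :] - X_lab[None, :, :]
    dists = np.einsum('ijd,de,ije->ij', diff, M, diff)
    idx = np.argsort(dists, axis=1)[:, :k]
    labels, confs = [], []
    for row in idx:
        vals, counts = np.unique(y_lab[row], return_counts=True)
        j = counts.argmax()
        labels.append(vals[j])
        confs.append(counts[j] / k)
    return np.array(labels), np.array(confs)

def co_metric(views_lab, y_lab, views_unl, rounds=5, per_round=2):
    # views_lab / views_unl: one feature array per view, rows aligned
    # across views (the same example appears in every view).
    lab = [V.copy() for V in views_lab]
    ys = [y_lab.copy() for _ in views_lab]
    unl = [V.copy() for V in views_unl]
    metrics = [learn_metric(X, y) for X, y in zip(lab, ys)]
    for _ in range(rounds):
        for i, M in enumerate(metrics):
            if unl[0].shape[0] == 0:
                break
            yhat, conf = knn_confidence(M, lab[i], ys[i], unl[i])
            pick = np.argsort(-conf)[:per_round]
            # Confident picks from view i teach the *other* views.
            for j in range(len(metrics)):
                if j != i:
                    lab[j] = np.vstack([lab[j], unl[j][pick]])
                    ys[j] = np.concatenate([ys[j], yhat[pick]])
            unl = [np.delete(U, pick, axis=0) for U in unl]
        metrics = [learn_metric(X, y) for X, y in zip(lab, ys)]
    return metrics
```

Because the loop is written over a list of views, the same code runs unchanged for more than two views, matching the framework's claim of handling multi-view semi-supervised settings.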