View construction for multi-view semi-supervised learning
ISNN'11 Proceedings of the 8th international conference on Advances in neural networks - Volume Part I
Some recent successful semi-supervised learning methods construct more than one learner from both labeled and unlabeled data for inductive learning. This paper proposes a novel multiple-view multiple-learner (MVML) framework for semi-supervised learning, which differs from previous methods in that it employs both multiple views and multiple learners. The method adopts a co-training-style learning paradigm to enlarge the labeled data set from a much larger pool of unlabeled data. To the best of our knowledge, this is the first attempt to combine the advantages of multiple-view learning and ensemble learning for semi-supervised learning. The use of multiple views promises better performance than single-view learning because the available information is exploited more effectively. At the same time, because an ensemble of classifiers is learned from each view, more accurate predictions can be obtained than by adopting a single classifier from the same view. Experiments on different applications involving both multiple-view and single-view data sets show encouraging results for the proposed MVML method.
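The abstract describes the MVML procedure only at a high level, so the following is a minimal sketch of one plausible reading of it, not the authors' implementation. It assumes scikit-learn base classifiers, a simple confidence-threshold rule for pseudo-labeling, and a co-training-style scheme in which each view's ensemble labels unlabeled examples for the other views; the names `mvml_co_train`, `conf_threshold`, and `n_rounds` are illustrative and do not come from the paper.

```python
import numpy as np
from sklearn.base import clone
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier


def mvml_co_train(views_labeled, y, views_unlabeled,
                  n_rounds=10, conf_threshold=0.9):
    """Co-training-style sketch of a multiple-view multiple-learner scheme.

    views_labeled / views_unlabeled: one feature matrix per view, with rows
    aligned across views. An ensemble is trained on each view; examples that
    one view's ensemble labels with high confidence are added, together with
    their pseudo-labels, to the labeled pools of the other views.
    """
    base_learners = [LogisticRegression(max_iter=1000),
                     DecisionTreeClassifier(max_depth=5),
                     GaussianNB()]
    n_views = len(views_labeled)
    L = [np.asarray(Xv, dtype=float).copy() for Xv in views_labeled]
    y_L = [np.asarray(y).copy() for _ in range(n_views)]
    U = [np.asarray(Xv, dtype=float).copy() for Xv in views_unlabeled]
    ensembles = [[clone(b) for b in base_learners] for _ in range(n_views)]

    for _ in range(n_rounds):
        # Train each view's ensemble on its current labeled pool.
        for v in range(n_views):
            for clf in ensembles[v]:
                clf.fit(L[v], y_L[v])
        if U[0].shape[0] == 0:
            break
        # Each view's ensemble pseudo-labels examples for the other views.
        for v in range(n_views):
            if U[v].shape[0] == 0:
                break
            probs = np.mean([clf.predict_proba(U[v]) for clf in ensembles[v]],
                            axis=0)
            conf = probs.max(axis=1)
            picked = np.where(conf >= conf_threshold)[0]
            if picked.size == 0:
                continue
            classes = ensembles[v][0].classes_
            pseudo = classes[probs[picked].argmax(axis=1)]
            for w in range(n_views):
                if w != v:
                    L[w] = np.vstack([L[w], U[w][picked]])
                    y_L[w] = np.concatenate([y_L[w], pseudo])
            # Drop the newly labeled examples from every view's unlabeled pool.
            keep = np.ones(U[v].shape[0], dtype=bool)
            keep[picked] = False
            U = [Uv[keep] for Uv in U]

    # Final fit on the enlarged labeled pools.
    for v in range(n_views):
        for clf in ensembles[v]:
            clf.fit(L[v], y_L[v])
    return ensembles


def mvml_predict(ensembles, views_test):
    """Predict by averaging class probabilities over all views and learners."""
    probs = np.mean([clf.predict_proba(Xv)
                     for ens, Xv in zip(ensembles, views_test)
                     for clf in ens], axis=0)
    return ensembles[0][0].classes_[probs.argmax(axis=1)]
```

In this sketch the confidence threshold, the number of rounds, and the choice of base learners are all free parameters, and the paper's view-construction step, by which a single-view data set is split into multiple views, is not shown.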