Co-training is a well-known semi-supervised learning technique in which two base learners are trained on the labeled data and the most confidently predicted unlabeled examples are used to augment the labeled set during learning. In this paper, we exploit the diversity of class probability estimates (DCPE) between the two learners and propose the DCPE co-training approach. The key idea is to use DCPE to select and label unlabeled data during the training process. Experimental studies on UCI datasets show that DCPE co-training is robust and efficient for classification, and comparative studies against supervised and other semi-supervised learning methods further demonstrate the effectiveness of the proposed approach.
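The abstract's procedure, iteratively moving unlabeled examples into the labeled set based on the two learners' probability estimates, can be sketched roughly as below. This is a minimal illustration, not the paper's exact algorithm: the function name `dcpe_cotrain`, the choice of GaussianNB and a decision tree as the two base learners, the L1 gap between probability vectors as the "diversity" score, and the agree-on-label filter are all assumptions made for the sketch.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

def dcpe_cotrain(X_lab, y_lab, X_unlab, rounds=5, per_round=10):
    """Hypothetical sketch of DCPE-style co-training.

    Two base learners are refit each round; unlabeled examples where their
    class probability estimates diverge most (while still agreeing on the
    predicted label) are pseudo-labeled and added to the labeled pool.
    Assumes class labels are 0..k-1 and all appear in y_lab.
    """
    h1 = GaussianNB()
    h2 = DecisionTreeClassifier(max_depth=3, random_state=0)
    X_l, y_l = X_lab.copy(), y_lab.copy()
    pool = X_unlab.copy()
    for _ in range(rounds):
        if len(pool) == 0:
            break
        h1.fit(X_l, y_l)
        h2.fit(X_l, y_l)
        p1 = h1.predict_proba(pool)
        p2 = h2.predict_proba(pool)
        # Diversity of class probability estimates: L1 gap between the
        # two learners' probability vectors (an assumed proxy for DCPE).
        div = np.abs(p1 - p2).sum(axis=1)
        agree = p1.argmax(axis=1) == p2.argmax(axis=1)
        # Most diverse examples on which both learners predict the same label.
        idx = [i for i in np.argsort(-div) if agree[i]][:per_round]
        if not idx:
            break
        pseudo = p1[idx].argmax(axis=1)
        X_l = np.vstack([X_l, pool[idx]])
        y_l = np.concatenate([y_l, pseudo])
        pool = np.delete(pool, idx, axis=0)
    h1.fit(X_l, y_l)  # final model trained on the augmented labeled set
    return h1
```

On a toy two-cluster problem with only ten labeled points, the augmented learner typically separates the classes well; the per-round budget and number of rounds are tuning knobs, not values from the paper.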