We present Inter-training, a new and more general co-training style framework for exploiting unlabeled data in multi-classifier systems, and develop two concrete algorithms that employ new strategies to iteratively retrain the base classifiers. A main obstacle to further improving co-training style algorithms is the decrease of diversity among the base classifiers over successive iterations. In this paper, we propose a method that recreates diversity among the base classifiers by manipulating the pseudo-labeled data used in co-training style algorithms. On the theoretical side, we define a hybrid classification and distribution (HCAD) noise model and provide a Probably Approximately Correct (PAC) analysis of co-training style algorithms in the presence of HCAD noise. Experimental results on six datasets show that our method performs well in practice, and its advantage is most pronounced on hard-to-classify datasets.
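To make the general idea concrete, the sketch below shows a generic co-training-style loop in which each base classifier is retrained on the labeled data plus an independent subsample of the pseudo-labeled pool, one simple way to slow the loss of diversity mentioned above. This is an illustrative assumption, not the paper's Inter-training algorithm or its two concrete variants; the confidence threshold, subsample rate, and choice of scikit-learn base learners are placeholders.

```python
# Hedged sketch of a co-training-style loop with per-classifier subsampling
# of pseudo-labeled data to help preserve diversity. Illustration only; it is
# NOT the Inter-training algorithm described in the abstract.
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier


def cotrain_style(X_l, y_l, X_u, base_learners, rounds=10,
                  conf_threshold=0.9, subsample=0.7, seed=None):
    """Iteratively retrain several base classifiers with pseudo-labels.

    Each round, the ensemble labels the unlabeled pool; examples whose
    averaged class probability exceeds `conf_threshold` are pseudo-labeled.
    Each classifier is then retrained on the labeled data plus its own random
    subsample of the pseudo-labeled set, so the classifiers do not all end up
    with identical training sets.
    """
    rng = np.random.default_rng(seed)
    models = [clone(m).fit(X_l, y_l) for m in base_learners]
    classes = models[0].classes_  # same ordering for all, since all fit on y_l
    for _ in range(rounds):
        # Average class probabilities over the current ensemble.
        proba = np.mean([m.predict_proba(X_u) for m in models], axis=0)
        conf = proba.max(axis=1)
        pseudo_y = classes[proba.argmax(axis=1)]
        keep = conf >= conf_threshold
        if not keep.any():
            break  # nothing confident enough to pseudo-label this round
        X_p, y_p = X_u[keep], pseudo_y[keep]
        for i, learner in enumerate(base_learners):
            # Independent subsample per classifier keeps training sets distinct.
            n_take = max(1, int(subsample * len(X_p)))
            idx = rng.choice(len(X_p), size=n_take, replace=False)
            X_aug = np.vstack([X_l, X_p[idx]])
            y_aug = np.concatenate([y_l, y_p[idx]])
            models[i] = clone(learner).fit(X_aug, y_aug)
    return models


# Example usage with hypothetical data arrays X_labeled, y_labeled, X_unlabeled:
# models = cotrain_style(X_labeled, y_labeled, X_unlabeled,
#                        [DecisionTreeClassifier(), GaussianNB(),
#                         KNeighborsClassifier()])
```

Heterogeneous base learners plus per-classifier subsampling are only one plausible way to keep the classifiers' views distinct; the paper's own strategies for manipulating pseudo-labeled data may differ.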