Supervised learning requires a large amount of labeled data, but the labeling process can be expensive and time-consuming because it requires the effort of human experts. Co-Training is a semi-supervised learning method that reduces the amount of labeled data required by exploiting the available unlabeled data to improve classification accuracy. It assumes that the patterns are represented by two or more redundantly sufficient feature sets (views) and that these views are conditionally independent given the class. At the same time, most real-world pattern recognition tasks involve a large number of categories, which makes them harder to learn. The tree-structured approach is an output space decomposition method in which a complex multi-class problem is decomposed into a set of binary sub-problems. In this paper, we propose two learning architectures that combine the merits of the tree-structured approach and Co-Training. We show that these architectures are especially useful for classification tasks that involve many classes and little labeled data. In this setting the single-view tree-structured approach does not perform well on its own, but when combined with Co-Training it can effectively exploit the independent views and the unlabeled data to improve recognition accuracy.
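The Co-Training idea described above can be sketched as a short loop: each view's classifier labels the unlabeled examples it is most confident about, and those pseudo-labeled examples augment the shared training set. This is a minimal illustration, not the paper's architecture: the helper names are hypothetical, and a simple nearest-centroid learner stands in for an arbitrary base classifier.

```python
# Hypothetical sketch of a Co-Training loop with two views.
# A nearest-centroid learner plays the role of the per-view classifier.
import math

def centroids(X, y):
    # Mean feature vector per class label.
    sums, counts = {}, {}
    for x, label in zip(X, y):
        s = sums.setdefault(label, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {c: [v / counts[c] for v in s] for c, s in sums.items()}

def predict(cents, x):
    # Return (label, confidence); confidence is the gap between the
    # two nearest class centroids (larger gap = more confident).
    dists = sorted((math.dist(x, c), lab) for lab, c in cents.items())
    conf = dists[1][0] - dists[0][0] if len(dists) > 1 else 1.0
    return dists[0][1], conf

def co_train(L1, L2, y, U1, U2, rounds=5, per_round=2):
    # L1/L2: labeled examples in view 1 / view 2 (same order as labels y).
    # U1/U2: the two views of the unlabeled pool.
    L1, L2, y = list(L1), list(L2), list(y)
    U = list(range(len(U1)))
    for _ in range(rounds):
        if not U:
            break
        c1, c2 = centroids(L1, y), centroids(L2, y)
        # Score each unlabeled point by the more confident view's prediction.
        scored = []
        for i in U:
            lab1, conf1 = predict(c1, U1[i])
            lab2, conf2 = predict(c2, U2[i])
            scored.append((max(conf1, conf2), i,
                           lab1 if conf1 >= conf2 else lab2))
        scored.sort(reverse=True)
        # Move the most confidently pseudo-labeled points into the labeled set.
        for _, i, lab in scored[:per_round]:
            L1.append(U1[i]); L2.append(U2[i]); y.append(lab)
            U.remove(i)
    return centroids(L1, y), centroids(L2, y)
```

A new pattern can then be classified by applying `predict` with each view's centroids and, for example, keeping the more confident of the two answers, mirroring how the two views cooperate during training.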