Active learning reduces labeling cost by selecting only the most informative instances for labeling, which is valuable when labeled instances are hard to obtain. Facing the same scarcity of labels, semi-supervised learning exploits unlabeled instances, under suitable assumptions, to strengthen classifiers trained on the labeled ones. Current active learning methods, however, often ignore this effect. We therefore propose a graph-based active learning method that combines the two paradigms within the entropy reduction framework and also handles multi-class problems. The proposed method trains its base classifier with a popular graph-based semi-supervised label propagation method and, at each round, queries the label of the instance with the largest expected entropy reduction. Experiments show that the proposed method outperforms traditional sampling methods on the selected datasets.
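The sampling criterion described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses scikit-learn's `LabelPropagation` as the graph-based semi-supervised base classifier, and all function names (`total_entropy`, `select_query`) and parameter choices (RBF kernel, `gamma`) are our own assumptions. For each unlabeled candidate and each possible class, the candidate is temporarily labeled, the model is retrained, and the remaining unlabeled entropy is weighted by the current predicted class probability; the candidate with the largest expected entropy reduction is queried.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.semi_supervised import LabelPropagation

def total_entropy(probs):
    # Sum of Shannon entropies of the predicted label distributions.
    p = np.clip(probs, 1e-12, 1.0)
    return float(-(p * np.log(p)).sum())

def select_query(X, y_partial, n_classes):
    # y_partial uses -1 for unlabeled instances (scikit-learn convention).
    model = LabelPropagation(kernel="rbf", gamma=5.0).fit(X, y_partial)
    unlabeled = np.where(y_partial == -1)[0]
    probs = model.predict_proba(X[unlabeled])
    base = total_entropy(probs)

    best_idx, best_reduction = None, -np.inf
    for i, u in enumerate(unlabeled):
        expected = 0.0
        for c in range(n_classes):
            # Temporarily commit candidate u to class c and retrain.
            y_try = y_partial.copy()
            y_try[u] = c
            m = LabelPropagation(kernel="rbf", gamma=5.0).fit(X, y_try)
            rest = np.setdiff1d(unlabeled, [u])
            expected += probs[i, c] * total_entropy(m.predict_proba(X[rest]))
        reduction = base - expected  # expected entropy reduction for u
        if reduction > best_reduction:
            best_idx, best_reduction = u, reduction
    return best_idx

# Toy multi-class data with one seed label per class; the rest unlabeled.
X, y = make_blobs(n_samples=30, centers=3, random_state=0)
y_partial = np.full_like(y, -1)
for c in range(3):
    seed = np.where(y == c)[0][0]
    y_partial[seed] = c

query = select_query(X, y_partial, n_classes=3)
print("query index:", query)
```

The retraining loop makes this O(|U| * K) model fits per query, which is why practical versions of such expected-error/entropy estimation typically restrict the candidate pool or approximate the retraining step.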