We discover a strong relation between two known learning models: stream-based active learning and perfect selective classification (an extreme case of 'classification with a reject option'). For these models, restricted to the realizable case, we show a reduction of active learning to selective classification that preserves fast rates. Applying this reduction to recent results for selective classification, we derive an exponential, target-independent label complexity speedup for actively learning general (non-homogeneous) linear classifiers when the data distribution is an arbitrary high-dimensional mixture of Gaussians. Finally, we study the relation between the proposed technique and existing label complexity measures, including the teaching dimension and the disagreement coefficient.
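To make the reduction concrete, here is a minimal toy sketch (not the paper's construction) of the idea in a one-dimensional realizable setting: a perfect selective classifier predicts only when every hypothesis consistent with the labels seen so far agrees, and the stream-based active learner turns each abstention into a label query. The threshold class, the target value `0.37`, and all function names are hypothetical illustration choices.

```python
import random

def target(x, theta=0.37):
    """Hypothetical realizable target: a 1-D threshold classifier on [0, 1]."""
    return 1 if x >= theta else 0

def run_stream(n_points=2000, seed=0):
    """Stream-based active learning via selective-classification abstention.

    The version space of consistent thresholds is the interval (lo, hi].
    On each stream point, the selective classifier abstains exactly on the
    disagreement region lo < x < hi; each abstention becomes a label query.
    Returns the total number of queries issued.
    """
    rng = random.Random(seed)
    lo, hi = 0.0, 1.0          # all thresholds in (lo, hi] are still consistent
    queries = 0
    for _ in range(n_points):
        x = rng.random()
        if lo < x < hi:        # consistent hypotheses disagree -> abstain
            y = target(x)      # the active learner queries the label
            queries += 1
            if y == 1:
                hi = min(hi, x)   # positive label: threshold is at most x
            else:
                lo = max(lo, x)   # negative label: threshold exceeds x
        # otherwise all consistent hypotheses agree; predict without querying
    return queries

print(run_stream())
```

Because each query shrinks the version-space interval, the number of queries grows only logarithmically with the stream length, a toy analogue of the exponential label complexity speedup the abstract describes.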