COLT '92 Proceedings of the Fifth Annual Workshop on Computational Learning Theory
Active Learning Using Arbitrary Binary Valued Queries
Machine Learning
Bounds on the Sample Complexity of Bayesian Learning Using Information Theory and the VC Dimension
Machine Learning - Special issue on computational learning theory
Predicting {0, 1}-functions on randomly drawn points
Information and Computation
Selective Sampling Using the Query by Committee Algorithm
Machine Learning
Bounded Geometries, Fractals, and Low-Distortion Embeddings
FOCS '03 Proceedings of the 44th Annual IEEE Symposium on Foundations of Computer Science
Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing)
A bound on the label complexity of agnostic active learning
Proceedings of the 24th International Conference on Machine Learning
Using the doubling dimension to analyze the generalization of learning algorithms
Journal of Computer and System Sciences
Teaching dimension and the complexity of active learning
COLT '07 Proceedings of the 20th Annual Conference on Learning Theory
We explore a general Bayesian active learning setting in which the learner can ask arbitrary binary-valued (yes/no) questions. We derive upper and lower bounds on the expected number of queries required to achieve a specified expected risk.
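The flavor of such query bounds can be seen in the simplest case: locating an unknown threshold concept on [0, 1] with arbitrary yes/no queries via bisection, which needs about log2(1/eps) queries to reach precision eps. The sketch below is purely illustrative and not the paper's algorithm; the function name and setup are our own.

```python
# Illustrative sketch (not the paper's method): with arbitrary yes/no
# queries, a learner can bisect the hypothesis space. For threshold
# functions on [0, 1], ceil(log2(1/eps)) queries suffice to locate the
# threshold within eps, matching log(1/eps)-style query bounds.
import math

def bisect_threshold(true_theta, eps):
    """Locate an unknown threshold in [0, 1] to precision eps using
    yes/no queries of the form 'is theta <= m?'."""
    lo, hi = 0.0, 1.0
    queries = 0
    while hi - lo > eps:
        mid = (lo + hi) / 2.0
        queries += 1            # one arbitrary binary-valued query
        if true_theta <= mid:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0, queries

estimate, n = bisect_threshold(0.3127, 1e-3)
assert abs(estimate - 0.3127) <= 1e-3
assert n == math.ceil(math.log2(1 / 1e-3))  # 10 queries
```

Each query halves the interval containing the threshold, so the query count is logarithmic in 1/eps; the paper's bounds generalize this counting argument to a Bayesian setting over richer concept classes.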