We study pool-based active learning of half-spaces. We revisit the aggressive approach for active learning in the realizable case, and show that it can be made efficient and practical, while also having theoretical guarantees under reasonable assumptions. We further show, both theoretically and experimentally, that it can be preferable to mellow approaches. Our efficient aggressive active learner of half-spaces has formal approximation guarantees that hold when the pool is separable with a margin. While our analysis is focused on the realizable setting, we show that a simple heuristic allows using the same algorithm successfully for pools with low error as well. We further compare the aggressive approach to the mellow approach, and prove that there are cases in which the aggressive approach results in significantly better label complexity compared to the mellow approach. We demonstrate experimentally that substantial improvements in label complexity can be achieved using the aggressive approach, for both realizable and low-error settings.
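To make the pool-based setting concrete, here is a minimal illustrative sketch (not the paper's actual algorithm) of an aggressive active learner for homogeneous half-spaces: at each round it queries the label of the unlabeled pool point closest to the current decision boundary, then retrains on all labels obtained so far. The function names and the perceptron subroutine are assumptions made for this toy example.

```python
# Illustrative sketch of aggressive pool-based active learning of a
# homogeneous half-space. Assumptions: labels are in {-1, +1}, the pool
# is separable, and a simple perceptron serves as the base learner.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def perceptron(examples, epochs=100):
    """Train a homogeneous linear separator on (x, y) pairs."""
    w = [0.0] * len(examples[0][0])
    for _ in range(epochs):
        clean = True
        for x, y in examples:
            if y * dot(w, x) <= 0:  # mistake-driven update
                w = [wk + y * xk for wk, xk in zip(w, x)]
                clean = False
        if clean:
            break
    return w

def aggressive_active_learn(pool, oracle, budget):
    """Query `budget` labels, always picking the minimum-margin point."""
    w = [0.0] * len(pool[0])
    labeled, queried = [], set()
    for _ in range(budget):
        rest = [i for i in range(len(pool)) if i not in queried]
        # Aggressive choice: the point the current hypothesis is least
        # certain about, i.e. with the smallest margin |<w, x>|.
        i = min(rest, key=lambda j: abs(dot(w, pool[j])))
        queried.add(i)
        labeled.append((pool[i], oracle(pool[i])))
        w = perceptron(labeled)
    return w
```

On a small separable pool labeled by the sign of the first coordinate, this learner recovers a consistent half-space after querying only a subset of the pool, which is the kind of label-complexity saving the abstract refers to.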