We state and analyze the first active learning algorithm that finds an ε-optimal hypothesis in any hypothesis class, when the underlying distribution has arbitrary forms of noise. The algorithm, A^2 (for Agnostic Active), relies only upon the assumption that it has access to a stream of unlabeled examples drawn i.i.d. from a fixed distribution. We show that A^2 achieves an exponential improvement (i.e., requires only O(ln(1/ε)) samples to find an ε-optimal classifier) over the usual sample complexity of supervised learning, for several settings considered before in the realizable case. These include learning threshold classifiers and learning homogeneous linear separators with respect to an input distribution which is uniform over the unit sphere.
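To illustrate the kind of exponential improvement the abstract refers to (though this sketch is not the A^2 algorithm itself, and covers only the noise-free, realizable threshold case): a threshold classifier on the line can be learned by binary-searching the unlabeled pool, querying only O(log n) labels instead of labeling every point. The function and oracle names below are illustrative, not from the paper.

```python
def active_learn_threshold(pool, oracle):
    """Learn a threshold classifier (label(x) = 1 iff x >= t) over a
    pool of unlabeled real-valued points, in the realizable case.

    Binary search over the sorted pool locates the decision boundary
    with O(log n) label queries, versus the O(n) labels a passive
    learner would need to label the whole pool.
    """
    pts = sorted(pool)
    # Handle the degenerate cases where one label covers the whole pool.
    if oracle(pts[0]) == 1:
        return pts[0]          # every pool point is labeled 1
    if oracle(pts[-1]) == 0:
        return float('inf')    # every pool point is labeled 0
    # Invariant: oracle(pts[lo]) == 0 and oracle(pts[hi]) == 1.
    lo, hi = 0, len(pts) - 1
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if oracle(pts[mid]) == 1:
            hi = mid
        else:
            lo = mid
    return pts[hi]             # smallest pool point labeled 1
```

With a pool of n points this issues at most about 2 + log2(n) queries, matching the logarithmic label complexity that active learning achieves for thresholds; the paper's contribution is extending such guarantees to the agnostic (arbitrary-noise) setting.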