How to construct random functions
Journal of the ACM (JACM)
The Strength of Weak Learnability
Machine Learning
Cryptographic limitations on learning Boolean formulae and finite automata
Journal of the ACM (JACM)
An introduction to computational learning theory
Boosting a weak learning algorithm by majority
Information and Computation
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the theory of computing & STOC'94, May 23–25, 1994, and second annual European conference on computational learning theory (EuroCOLT'95), March 13–15, 1995
Specification and simulation of statistical query algorithms for efficiency and noise tolerance
Journal of Computer and System Sciences - Special issue on the eighth annual workshop on computational learning theory, July 5–8, 1995
Efficient noise-tolerant learning from statistical queries
Journal of the ACM (JACM)
On the boosting ability of top-down decision tree learning algorithms
Journal of Computer and System Sciences
A Pseudorandom Generator from any One-way Function
SIAM Journal on Computing
Rounds vs queries trade-off in noisy computation
SODA '05 Proceedings of the sixteenth annual ACM-SIAM symposium on Discrete algorithms
Lower Bounds for the Noisy Broadcast Problem
FOCS '05 Proceedings of the 46th Annual IEEE Symposium on Foundations of Computer Science
Error limiting reductions between classification tasks
ICML '05 Proceedings of the 22nd international conference on Machine learning
Robust Loss Functions for Boosting
Neural Computation
AAAI'05 Proceedings of the 20th national conference on Artificial intelligence - Volume 2
COLT'05 Proceedings of the 18th annual conference on Learning Theory
Sensitive error correcting output codes
COLT'05 Proceedings of the 18th annual conference on Learning Theory
Boosting algorithms are procedures that "boost" low-accuracy weak learning algorithms to achieve arbitrarily high accuracy. Over the past decade, boosting has been widely used in practice and has become a major research topic in computational learning theory. In this paper we study boosting in the presence of random classification noise, giving both positive and negative results. We show that a modified version of a boosting algorithm due to Mansour and McAllester [14] can achieve accuracy arbitrarily close to the noise rate. We also give a matching lower bound by showing that no efficient black-box boosting algorithm can boost accuracy beyond the noise rate (assuming that one-way functions exist). Finally, we consider a variant of the standard boosting scenario in which the "weak learner" satisfies a slightly stronger condition than the usual weak learning guarantee. We give an efficient algorithm in this framework that can boost to arbitrarily high accuracy in the presence of classification noise.
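To make the setting concrete, the sketch below (Python with NumPy; all names, the toy target, and the parameter choices are our own illustrative assumptions, not from the paper) draws examples from a one-dimensional threshold target, corrupts the training labels with independent random classification noise at rate eta, and runs a plain AdaBoost loop with decision-stump weak learners on the noisy sample. It illustrates only the noise model and a generic booster; it is not the modified Mansour-McAllester-style booster analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
eta = 0.2            # random classification noise rate: each training label flipped w.p. eta
n_train, n_test = 2000, 2000
T = 50               # number of boosting rounds

# Toy target concept: a threshold function on uniform inputs in [0, 1).
def target(x):
    return np.where(x > 0.5, 1, -1)

x_train = rng.random(n_train)
x_test = rng.random(n_test)
y_clean = target(x_train)
y_test = target(x_test)

# Random classification noise on the training labels only.
flips = rng.random(n_train) < eta
y_noisy = np.where(flips, -y_clean, y_clean)

def stump_predict(x, thresh, sign):
    return sign * np.where(x > thresh, 1, -1)

def best_stump(x, y, w):
    """Weak learner: exhaustively pick the decision stump with lowest weighted error."""
    best = (0.5, 1, np.inf)          # (threshold, sign, weighted error)
    for thresh in np.linspace(0.0, 1.0, 101):
        for sign in (1, -1):
            err = np.sum(w * (stump_predict(x, thresh, sign) != y))
            if err < best[2]:
                best = (thresh, sign, err)
    return best

# Plain AdaBoost run on the noisy sample (a stand-in booster for illustration only).
w = np.full(n_train, 1.0 / n_train)
hypotheses = []
for _ in range(T):
    thresh, sign, err = best_stump(x_train, y_noisy, w)
    err = np.clip(err, 1e-12, 1 - 1e-12)
    alpha = 0.5 * np.log((1 - err) / err)
    hypotheses.append((alpha, thresh, sign))
    w *= np.exp(-alpha * y_noisy * stump_predict(x_train, thresh, sign))
    w /= w.sum()

def boosted_predict(x):
    score = sum(a * stump_predict(x, t, s) for a, t, s in hypotheses)
    return np.where(score >= 0, 1, -1)

acc = np.mean(boosted_predict(x_test) == y_test)
print(f"noise rate eta = {eta:.2f}, test accuracy vs. clean labels = {acc:.3f}")
```

The printed accuracy is measured against the clean (noise-free) test labels, so varying eta gives an empirical feel for the accuracy-versus-noise-rate trade-off discussed above; how close a heuristic booster gets to accuracy 1 - eta depends heavily on the weak learner and the data, and nothing in this toy run should be read as reproducing the paper's bounds.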