Boosting algorithms are procedures that "boost" low-accuracy weak learning algorithms to achieve arbitrarily high accuracy. Over the past decade, boosting has been widely used in practice and has become a major research topic in computational learning theory. In this paper we study boosting in the presence of random classification noise, giving both positive and negative results. We show that a modified version of a boosting algorithm due to Mansour and McAllester (J. Comput. System Sci. 64(1) (2002) 103) can achieve error arbitrarily close to the noise rate. We also give a matching lower bound by showing that no efficient black-box boosting algorithm can achieve error below the noise rate (assuming that one-way functions exist). Finally, we consider a variant of the standard boosting scenario in which the "weak learner" satisfies a slightly stronger condition than the usual weak learning guarantee. We give an efficient algorithm in this framework that can boost to arbitrarily high accuracy in the presence of classification noise.
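The paper's positive result rests on a modified branching-program booster of Mansour and McAllester, which is too involved to sketch here. The following is instead a minimal, self-contained illustration of the setting itself: labels are corrupted by random classification noise with rate eta (each label flipped independently with probability eta), and a standard boosting loop (AdaBoost with decision stumps) is run on the noisy sample. All data, names, and parameters below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the random-classification-noise setting (NOT the paper's
# algorithm): AdaBoost with decision stumps on synthetic noisy-label data.
# All names and parameters here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, eta):
    """Points in [0,1]^2; true label = sign(x0 + x1 - 1); each label is
    independently flipped with probability eta (random classification noise)."""
    X = rng.random((n, 2))
    y_clean = np.where(X[:, 0] + X[:, 1] > 1.0, 1, -1)
    flips = rng.random(n) < eta
    y_noisy = np.where(flips, -y_clean, y_clean)
    return X, y_clean, y_noisy

def best_stump(X, y, w):
    """Axis-aligned threshold classifier minimizing weighted error."""
    best = (1.0, 0, 0.0, 1)  # (error, feature, threshold, sign)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] > t, s, -s)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, t, s)
    return best

def adaboost(X, y, rounds):
    """Standard AdaBoost loop: reweight examples, collect weighted stumps."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    stumps = []
    for _ in range(rounds):
        err, j, t, s = best_stump(X, y, w)
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(X[:, j] > t, s, -s)
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append((alpha, j, t, s))
    return stumps

def predict(stumps, X):
    """Weighted-majority vote of the collected stumps."""
    score = np.zeros(len(X))
    for alpha, j, t, s in stumps:
        score += alpha * np.where(X[:, j] > t, s, -s)
    return np.where(score >= 0, 1, -1)

eta = 0.2
X, y_clean, y_noisy = make_data(600, eta)
stumps = adaboost(X, y_noisy, rounds=30)
err = np.mean(predict(stumps, X) != y_clean)
print(f"noise rate eta = {eta}, error vs clean labels = {err:.3f}")
```

AdaBoost is used here only because it is the most familiar booster; it is known to be brittle under random classification noise (indeed, such noise defeats all convex potential boosters), which is consistent with the paper's use of a branching-program booster for its positive result.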