Communications of the ACM.
R. E. Schapire. The Strength of Weak Learnability. Machine Learning.
Y. Freund. Boosting a weak learning algorithm by majority. COLT '90: Proceedings of the Third Annual Workshop on Computational Learning Theory.
Y. Freund. An improved boosting algorithm and its implications on learning complexity. COLT '92: Proceedings of the Fifth Annual Workshop on Computational Learning Theory.
E. Kushilevitz and Y. Mansour. Learning decision trees using the Fourier spectrum. SIAM Journal on Computing.
M. Kearns and L. Valiant. Cryptographic limitations on learning Boolean formulae and finite automata. Journal of the ACM (JACM).
Y. Freund. Boosting a weak learning algorithm by majority. Information and Computation.
Y. Freund and R. E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences (special issue on STOC '94, May 23-25, 1994, and EuroCOLT '95, March 13-15, 1995).
J. C. Jackson. An efficient membership-query algorithm for learning DNF with respect to the uniform distribution. Journal of Computer and System Sciences.
Y. Freund. An adaptive version of the boost by majority algorithm. COLT '99: Proceedings of the Twelfth Annual Conference on Computational Learning Theory.
N. H. Bshouty, J. C. Jackson, and C. Tamon. More efficient PAC-learning of DNF with membership queries under the uniform distribution. COLT '99: Proceedings of the Twelfth Annual Conference on Computational Learning Theory.
D. Gavinsky. Optimally-Smooth Adaptive Boosting and Application to Agnostic Learning. ALT '02: Proceedings of the 13th International Conference on Algorithmic Learning Theory.
C. Domingo and O. Watanabe. MadaBoost: A Modification of AdaBoost. COLT '00: Proceedings of the Thirteenth Annual Conference on Computational Learning Theory.
FOCS '99: Proceedings of the 40th Annual Symposium on Foundations of Computer Science.
R. Meir and G. Rätsch. An introduction to boosting and leveraging. Advanced Lectures on Machine Learning.
N. García-Pedrajas, C. García-Osorio, and C. Fyfe. Nonlinear Boosting Projections for Ensemble Construction. The Journal of Machine Learning Research.
COLT '05: Proceedings of the 18th Annual Conference on Learning Theory.
We construct a framework that allows an algorithm to turn the distributions produced by certain boosting algorithms into polynomially smooth distributions (with respect to the PAC oracle's distribution), with minimal performance loss.

We then apply the framework to Freund and Schapire's AdaBoost algorithm, bounding the distributions it produces so that they remain polynomially smooth. The main advantage of AdaBoost over other boosting techniques is that it is adaptive: it can take advantage of weak hypotheses that are more accurate than was assumed a priori. We show that our technique preserves, and in fact improves, this adaptiveness.

Our scheme also allows AdaBoost to be executed in the on-line boosting mode, i.e., to perform boosting "by filtering". Executed this way, and now being smooth, AdaBoost can be applied efficiently to a wider range of learning problems than before.

In particular, we demonstrate its application to the task of learning DNF formulas using membership queries. The resulting algorithm chooses the number of boosting iterations adaptively and, consequently, adaptively chooses the size of the final hypothesis it produces. This answers in the affirmative a question posed by Jackson.
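To make the two mechanisms concrete, the following is a minimal sketch, not the paper's exact construction: boosting "by filtering" with AdaBoost-style adaptive weights capped at 1 (a MadaBoost-like smoothing device). Capping keeps every example's weight a valid acceptance probability for rejection sampling, so the filtered distribution never exceeds the oracle's distribution pointwise by more than the inverse of the overall acceptance rate. The interfaces oracle and weak_learner, and all parameter names, are hypothetical placeholders.

    import math
    import random

    def boost_by_filtering(oracle, weak_learner, rounds, sample_size):
        # Sketch of on-line ("by filtering") boosting with capped,
        # AdaBoost-style adaptive weights.  Assumed interfaces (hypothetical):
        #   oracle()        -> (x, y), y in {-1, +1}, drawn from the PAC
        #                      oracle's distribution D
        #   weak_learner(S) -> hypothesis h mapping x to {-1, +1}
        hypotheses, alphas = [], []

        def margin(x, y):
            # Weighted vote of the hypotheses accumulated so far.
            return sum(a * y * h(x) for a, h in zip(alphas, hypotheses))

        def filtered_example():
            # Rejection sampling: draw from D, accept with the capped weight
            # min(1, e^{-margin}).  Since weights never exceed 1, the induced
            # distribution stays smooth w.r.t. D as long as the acceptance
            # rate is not too small.
            while True:
                x, y = oracle()
                if random.random() < min(1.0, math.exp(-margin(x, y))):
                    return x, y

        for _ in range(rounds):
            sample = [filtered_example() for _ in range(sample_size)]
            h = weak_learner(sample)
            err = sum(1 for x, y in sample if h(x) != y) / sample_size
            err = min(max(err, 1e-9), 0.5 - 1e-9)  # keep the logarithm finite
            # Adaptiveness: the vote alpha grows with the observed accuracy
            # of h; no a-priori lower bound on the weak learner's edge is
            # needed.
            alphas.append(0.5 * math.log((1.0 - err) / err))
            hypotheses.append(h)

        # Final hypothesis: sign of the weighted majority vote.
        return lambda x: 1 if sum(
            a * h(x) for a, h in zip(alphas, hypotheses)) >= 0 else -1

A call such as boost_by_filtering(oracle, weak_learner, rounds=50, sample_size=500) never materializes a reweighted sample: each round draws fresh examples through the filter, which is what makes the by-filtering mode suitable for oracle-based settings such as DNF learning with membership queries.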