Communications of the ACM
Learnability and the Vapnik-Chervonenkis dimension
Journal of the ACM (JACM)
The Strength of Weak Learnability
Machine Learning
An improved boosting algorithm and its implications on learning complexity
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Cryptographic limitations on learning Boolean formulae and finite automata
Journal of the ACM (JACM)
Weakly learning DNF and characterizing statistical query learning using Fourier analysis
STOC '94 Proceedings of the twenty-sixth annual ACM symposium on Theory of computing
Toward Efficient Agnostic Learning
Machine Learning - Special issue on computational learning theory, COLT'92
Boosting a weak learning algorithm by majority
Information and Computation
An efficient membership-query algorithm for learning DNF with respect to the uniform distribution
Journal of Computer and System Sciences
An adaptive version of the boost by majority algorithm
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
More efficient PAC-learning of DNF with membership queries under the uniform distribution
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
MadaBoost: A Modification of AdaBoost
COLT '00 Proceedings of the Thirteenth Annual Conference on Computational Learning Theory
On Using Extended Statistical Queries to Avoid Membership Queries
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
On Boosting with Optimal Poly-Bounded Distributions
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
Smooth Boosting and Learning with Malicious Noise
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
Hard-core distributions for somewhat hard problems
FOCS '95 Proceedings of the 36th Annual Symposium on Foundations of Computer Science
Estimation of Dependences Based on Empirical Data (Springer Series in Statistics)
On boosting with polynomially bounded distributions
The Journal of Machine Learning Research
We construct a boosting algorithm which is the first booster that is both smooth and adaptive. These two features make it possible to improve the performance of many learning tasks whose solutions use a boosting technique.

Originally, the boosting approach was suggested for the standard PAC model; we analyze possible applications of boosting in the model of agnostic learning (which is "more realistic" than PAC). We derive a lower bound on the final error achievable by boosting in the agnostic model, and we show that our algorithm actually achieves that accuracy (within a constant factor of 2): when the booster faces distribution D, its final error is bounded above by (1/(1/2 − β)) · err_D(F) + ε, where err_{D′}(F) + β is an upper bound on the error of a hypothesis received from the (agnostic) weak learner when it faces distribution D′, and ε is any real number such that the complexity of the boosting is polynomial in 1/ε. We note that the idea of applying boosting in the agnostic model was first suggested by Ben-David, Long and Mansour; the above accuracy is an exponential improvement with respect to β over their bound of (1/(1/2 − β)) · err_D(F)^{2(1/2 − β)² / ln(1/β − 1)} + ε.

Finally, we construct a boosting "tandem", thus approaching in terms of O the lowest number of boosting iterations possible, as well as in terms of Õ the best possible smoothness. This allows solving adaptively those problems whose solutions are based on smooth boosting (such as noise-tolerant boosting and DNF learning with membership queries), while preserving the original solutions' complexity.
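To illustrate the gap between the two error guarantees stated above, the bound expressions can be evaluated numerically. This is a hedged sketch: the helper names and the parameter values (β = 0.25, err_D(F) = 0.05, ε = 0.01) are illustrative choices, not part of the paper.

```python
import math

def smooth_adaptive_bound(err, beta, eps=0.01):
    # Final error bound claimed for the smooth adaptive booster:
    #   err_D(F) / (1/2 - beta) + eps
    return err / (0.5 - beta) + eps

def blm_bound(err, beta, eps=0.01):
    # Earlier bound attributed to Ben-David, Long and Mansour:
    #   err_D(F)^(2(1/2 - beta)^2 / ln(1/beta - 1)) / (1/2 - beta) + eps
    exponent = 2 * (0.5 - beta) ** 2 / math.log(1 / beta - 1)
    return err ** exponent / (0.5 - beta) + eps

beta, err = 0.25, 0.05  # illustrative values only
print(smooth_adaptive_bound(err, beta))  # linear in err_D(F)
print(blm_bound(err, beta))              # err_D(F) raised to a small power
```

With these values the new bound is small, while the earlier bound exceeds 1 (i.e., is vacuous), because raising a small err_D(F) to the tiny exponent 2(1/2 − β)²/ln(1/β − 1) pushes it toward 1; this is the sense in which the improvement is exponential in β.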