We give a review of various aspects of boosting, clarifying the issues through a few simple results, and relate our work and that of others to the minimax paradigm of statistics. We consider the population version of the boosting algorithm and prove its convergence to the Bayes classifier as a corollary of a general result about Gauss-Southwell optimization in Hilbert space. We then investigate the algorithmic convergence of the sample version, and give bounds on the time until perfect separation of the sample. We conclude with some results on the statistical optimality of L2 boosting.
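For concreteness, here is a minimal sketch of the sample version of L2 boosting in the spirit described above: greedy stagewise minimization of the squared loss, refitting a weak learner to the current residuals at each round, with the final classifier given by the sign of the boosted function. This is an illustrative reconstruction, not the paper's exact procedure; the decision-stump base learner, the shrinkage `nu = 0.1`, and `M = 100` rounds are assumptions made for the example.

```python
import numpy as np

def fit_stump(X, r):
    """Least-squares decision stump: pick one feature and one threshold,
    predict a constant on each side (fitted to the residuals r)."""
    n, d = X.shape
    best, best_err = None, np.inf
    for j in range(d):
        order = np.argsort(X[:, j])
        xs, rs = X[order, j], r[order]
        for i in range(1, n):
            if xs[i] == xs[i - 1]:
                continue  # no threshold between equal values
            left, right = rs[:i], rs[i:]
            cl, cr = left.mean(), right.mean()
            err = ((left - cl) ** 2).sum() + ((right - cr) ** 2).sum()
            if err < best_err:
                best_err = err
                best = (j, (xs[i - 1] + xs[i]) / 2.0, cl, cr)
    return best

def stump_predict(stump, X):
    j, t, cl, cr = stump
    return np.where(X[:, j] <= t, cl, cr)

def l2_boost(X, y, M=100, nu=0.1):
    """L2 boosting: at each round, the residuals y - F are the negative
    gradient of the squared loss, and the weak learner is fit to them."""
    F = np.full(len(y), y.mean())          # start from the constant fit
    stumps = []
    for _ in range(M):
        r = y - F                          # current residuals
        s = fit_stump(X, r)
        F += nu * stump_predict(s, X)      # small greedy step
        stumps.append(s)
    return y.mean(), stumps

def predict(model, X, nu=0.1):
    c0, stumps = model
    F = np.full(X.shape[0], c0)
    for s in stumps:
        F += nu * stump_predict(s, X)
    return np.sign(F)                      # classify by the sign of F
```

With labels y in {-1, +1}, running enough rounds drives the training residuals down and, as the abstract notes for the sample version, eventually separates the sample perfectly; the population analogue of this greedy coordinate-wise step is what the Gauss-Southwell argument analyzes.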