A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th Annual ACM Symposium on the Theory of Computing (STOC '94), May 23–25, 1994, and Second Annual European Conference on Computational Learning Theory (EuroCOLT '95), March 13–15, 1995
Boosting in the limit: maximizing the margin of learned ensembles
AAAI '98/IAAI '98 Proceedings of the fifteenth national/tenth conference on Artificial intelligence/Innovative applications of artificial intelligence
Prediction games and arcing algorithms
Neural Computation
Machine Learning
Linear Programming Boosting via Column Generation
Machine Learning
Logistic Regression, AdaBoost and Bregman Distances
Machine Learning
An introduction to boosting and leveraging
Advanced lectures on machine learning
Boosting as a Regularized Path to a Maximum Margin Classifier
The Journal of Machine Learning Research
Boosting, margins, and dynamics
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 1
Unifying the error-correcting and output-code AdaBoost within the margin framework
ICML '05 Proceedings of the 22nd international conference on Machine learning
Totally corrective boosting algorithms that maximize the margin
ICML '06 Proceedings of the 23rd international conference on Machine learning
Efficient Margin Maximizing with Boosting
The Journal of Machine Learning Research
Parallelizing AdaBoost by weights dynamics
Computational Statistics & Data Analysis
Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework
Pattern Recognition Letters
Ensemble learning for free with evolutionary algorithms?
Proceedings of the 9th annual conference on Genetic and evolutionary computation
Increasing the Robustness of Boosting Algorithms within the Linear-programming Framework
Journal of VLSI Signal Processing Systems
Prototype classification: Insights from machine learning
Neural Computation
Rotation-based model trees for classification
International Journal of Data Analysis Techniques and Strategies
Discriminant subspace analysis: an adaptive approach for image classification
IEEE Transactions on Multimedia
Margin-based Ranking and an Equivalence between AdaBoost and RankBoost
The Journal of Machine Learning Research
A Refined Margin Analysis for Boosting Algorithms via Equilibrium Margin
The Journal of Machine Learning Research
Optimizing linear discriminant error correcting output codes using particle swarm optimization
ICANN'11 Proceedings of the 21st international conference on Artificial neural networks - Volume Part II
Margin-Based ranking meets boosting in the middle
COLT'05 Proceedings of the 18th annual conference on Learning Theory
Classifier ensemble recommendation
ECCV'12 Proceedings of the 12th international conference on Computer Vision - Volume Part I
GA-Ensemble: a genetic algorithm for robust ensembles
Computational Statistics
The rate of convergence of AdaBoost
The Journal of Machine Learning Research
To study the convergence properties of the AdaBoost algorithm, we reduce AdaBoost to a nonlinear iterated map and study the evolution of its weight vectors. This dynamical-systems approach allows us to understand AdaBoost's convergence properties completely in certain cases; for these cases we find stable cycles, allowing us to solve explicitly for AdaBoost's output. Using this unusual technique, we are able to show that AdaBoost does not always converge to a maximum margin combined classifier, answering an open question. In addition, we show that "non-optimal" AdaBoost (where the weak learning algorithm does not necessarily choose the best weak classifier at each iteration) may fail to converge to a maximum margin classifier, even when "optimal" AdaBoost produces one. We also show that if AdaBoost cycles, it cycles among "support vectors", i.e., examples that all achieve the same smallest margin.
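The iterated map described in the abstract can be sketched concretely. The following is a minimal illustration, not the paper's implementation: `M[i, j]` holds +1 if weak classifier `j` correctly classifies example `i` and -1 otherwise (a hypothetical 3-example, 3-classifier setup where each classifier errs on exactly one example), and one AdaBoost round is a map on the simplex of example weights.

```python
import numpy as np

# Illustrative toy matrix (not from the paper): classifier j is
# wrong only on example j, so M[i, j] = y_i * h_j(x_i) in {-1, +1}.
M = np.array([[-1,  1,  1],
              [ 1, -1,  1],
              [ 1,  1, -1]], dtype=float)

def adaboost_map(d, M):
    """One round of 'optimal' AdaBoost viewed as a map on the weight vector d."""
    edges = M.T @ d                       # edge r_j = sum_i d_i * M[i, j]
    j = int(np.argmax(edges))             # optimal weak learner: largest edge
    r = edges[j]
    alpha = 0.5 * np.log((1 + r) / (1 - r))
    d_new = d * np.exp(-alpha * M[:, j])  # up-weight what j got wrong
    return d_new / d_new.sum(), j, alpha  # renormalize onto the simplex

d = np.full(3, 1.0 / 3.0)                 # uniform initial weights
alphas = np.zeros(3)
for _ in range(12):
    d, j, a = adaboost_map(d, M)
    alphas[j] += a

# Margin of each example under the normalized combined classifier.
margins = (M @ alphas) / alphas.sum()
print(np.round(d, 4), np.round(margins, 4))
```

Iterating the map and watching `d` (rather than only the combined classifier) is the viewpoint the abstract describes; in this symmetric toy case the chosen classifier rotates among the three columns and the weight vector settles toward a cycle, with all training margins positive.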