The design and analysis of efficient learning algorithms
C4.5: programs for machine learning
Boosting a weak learning algorithm by majority
Information and Computation
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences
Boosting in the limit: maximizing the margin of learned ensembles
AAAI '98/IAAI '98 Proceedings of the fifteenth national/tenth conference on Artificial intelligence/Innovative applications of artificial intelligence
Additive models, boosting, and inference for generalized divergences
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
Boosting as entropy projection
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning
Machine Learning
Sparse Regression Ensembles in Infinite and Finite Hypothesis Spaces
Machine Learning
Maximizing the Margin with Boosting
COLT '02 Proceedings of the 15th Annual Conference on Computational Learning Theory
A Column Generation Algorithm For Boosting
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Multiple kernel learning, conic duality, and the SMO algorithm
ICML '04 Proceedings of the twenty-first international conference on Machine learning
The Dynamics of AdaBoost: Cyclic Behavior and Convergence of Margins
The Journal of Machine Learning Research
A statistical framework for genomic data fusion
Bioinformatics
Learning interpretable SVMs for biological sequence classification
RECOMB'05 Proceedings of the 9th Annual international conference on Research in Computational Molecular Biology
Totally corrective boosting algorithms that maximize the margin
ICML '06 Proceedings of the 23rd international conference on Machine learning
Large Scale Multiple Kernel Learning
The Journal of Machine Learning Research
ALT '08 Proceedings of the 19th international conference on Algorithmic Learning Theory
Prototype classification: Insights from machine learning
Neural Computation
A Learning Algorithm of Boosting Kernel Discriminant Analysis for Pattern Recognition
IEICE - Transactions on Information and Systems
Ensembles of partially trained SVMs with multiplicative updates
IJCAI'07 Proceedings of the 20th international joint conference on Artificial intelligence
Boosting through optimization of margin distributions
IEEE Transactions on Neural Networks
Handling missing features with boosting algorithms for protein-protein interaction prediction
DILS'10 Proceedings of the 7th international conference on Data integration in the life sciences
AdaBoost classifiers for pecan defect classification
Computers and Electronics in Agriculture
A Refined Margin Analysis for Boosting Algorithms via Equilibrium Margin
The Journal of Machine Learning Research
Approximate reduction from AUC maximization to 1-norm soft margin optimization
ALT'11 Proceedings of the 22nd international conference on Algorithmic learning theory
Solving semi-infinite linear programs using boosting-like methods
ALT'06 Proceedings of the 17th international conference on Algorithmic Learning Theory
Margin optimization based pruning for random forest
Neurocomputing
A theory of multiclass boosting
The Journal of Machine Learning Research
Multi-instance learning with any hypothesis class
The Journal of Machine Learning Research
Smoothed emphasis for boosting ensembles
IWANN'13 Proceedings of the 12th international conference on Artificial Neural Networks: advances in computational intelligence - Volume Part I
Fully corrective boosting with arbitrary loss and regularization
Neural Networks
The rate of convergence of AdaBoost
The Journal of Machine Learning Research
AdaBoost produces a linear combination of base hypotheses and predicts with the sign of this linear combination. The linear combination may be viewed as a hyperplane in feature space, where the base hypotheses form the features. It has been observed that the generalization error of the algorithm continues to improve even after all examples are on the correct side of the current hyperplane. The improvement is attributed to the experimental observation that the distances (margins) of the examples to the separating hyperplane keep increasing even after all examples are correctly classified.

We introduce a new version of AdaBoost, called AdaBoost*ν, that explicitly maximizes the minimum margin of the examples up to a given precision. The algorithm incorporates a current estimate of the achievable margin into its calculation of the linear coefficients of the base hypotheses. The bound on the number of iterations needed by the new algorithm matches the bound for a known version of AdaBoost that must receive an explicit estimate of the achievable margin as a parameter. We also show experimentally that our algorithm requires considerably fewer iterations than other algorithms that aim to maximize the margin.
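The margin quantity the abstract refers to can be made concrete with a small sketch of plain AdaBoost (not the paper's AdaBoost*ν) on decision stumps. The function names and the toy dataset below are illustrative, not from the paper; the normalized margin of example i is y_i·f(x_i) divided by the sum of the hypothesis coefficients, so it lies in [-1, 1] and is positive exactly when the example is on the correct side of the hyperplane.

```python
import numpy as np

def stump_predict(x, thresh, sign):
    """Axis-aligned decision stump on a 1-D feature: sign if x > thresh, else -sign."""
    return sign * np.where(x > thresh, 1.0, -1.0)

def adaboost(x, y, n_rounds=10):
    """Plain AdaBoost with threshold stumps; returns a list of (thresh, sign, alpha)."""
    n = len(x)
    w = np.full(n, 1.0 / n)                        # distribution over examples
    xs = np.sort(x)
    candidates = np.concatenate(([xs[0] - 1.0], (xs[:-1] + xs[1:]) / 2.0))
    model = []
    for _ in range(n_rounds):
        best = None
        for t in candidates:                       # exhaustive weak-learner search
            for s in (1.0, -1.0):
                err = np.sum(w[stump_predict(x, t, s) != y])
                if best is None or err < best[0]:
                    best = (err, t, s)
        err, t, s = best
        err = min(max(err, 1e-12), 1.0 - 1e-12)    # clamp away from 0 and 1
        alpha = 0.5 * np.log((1.0 - err) / err)    # AdaBoost's hypothesis coefficient
        w = w * np.exp(-alpha * y * stump_predict(x, t, s))
        w = w / w.sum()                            # exponential reweighting step
        model.append((t, s, alpha))
    return model

def margins(model, x, y):
    """Normalized margins y_i * f(x_i) / ||alpha||_1, each in [-1, 1]."""
    f = sum(a * stump_predict(x, t, s) for t, s, a in model)
    return y * f / sum(a for _, _, a in model)

# Toy 1-D data, separable by a single stump near 2.5.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([-1.0, -1.0, -1.0, 1.0, 1.0, 1.0])
model = adaboost(x, y)
m = margins(model, x, y)
```

On this separable toy set every normalized margin is positive after training; AdaBoost*ν differs from the plain version above in that it adjusts the coefficients using a running estimate of the best achievable minimum margin.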