We consider boosting algorithms that maintain a distribution over a set of examples. At each iteration a weak hypothesis is received and the distribution is updated. We motivate these updates as minimizing a relative entropy subject to linear constraints. For example, AdaBoost constrains the edge of the last hypothesis w.r.t. the updated distribution to be at most γ = 0. In this sense, AdaBoost is "corrective" w.r.t. the last hypothesis. A cleaner boosting method is to be "totally corrective": the edges of all past hypotheses are constrained to be at most γ, where γ is suitably adapted. Using new techniques, we prove the same iteration bounds for the totally corrective algorithms as for their corrective versions. Moreover, with adaptive γ, the algorithms provably maximize the margin. Experimentally, the totally corrective versions return smaller convex combinations of weak hypotheses than the corrective ones, and are competitive with LPBoost, a totally corrective boosting algorithm with no regularization, for which no iteration bound is known.
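The abstract compresses a concrete optimization problem: project the current distribution onto the intersection of the probability simplex and the edge constraints of all past hypotheses by minimizing a relative entropy. Below is a minimal sketch of one such totally corrective update, assuming a fixed γ and using SciPy's generic SLSQP solver rather than the specialized techniques the paper develops. The names `edges`, `gamma`, and `totally_corrective_update` are illustrative, and the edge of hypothesis t w.r.t. d is taken to be Σᵢ dᵢ yᵢ hₜ(xᵢ).

```python
# A minimal sketch (not the authors' implementation) of a "totally
# corrective" distribution update: minimize the relative entropy
# KL(d || d1) to the initial uniform distribution d1, subject to the
# constraint that the edge of EVERY past hypothesis is at most gamma.
import numpy as np
from scipy.optimize import minimize

def totally_corrective_update(edges, gamma):
    """edges: T x n matrix with edges[t, i] = y_i * h_t(x_i).
    Returns the updated distribution d over the n examples."""
    T, n = edges.shape
    d1 = np.full(n, 1.0 / n)  # uniform reference distribution

    def rel_entropy(d):  # KL(d || d1), clipped away from log(0)
        d = np.clip(d, 1e-12, None)
        return float(np.sum(d * np.log(d / d1)))

    constraints = (
        # probability simplex: sum_i d_i = 1 (nonnegativity via bounds)
        {"type": "eq",   "fun": lambda d: np.sum(d) - 1.0},
        # edge constraints: edges @ d <= gamma for all past hypotheses
        {"type": "ineq", "fun": lambda d: gamma - edges @ d},
    )
    res = minimize(rel_entropy, d1, bounds=[(0.0, 1.0)] * n,
                   constraints=constraints, method="SLSQP")
    return res.x

# Toy usage: 2 past hypotheses on 5 examples, edge cap gamma = 0.1.
U = np.array([[ 1., -1.,  1., -1.,  1.],
              [-1.,  1.,  1., -1., -1.]])
d = totally_corrective_update(U, gamma=0.1)
print(np.round(d, 3), U @ d)  # all entries of U @ d should be <= 0.1
```

In the paper γ is adapted across iterations rather than fixed, and a purely corrective method would impose only the last row of the constraint matrix; the sketch above imposes all rows at once, which is the sense in which the update is "totally" corrective.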