A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th Annual ACM Symposium on the Theory of Computing (STOC '94), May 23–25, 1994, and Second Annual European Conference on Computational Learning Theory (EuroCOLT '95), March 13–15, 1995
Boosting as entropy projection
COLT '99: Proceedings of the Twelfth Annual Conference on Computational Learning Theory
Multiclass learning, boosting, and error-correcting codes
COLT '99: Proceedings of the Twelfth Annual Conference on Computational Learning Theory
Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
Linear Programming Boosting via Column Generation
Machine Learning
Using output codes to boost multiclass learning problems
ICML '97: Proceedings of the Fourteenth International Conference on Machine Learning
International Journal of Computer Vision - Special Issue on Content-Based Image Retrieval
Robust Real-Time Face Detection
International Journal of Computer Vision
Convex Optimization
Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework
Pattern Recognition Letters
Solving multiclass learning problems via error-correcting output codes
Journal of Artificial Intelligence Research
On the Dual Formulation of Boosting Algorithms
IEEE Transactions on Pattern Analysis and Machine Intelligence
We propose totally-corrective multi-class boosting algorithms in this work. First, we discuss methods that extend two-class boosting to the multi-class case by studying two existing boosting algorithms, AdaBoost.MO and SAMME, and formulate convex optimization problems that minimize their regularized cost functions. We then derive the Lagrange dual problems and, based on them, propose a column-generation based totally-corrective framework for multi-class boosting. Experimental results on UCI datasets show that the new algorithms have comparable generalization capability but converge much faster than their counterparts. Experiments on MNIST handwritten digit classification also demonstrate the effectiveness of the proposed algorithms.
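The abstract's central idea — generate weak-learner "columns" one at a time by checking the most violated dual constraint, then re-optimize *all* coefficients rather than only the newest one — can be illustrated with a minimal binary-classification sketch. This is an assumption-laden toy (decision stumps, exponential loss, a projected-gradient inner solver in place of a proper convex solver), not the paper's multi-class algorithm; the function names `make_stumps` and `tc_boost` are hypothetical.

```python
import numpy as np

def make_stumps(X):
    """Enumerate candidate decision stumps as a matrix H, where H[i, j]
    is the {-1, +1} prediction of stump j on training sample i.
    Both polarities of each (feature, threshold) pair are included."""
    cols = []
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            p = np.where(X[:, f] <= thr, 1.0, -1.0)
            cols.append(p)
            cols.append(-p)
    return np.column_stack(cols)

def tc_boost(X, y, n_rounds=10, lr=0.1, inner_steps=200):
    """Totally-corrective boosting via column generation (illustrative
    sketch only). Each round:
      1. form sample weights u_i ~ exp(-y_i F(x_i)) -- the dual variables;
      2. add the stump with the largest edge (u * y) . h_j, i.e. the most
         violated constraint of the dual problem;
      3. refit ALL coefficients jointly by projected gradient descent on
         the exponential loss with nonnegativity constraints (the
         totally-corrective step; a real implementation would call a
         convex solver here).
    Returns the chosen column indices, their coefficients, and H."""
    H = make_stumps(X)
    chosen, alpha = [], np.zeros(0)
    for _ in range(n_rounds):
        F = H[:, chosen] @ alpha if chosen else np.zeros(len(y))
        u = np.exp(-y * F)
        u /= u.sum()
        edges = (u * y) @ H                 # edge of every candidate column
        j = int(np.argmax(edges))
        if j in chosen or edges[j] <= 1e-9:
            break                           # no violated constraint: stop
        chosen.append(j)
        alpha = np.append(alpha, 0.0)
        A = H[:, chosen]
        for _ in range(inner_steps):        # totally-corrective refit
            m = np.exp(-y * (A @ alpha))
            grad = -(A * (y * m)[:, None]).sum(axis=0) / len(y)
            alpha = np.maximum(alpha - lr * grad, 0.0)
    return chosen, alpha, H
```

The contrast with stage-wise AdaBoost is step 3: because every coefficient is re-optimized after each new column, far fewer rounds are typically needed to reach a given training margin, which is the faster convergence the abstract reports.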