A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th Annual ACM Symposium on the Theory of Computing (STOC '94), May 23–25, 1994, and Second Annual European Conference on Computational Learning Theory (EuroCOLT '95), March 13–15, 1995
Boosting in the limit: maximizing the margin of learned ensembles
AAAI '98/IAAI '98 Proceedings of the fifteenth national/tenth conference on Artificial intelligence/Innovative applications of artificial intelligence
Multiclass learning, boosting, and error-correcting codes
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
Prediction games and arcing algorithms
Neural Computation
Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
Using output codes to boost multiclass learning problems
ICML '97 Proceedings of the Fourteenth International Conference on Machine Learning
On the Learnability and Design of Output Codes for Multiclass Problems
COLT '00 Proceedings of the Thirteenth Annual Conference on Computational Learning Theory
Reducing multiclass to binary: a unifying approach for margin classifiers
The Journal of Machine Learning Research
The Dynamics of AdaBoost: Cyclic Behavior and Convergence of Margins
The Journal of Machine Learning Research
Backpropagation applied to handwritten zip code recognition
Neural Computation
Solving multiclass learning problems via error-correcting output codes
Journal of Artificial Intelligence Research
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 1
Boosting One-Class Support Vector Machines for Multi-Class Classification
Applied Artificial Intelligence
An Information Theoretic Perspective on Multiple Classifier Systems
MCS '09 Proceedings of the 8th International Workshop on Multiple Classifier Systems
Totally-corrective multi-class boosting
ACCV'10 Proceedings of the 10th Asian conference on Computer vision - Volume Part IV
Coarse-to-fine multiclass learning and classification for time-critical domains
Pattern Recognition Letters
The multi-class AdaBoost algorithms AdaBoost.MO, AdaBoost.ECC, and AdaBoost.OC have received great attention in the literature, but their relationships have not been fully examined to date. In this paper, we present a novel interpretation of the three algorithms, showing that MO and ECC perform stage-wise functional gradient descent on a cost function defined over margin values, and that OC is a shrinkage version of ECC. This allows us to rigorously explain the properties of ECC and OC that were empirically observed in prior work. The interpretation also leads us to introduce shrinkage as regularization in MO and ECC, and thus to derive two new algorithms: SMO and SECC. Experiments on diverse databases demonstrate the effectiveness of the proposed algorithms and validate our theoretical findings.
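The shrinkage idea the abstract refers to can be illustrated on plain binary AdaBoost: at each round, only a fraction nu of the computed weak-learner weight is taken, which slows the descent on the exponential margin cost and acts as a regularizer. The sketch below is a minimal illustration of that mechanism with decision stumps on a toy 1-D dataset; the dataset, stump learner, and the value of nu are illustrative assumptions, and this is not the paper's SMO/SECC multi-class algorithms.

```python
import math

# Toy 1-D dataset: points with labels in {-1, +1}.
X = [0.5, 1.5, 2.5, 3.5, 4.5, 5.5]
y = [+1, +1, +1, -1, -1, -1]

def best_stump(X, y, w):
    """Pick the threshold/polarity stump minimizing weighted error."""
    best = None
    for thr in sorted(X):
        for pol in (+1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if pol * (1 if xi < thr else -1) != yi)
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost_shrunk(X, y, rounds=10, nu=0.5):
    """AdaBoost with shrinkage factor nu in (0, 1]; nu = 1 is plain AdaBoost."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []  # list of (coefficient, threshold, polarity)
    for _ in range(rounds):
        err, thr, pol = best_stump(X, y, w)
        err = max(err, 1e-12)
        if err >= 0.5:
            break
        alpha = 0.5 * math.log((1 - err) / err)
        coef = nu * alpha  # shrinkage: take only a fraction of the full step
        ensemble.append((coef, thr, pol))
        # Reweight examples along the shrunken step and renormalize.
        w = [wi * math.exp(-coef * yi * pol * (1 if xi < thr else -1))
             for xi, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(c * p * (1 if x < t else -1) for c, t, p in ensemble)
    return +1 if score >= 0 else -1

model = adaboost_shrunk(X, y)
print([predict(model, xi) for xi in X])  # → [1, 1, 1, -1, -1, -1]
```

With nu < 1 the ensemble needs more rounds to fit the data, but the margin distribution evolves more smoothly, which is the regularizing effect the paper attributes to OC relative to ECC.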