C4.5: Programs for Machine Learning. Morgan Kaufmann.
Boosting in the limit: maximizing the margin of learned ensembles. In AAAI '98/IAAI '98: Proceedings of the Fifteenth National Conference on Artificial Intelligence / Tenth Conference on Innovative Applications of Artificial Intelligence.
Multiclass learning, boosting, and error-correcting codes. In COLT '99: Proceedings of the Twelfth Annual Conference on Computational Learning Theory.
Prediction games and arcing algorithms. Neural Computation.
Using output codes to boost multiclass learning problems. In ICML '97: Proceedings of the Fourteenth International Conference on Machine Learning.
On the learnability and design of output codes for multiclass problems. In COLT '00: Proceedings of the Thirteenth Annual Conference on Computational Learning Theory.
Reducing multiclass to binary: a unifying approach for margin classifiers. Journal of Machine Learning Research.
The dynamics of AdaBoost: cyclic behavior and convergence of margins. Journal of Machine Learning Research.
Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research.
Multiclass boosting with repartitioning. In ICML '06: Proceedings of the 23rd International Conference on Machine Learning.
Iterative RELIEF for feature weighting: algorithms, theories, and applications. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Fusion of systems for automated cell phenotype image classification. Expert Systems with Applications.
Similarity-margin based feature selection for symbolic interval data. Pattern Recognition Letters.
In this paper, we present a new interpretation of AdaBoost.ECC and AdaBoost.OC. We show that AdaBoost.ECC performs stage-wise functional gradient descent on a cost function defined over margin values, and that AdaBoost.OC is a shrinkage version of AdaBoost.ECC. These findings rigorously explain several properties of the two algorithms. The gradient-minimization formulation of AdaBoost.ECC allows us to derive a new algorithm, referred to as AdaBoost.SECC, by explicitly exploiting shrinkage as regularization in AdaBoost.ECC. Experiments on diverse datasets confirm our theoretical findings. Empirical results show that AdaBoost.SECC performs significantly better than AdaBoost.ECC and AdaBoost.OC.
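Although the paper's derivation concerns multiclass output-coding, the core mechanism the abstract identifies, stage-wise functional gradient descent with shrinkage as regularization, can be sketched in a generic binary setting. The Python sketch below is illustrative only, not the authors' AdaBoost.SECC: it minimizes the exponential margin cost sum_i exp(-y_i * F(x_i)) with decision stumps, and the shrinkage factor nu (a name chosen here for illustration, not the paper's notation) scales each step.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def boosted_margin_descent(X, y, n_rounds=50, nu=0.1):
        """Stage-wise functional gradient descent on the exponential
        margin cost, with shrinkage factor nu (illustrative sketch,
        not the paper's AdaBoost.SECC; labels y must be in {-1, +1})."""
        F = np.zeros(len(y))                 # current ensemble scores F(x_i)
        ensemble = []
        for _ in range(n_rounds):
            w = np.exp(-y * F)               # example weights track the cost gradient
            w /= w.sum()
            stump = DecisionTreeClassifier(max_depth=1)
            stump.fit(X, y, sample_weight=w) # fit weak hypothesis to weighted data
            pred = stump.predict(X)          # weak hypothesis output in {-1, +1}
            err = np.clip(w[pred != y].sum(), 1e-10, 1.0 - 1e-10)
            alpha = 0.5 * np.log((1.0 - err) / err)  # AdaBoost line-search step size
            F += nu * alpha * pred           # shrunken gradient step; nu=1 means no shrinkage
            ensemble.append((nu * alpha, stump))
        return ensemble

    def predict(ensemble, X):
        # Sign of the weighted vote of all weak hypotheses.
        return np.sign(sum(a * h.predict(X) for a, h in ensemble))

With nu = 1 this reduces to a standard unshrunk update; smaller values of nu slow the reduction of the training cost in exchange for regularization, which is the effect the abstract attributes to shrinkage in AdaBoost.SECC.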