Grove, A. J., and Schuurmans, D. Boosting in the limit: maximizing the margin of learned ensembles. AAAI '98/IAAI '98: Proceedings of the Fifteenth National Conference on Artificial Intelligence.
Jiang, W. Some theoretical aspects of boosting in the presence of noisy data. ICML '01: Proceedings of the Eighteenth International Conference on Machine Learning.
Schapire, R. E. Using output codes to boost multiclass learning problems. ICML '97: Proceedings of the Fourteenth International Conference on Machine Learning.
Freund, Y., and Schapire, R. E. A decision-theoretic generalization of on-line learning and an application to boosting. EuroCOLT '95: Proceedings of the Second European Conference on Computational Learning Theory.
Dietterich, T. G., and Bakiri, G. Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research.
AdaBoost.OC has been shown to be an effective method for boosting "weak" binary classifiers in multi-class learning. It employs the Error-Correcting Output Code (ECOC) method to convert a multi-class learning problem into a set of binary classification problems, and applies the AdaBoost algorithm to solve them efficiently. In this paper, we propose a new boosting algorithm that improves AdaBoost.OC in two respects: 1) it introduces a smoothing mechanism into the boosting procedure to alleviate the potential overfitting problem of the AdaBoost algorithm, and 2) it introduces a probabilistic coding scheme that generates binary codes for the multiple classes such that training errors are reduced efficiently. Empirical studies on seven UCI datasets indicate that the proposed boosting algorithm is more robust and effective than AdaBoost.OC for multi-class learning.
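To make the AdaBoost.OC reduction concrete, the following is a minimal sketch of the scheme the abstract describes: each round draws a coloring that splits the classes into two groups, trains a binary weak learner (a decision stump here) on the relabeled data, and decodes by alpha-weighted agreement with each class's coloring. This is an illustrative simplification, not the paper's algorithm: colorings are drawn at random rather than optimized, the pseudo-loss machinery is replaced by plain weighted error, and the proposed smoothing mechanism and probabilistic coding scheme are not implemented. All function names (`stump_train`, `adaboost_oc`, etc.) are ours.

```python
import numpy as np

def stump_train(X, y, w):
    # Exhaustively search one-feature threshold stumps for binary labels
    # y in {0, 1}, minimizing the weighted error under weights w.
    best = (0, 0.0, 1, np.inf)  # (feature, threshold, polarity, error)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = stump_predict(X, (j, thr, pol))
                err = np.sum(w * (pred != y))
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best[:3]

def stump_predict(X, stump):
    j, thr, pol = stump
    side = X[:, j] >= thr
    return (side if pol == 1 else ~side).astype(int)

def adaboost_oc(X, y, n_classes, rounds=10, rng=None):
    # Simplified AdaBoost.OC loop: random colorings + AdaBoost reweighting.
    rng = rng if rng is not None else np.random.default_rng(0)
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        # Coloring mu: classes -> {0, 1}. AdaBoost.OC chooses colorings to
        # maximize the weighted split; a random coloring is a simplification.
        mu = rng.integers(0, 2, size=n_classes)
        if mu.min() == mu.max():
            mu[0] ^= 1  # ensure both colors occur
        yb = mu[y]  # relabeled binary targets for this round
        stump = stump_train(X, yb, w)
        pred = stump_predict(X, stump)
        err = max(np.sum(w * (pred != yb)), 1e-10)
        if err >= 0.5:
            continue  # weak learner failed on this coloring; skip the round
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, mu, stump))
        # Standard AdaBoost reweighting: up-weight misclassified examples.
        w *= np.exp(alpha * np.where(pred != yb, 1.0, -1.0))
        w /= w.sum()
    return ensemble

def predict(X, ensemble, n_classes):
    # Hamming-style decoding: score each class by the alpha-weighted number
    # of rounds in which the weak learner agrees with that class's color.
    scores = np.zeros((len(X), n_classes))
    for alpha, mu, stump in ensemble:
        pred = stump_predict(X, stump)
        scores += alpha * (pred[:, None] == mu[None, :])
    return scores.argmax(axis=1)
```

On well-separated toy data (e.g. three Gaussian clusters), a few dozen rounds of this loop suffice for the decoded ensemble to fit the training set, even though each stump only answers a binary "which color group?" question.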