AdaBoost.OC has been shown to be an effective method for boosting "weak" binary classifiers in multi-class learning. It employs the Error-Correcting Output Code (ECOC) method to convert a multi-class learning problem into a set of binary classification problems, and applies the AdaBoost algorithm to solve them efficiently. A main drawback of AdaBoost.OC is that it is sensitive to noisy examples and tends to overfit them during training. In this paper, we propose a new boosting algorithm, named "MSmoothBoost", which introduces a smoothing mechanism into the boosting procedure to explicitly address the overfitting problem of AdaBoost.OC. We prove bounds on both the empirical training error and the margin training error of the proposed algorithm. Empirical studies on seven UCI datasets and one real-world application indicate that the proposed boosting algorithm is more robust and effective than AdaBoost.OC for multi-class learning.
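The two ingredients described above — an ECOC reduction from multi-class to binary, and a smoothing cap that keeps any single (possibly noisy) example from dominating the boosting distribution — can be illustrated with a minimal sketch. This is not the paper's actual MSmoothBoost algorithm; the function names, the random code construction, and the specific capped exponential weighting are illustrative assumptions.

```python
import math
import random


def ecoc_code_matrix(n_classes, n_bits, seed=0):
    """Random +/-1 code matrix: one codeword (row) per class.
    Illustrative only; practical ECOC designs maximize row and
    column separation rather than drawing bits at random."""
    rng = random.Random(seed)
    return [[rng.choice([-1, 1]) for _ in range(n_bits)]
            for _ in range(n_classes)]


def decode(bit_preds, code_matrix):
    """Assign the class whose codeword agrees most with the
    vector of binary predictions (maximum inner product)."""
    scores = [sum(b * c for b, c in zip(bit_preds, row))
              for row in code_matrix]
    return max(range(len(code_matrix)), key=lambda k: scores[k])


def smoothed_distribution(margins, cap):
    """AdaBoost-style exponential weights exp(-margin), truncated
    at `cap` before normalization: an example with a very negative
    margin (often a mislabeled point) cannot absorb all the weight."""
    raw = [min(math.exp(-m), cap) for m in margins]
    z = sum(raw)
    return [w / z for w in raw]
```

For example, with margins `[5.0, 0.0, -10.0]` the uncapped exponential distribution concentrates almost all weight on the third example (weight `exp(10)` before normalization), while a cap of 10 keeps its share strictly smaller, which is the intuition behind the smoothing mechanism.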