A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th Annual ACM Symposium on the Theory of Computing (STOC'94), May 23–25, 1994, and Second Annual European Conference on Computational Learning Theory (EuroCOLT'95), March 13–15, 1995
Boosting interval based literals
Intelligent Data Analysis
The WEKA data mining software: an update
ACM SIGKDD Explorations Newsletter
Improvements to AdaBoost Dynamic
Canadian AI'12: Proceedings of the 25th Canadian Conference on Advances in Artificial Intelligence
This paper introduces AdaBoost Dynamic, an extension of the AdaBoost.M1 algorithm of Freund and Schapire. In this extension, a different "weak" classifier may be chosen in each iteration of the algorithm, instead of AdaBoost's single fixed base classifier. The algorithm is tested on a range of datasets from the UCI repository, and the results show that it performs as well as AdaBoost run with the best possible base learner for each dataset. This result therefore relieves the machine learning analyst of having to decide in advance which base classifier to use.
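To make the selection mechanism concrete, the following is a minimal sketch of the idea described in the abstract: an AdaBoost.M1-style loop that trains several candidate weak learners in each round and keeps the one with the lowest weighted error. The candidate pool, function names, and stopping details here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch (not the paper's code) of per-round weak-learner
# selection inside an AdaBoost.M1-style boosting loop.
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import Perceptron

def boost_dynamic(X, y, candidates, n_rounds=50):
    """Return a list of (alpha, model) pairs, re-selecting the learner per round."""
    y = np.asarray(y)
    w = np.full(len(y), 1.0 / len(y))        # example weights, initially uniform
    ensemble = []
    for _ in range(n_rounds):
        best = None                           # (weighted error, model, predictions)
        for proto in candidates:
            model = clone(proto).fit(X, y, sample_weight=w)
            pred = model.predict(X)
            err = w[pred != y].sum()          # weighted training error
            if best is None or err < best[0]:
                best = (err, model, pred)
        err, model, pred = best
        if err >= 0.5:                        # AdaBoost.M1 stopping condition
            break
        err = max(err, 1e-10)                 # avoid log(0) for a perfect learner
        beta = err / (1.0 - err)
        ensemble.append((np.log(1.0 / beta), model))
        w[pred == y] *= beta                  # down-weight correctly classified examples
        w /= w.sum()                          # renormalize to a distribution
    return ensemble

def predict(ensemble, X, classes):
    """Weighted-vote prediction over the boosted ensemble."""
    votes = np.zeros((len(X), len(classes)))
    for alpha, model in ensemble:
        pred = model.predict(X)
        for j, c in enumerate(classes):
            votes[pred == c, j] += alpha
    return np.asarray(classes)[votes.argmax(axis=1)]

# Example candidate pool: a decision stump, naive Bayes, and a perceptron.
candidates = [DecisionTreeClassifier(max_depth=1), GaussianNB(), Perceptron()]
```

Because each round's learner is chosen by the same weighted-error criterion that AdaBoost.M1 already computes, the selection step in this sketch adds only a constant factor (the size of the candidate pool) to the training cost per round.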