This paper proposes an algorithm that integrates AdaBoost.M1 with decision trees that output confidence-rated predictions. This is done by transforming the decision trees from "expert" models into "specialist" models that may abstain when their confidence falls below 1/2. The confidence is used both to update the instance weights during boosting and to determine the vote weights of the base classifiers at decision time. This makes the algorithm "dynamic" in two respects: (1) for a given test instance, only the base classifiers whose confidence exceeds 1/2 take part in the vote; and (2) the vote weight of each base classifier depends on the confidence it assigns to the target instance. Experiments with the C4.5 decision tree learner as the base learning algorithm show that the proposed algorithm significantly outperforms both C4.5 itself and AdaBoost.M1 applied to C4.5 trees with simple (non-confidence-rated) predictions.
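The scheme described above can be sketched as a small boosting loop in Python. This is an illustrative sketch only, not the paper's exact algorithm: the toy 1-D stump learner `make_stump`, the fixed 0.8 confidence it reports, and the choice to count abstentions as half-errors are all assumptions made here for a runnable example. What it does reflect from the abstract is the "specialist" behavior (base classifiers abstain when confidence is below 1/2), confidence-scaled weight updates, and the dynamic, confidence-weighted vote at prediction time.

```python
import math

def fit(train, learn_base, rounds=10):
    """Boosting with confidence-rated base classifiers (illustrative).

    train: list of (x, y) with y in {-1, +1}.
    learn_base(train, weights) returns h, where h(x) -> (label, conf)
    and conf is in [0, 1]; conf < 0.5 means the classifier abstains.
    """
    n = len(train)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        h = learn_base(train, w)
        # Weighted error; abstentions counted as half-errors
        # (one plausible choice -- the paper may define this differently).
        err = 0.0
        for wi, (x, y) in zip(w, train):
            label, conf = h(x)
            if conf < 0.5:
                err += 0.5 * wi
            elif label != y:
                err += wi
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)
        # Confidence-scaled reweighting: confident mistakes cost more,
        # confident correct predictions are down-weighted more.
        for i, (x, y) in enumerate(train):
            label, conf = h(x)
            s = 1 if label == y else -1
            w[i] *= math.exp(-alpha * s * conf)
        z = sum(w)
        w = [wi / z for wi in w]
        ensemble.append((alpha, h))
    return ensemble

def predict(ensemble, x, labels):
    """Dynamic vote: only classifiers with conf >= 1/2 participate,
    each weighted by alpha * conf on this particular instance."""
    score = {c: 0.0 for c in labels}
    for alpha, h in ensemble:
        label, conf = h(x)
        if conf >= 0.5:
            score[label] += alpha * conf
    return max(score, key=score.get)

def make_stump(train, w):
    """Toy weighted 1-D threshold learner standing in for C4.5."""
    best = None
    for t in sorted(x for x, _ in train):
        for sign in (1, -1):
            err = sum(wi for wi, (x, y) in zip(w, train)
                      if (sign if x > t else -sign) != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    # Report a fixed confidence of 0.8 (assumption for illustration).
    return lambda x: ((sign if x > t else -sign), 0.8)
```

In a full implementation the base learner would be C4.5 and `conf` would come from the class distribution at the leaf reached by the instance, so a single tree could be confident on some instances and abstain on others.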