Improved AdaBoost.M1 of decision trees with confidence-rated predictions

  • Authors:
  • Zhipeng Xie

  • Affiliations:
  • Fudan University, Shanghai

  • Venue:
  • Proceedings of the 2009 ACM Symposium on Applied Computing
  • Year:
  • 2009


Abstract

This paper proposes an algorithm that integrates AdaBoost.M1 with decision trees that output confidence-rated predictions. This is done by transforming decision trees from "expert" models into "specialist" models that may abstain when their confidence is less than 1/2. The confidence is used both to update the instance weights during the boosting process and to determine the vote weights of the base classifiers at decision time. This makes the algorithm "dynamic" in two respects: (1) for a given test instance, only those base classifiers whose confidence exceeds 1/2 vote on the decision; and (2) the vote weight of each base classifier depends on the confidence that the classifier has on the target instance. Experimental results with the C4.5 decision tree learner as the base learning algorithm show that this algorithm significantly outperforms both the base algorithm and AdaBoost.M1 over C4.5 decision trees with simple predictions.
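The abstract describes a boosting loop in which confidence-rated trees abstain below a confidence of 1/2, with confidences driving both the weight updates and a per-instance weighted vote. A minimal sketch of that idea is below; it is an illustration under assumed details (depth-limited scikit-learn trees as the base learner, leaf class frequencies as the confidence measure, and a confidence-scaled AdaBoost.M1-style weight update), not the paper's exact pseudocode.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def boost_with_confidence(X, y, n_rounds=10, threshold=0.5):
    """AdaBoost.M1-style loop with confidence-rated trees that
    abstain (do not contribute) when confidence <= threshold."""
    n = len(X)
    w = np.full(n, 1.0 / n)          # instance weights
    trees = []
    for _ in range(n_rounds):
        tree = DecisionTreeClassifier(max_depth=3).fit(X, y, sample_weight=w)
        proba = tree.predict_proba(X)
        conf = proba.max(axis=1)                       # tree's confidence per instance
        pred = tree.classes_[proba.argmax(axis=1)]
        votes = conf > threshold                       # "specialist" abstains below 1/2
        wrong = (pred != y) & votes
        err = w[wrong].sum() / max(w[votes].sum(), 1e-12)
        trees.append(tree)
        if err >= 0.5 or err == 0.0:                   # standard AdaBoost.M1 stopping cases
            break
        beta = err / (1 - err)
        # down-weight correctly classified instances, scaled here by the
        # tree's confidence on each instance (an assumed variant of the update)
        w[votes & ~wrong] *= beta ** conf[votes & ~wrong]
        w /= w.sum()
    return trees

def predict(trees, X, threshold=0.5):
    """Dynamic vote: for each instance, only trees confident on it vote,
    each weighted by its confidence on that instance."""
    scores = [dict() for _ in range(len(X))]
    for tree in trees:
        proba = tree.predict_proba(X)
        conf = proba.max(axis=1)
        pred = tree.classes_[proba.argmax(axis=1)]
        for i, (c, p) in enumerate(zip(conf, pred)):
            if c > threshold:
                scores[i][p] = scores[i].get(p, 0.0) + c
    out = []
    for i in range(len(X)):
        if scores[i]:
            out.append(max(scores[i], key=scores[i].get))
        else:                                          # all trees abstained: fall back
            out.append(trees[0].predict(X[i:i + 1])[0])
    return np.array(out)
```

The abstention rule is what makes the ensemble "dynamic": the effective committee and its vote weights change from one test instance to the next, rather than being fixed as in standard AdaBoost.M1.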