Multiclass Alternating Decision Trees

  • Authors:
  • Geoffrey Holmes, Bernhard Pfahringer, Richard Kirkby, Eibe Frank, Mark Hall


  • Venue:
  • ECML '02: Proceedings of the 13th European Conference on Machine Learning
  • Year:
  • 2002

Abstract

The alternating decision tree (ADTree) is a successful classification technique that combines decision trees with the predictive accuracy of boosting into a set of interpretable classification rules. The original formulation of the tree induction algorithm restricted attention to binary classification problems. This paper empirically evaluates several wrapper methods for extending the algorithm to the multiclass case by splitting the problem into several two-class problems. Seeking a more natural solution, we then adapt the multiclass LogitBoost and AdaBoost.MH procedures to induce alternating decision trees directly. Experimental results confirm that these procedures are comparable in accuracy to wrapper methods based on the original ADTree formulation, while inducing much smaller trees.
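
The abstract describes two strategies: wrapper methods that split the multiclass problem into several two-class problems, and direct multiclass induction via LogitBoost/AdaBoost.MH. The sketch below illustrates only the first idea, a one-vs-rest wrapper around a binary boosted learner. It is not the paper's implementation: scikit-learn has no ADTree, so AdaBoostClassifier over decision stumps stands in for the binary ADTree learner, and the OneVsRestWrapper class and its names are illustrative assumptions.

```python
# Minimal sketch of the one-vs-rest wrapper strategy (assumption: AdaBoost
# over decision stumps stands in for the binary ADTree learner, which is
# not available in scikit-learn).
import numpy as np
from sklearn.base import clone
from sklearn.ensemble import AdaBoostClassifier


class OneVsRestWrapper:
    def __init__(self, base_learner=None):
        # Any binary classifier exposing decision_function works here.
        self.base_learner = base_learner or AdaBoostClassifier(n_estimators=10)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_ = []
        for c in self.classes_:
            # Relabel the data: current class vs. everything else.
            y_bin = (y == c).astype(int)
            self.models_.append(clone(self.base_learner).fit(X, y_bin))
        return self

    def predict(self, X):
        # Each binary model scores "its own" class; predict the class
        # whose model produces the largest margin.
        margins = np.column_stack(
            [m.decision_function(X) for m in self.models_]
        )
        return self.classes_[np.argmax(margins, axis=1)]


if __name__ == "__main__":
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = OneVsRestWrapper().fit(X_tr, y_tr)
    print("accuracy:", (clf.predict(X_te) == y_te).mean())
```

The paper's preferred alternative avoids this decomposition entirely by adapting LogitBoost and AdaBoost.MH to grow a single multiclass alternating decision tree, which is why the directly induced trees come out much smaller than the k separate trees a wrapper produces.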