Ranking cases with decision trees: a geometric method that preserves intelligibility

  • Authors:
  • Isabelle Alvarez; Stephan Bernard

  • Affiliations:
  • LIP6, Paris VI University, Paris, France and Cemagref, LISC, Aubiere Cedex, France; Cemagref, LISC, Aubiere Cedex, France

  • Venue:
  • IJCAI'05 Proceedings of the 19th international joint conference on Artificial intelligence
  • Year:
  • 2005


Abstract

This paper proposes a new method to rank the cases classified by a decision tree. The method applies a posteriori, without modifying the tree, and does not require additional training cases. It consists in computing the distance of each case to the decision boundary induced by the decision tree and ranking the cases according to this geometric score. When the data are numeric, the method is easy to implement and efficient. The distance-based score is a global assessment, in contrast with other methods that evaluate the score at the level of the leaf. The distance-based score gives good results even with pruned trees, so if the tree is intelligible this property is preserved while the ranking ability is improved. The main reason for the effectiveness of the geometric method is that, in most cases, when the classifier is sufficiently accurate, errors are located near the decision boundary.
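To make the geometric score concrete, the following is a minimal sketch (not the authors' reference implementation) of the idea for numeric data: enumerate the axis-aligned leaf regions of a fitted decision tree, then score each case by its Euclidean distance to the nearest region predicted as a different class, which equals its distance to the decision boundary. It assumes scikit-learn's DecisionTreeClassifier; the function names (`leaf_boxes`, `geometric_scores`) are illustrative, and in practice features would typically be standardized before measuring distances.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def leaf_boxes(clf, n_features):
    """Enumerate (lower, upper, predicted_class) for every leaf region."""
    t = clf.tree_
    boxes = []

    def recurse(node, lower, upper):
        if t.children_left[node] == -1:  # leaf node
            cls = clf.classes_[np.argmax(t.value[node][0])]
            boxes.append((lower.copy(), upper.copy(), cls))
            return
        f, thr = t.feature[node], t.threshold[node]
        # Left child: feature <= threshold
        old = upper[f]
        upper[f] = min(upper[f], thr)
        recurse(t.children_left[node], lower, upper)
        upper[f] = old
        # Right child: feature > threshold
        old = lower[f]
        lower[f] = max(lower[f], thr)
        recurse(t.children_right[node], lower, upper)
        lower[f] = old

    recurse(0, np.full(n_features, -np.inf), np.full(n_features, np.inf))
    return boxes

def distance_to_box(x, lower, upper):
    """Euclidean distance from point x to an axis-aligned box."""
    clamped = np.clip(x, lower, upper)
    return np.linalg.norm(x - clamped)

def geometric_scores(clf, X):
    """Distance of each case to the nearest leaf region of a different class,
    i.e. its distance to the decision boundary induced by the tree."""
    boxes = leaf_boxes(clf, X.shape[1])
    preds = clf.predict(X)
    scores = np.empty(len(X))
    for i, (x, c) in enumerate(zip(X, preds)):
        scores[i] = min(distance_to_box(x, lo, up)
                        for lo, up, cls in boxes if cls != c)
    return scores

# Usage: a larger score means the case is farther from the boundary,
# hence ranked as more reliable.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
    ranking = np.argsort(-geometric_scores(clf, X))  # most reliable cases first
```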