C4.5: programs for machine learning
A Comparative Analysis of Methods for Pruning Decision Trees
IEEE Transactions on Pattern Analysis and Machine Intelligence
Data mining: practical machine learning tools and techniques with Java implementations
Automatic Construction of Decision Trees from Data: A Multi-Disciplinary Survey
Data Mining and Knowledge Discovery
Obtaining calibrated probability estimates from decision trees and naive Bayesian classifiers
ICML '01 Proceedings of the Eighteenth International Conference on Machine Learning
Tree Induction for Probability-Based Ranking
Machine Learning
Describing the Result of a Classifier to the End-User: Geometric-based Sensitivity
Proceedings of the 2010 conference on ECAI 2010: 19th European Conference on Artificial Intelligence
This paper proposes a new method to rank the cases classified by a decision tree. The method is applied a posteriori, without modifying the tree and without using additional training cases. It consists in computing the distance of each case to the decision boundary induced by the decision tree and ranking the cases by this geometric score. When the data are numeric, the method is easy to implement and efficient. The distance-based score is a global assessment, in contrast to other methods that evaluate the score at the level of a single leaf. The distance-based score gives good results even with pruned trees, so if the tree is intelligible this property is preserved while the ranking ability improves. The main reason for the effectiveness of the geometric method is that, in most cases, when the classifier is sufficiently accurate, errors are located near the decision boundary.
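The idea in the abstract can be sketched in a few lines: for an axis-aligned decision tree over numeric features, each leaf is a hyperrectangle, and the distance of a case to the decision boundary equals its distance to the nearest leaf region of the opposite class. The minimal sketch below is an illustration only, not the authors' implementation; the hand-built three-leaf tree over the unit square and the function names are hypothetical.

```python
import math

# Hypothetical 2-D tree over [0,1]^2: root splits on x0 < 0.5; the right
# child splits on x1 < 0.5. Each leaf is a box (low corner, high corner, label).
LEAVES = [
    ((0.0, 0.0), (0.5, 1.0), 0),  # x0 < 0.5            -> class 0
    ((0.5, 0.0), (1.0, 0.5), 0),  # x0 >= 0.5, x1 < 0.5 -> class 0
    ((0.5, 0.5), (1.0, 1.0), 1),  # x0 >= 0.5, x1 >= 0.5 -> class 1
]

def predict(x):
    # Return the label of the first leaf box containing x.
    for lo, hi, label in LEAVES:
        if all(l <= v <= h for v, l, h in zip(x, lo, hi)):
            return label
    raise ValueError("point outside the tree's domain")

def dist_to_box(x, lo, hi):
    # Euclidean distance from x to the axis-aligned box [lo, hi]
    # (zero along any coordinate where x is already inside the box).
    return math.sqrt(sum(max(l - v, 0.0, v - h) ** 2
                         for v, l, h in zip(x, lo, hi)))

def geometric_score(x):
    # Distance to the nearest leaf region of the *other* class; since the
    # decision boundary consists of shared box faces, this is the distance
    # of x to the boundary. Larger score = further from the boundary.
    own = predict(x)
    return min(dist_to_box(x, lo, hi)
               for lo, hi, label in LEAVES if label != own)
```

Ranking the classified cases by `geometric_score` in decreasing order then realizes the paper's a-posteriori ranking: nothing in the tree is modified, and points deep inside a region (where errors are rare, per the abstract's observation) come first.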