In this paper, we address the issue of evaluating decision trees generated from training examples by a learning algorithm. We give a set of performance measures and show how some of them relate to others. We derive results suggesting that the number of leaves in a decision tree is the key measure to minimize: minimizing it will, in a probabilistic sense, improve performance along the other measures as well. Notably, it is expected to produce trees whose error rates are less likely to exceed some acceptable limit. The motivation for deriving such results is two-fold: (1) to better understand what constitutes a good measure of performance, and (2) to provide guidance on which aspects of a decision tree generation algorithm should be changed in order to improve the quality of the trees it generates. The results presented in this paper can serve as the basis for a methodology for formally proving that one decision tree generation algorithm is better than another. This would provide a more satisfactory alternative to the current practice of comparing algorithms purely empirically.
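As a concrete illustration of the complexity measure the abstract argues for, the sketch below (not taken from the paper; the nested-tuple tree representation and the example trees are hypothetical) counts the leaves of a decision tree recursively, so that two trees for the same task can be compared by leaf count:

```python
# Minimal sketch: comparing decision trees by leaf count, the measure the
# abstract suggests minimizing. Trees are hypothetical, encoded as nested
# (attribute, branches-dict) pairs; anything else is a class-label leaf.

def count_leaves(tree):
    """Return the number of leaves in a nested-tuple decision tree."""
    if isinstance(tree, tuple) and len(tree) == 2 and isinstance(tree[1], dict):
        _, branches = tree
        return sum(count_leaves(subtree) for subtree in branches.values())
    return 1  # a class label, i.e. a leaf

# Two hypothetical trees for the same classification task.
tree_a = ("outlook", {
    "sunny": ("humidity", {"high": "no", "normal": "yes"}),
    "overcast": "yes",
    "rain": ("wind", {"strong": "no", "weak": "yes"}),
})
tree_b = ("temperature", {
    "hot": ("outlook", {"sunny": "no", "overcast": "yes", "rain": "no"}),
    "mild": ("humidity", {"high": "no", "normal": "yes"}),
    "cool": "yes",
})

print(count_leaves(tree_a))  # 5
print(count_leaves(tree_b))  # 6
```

Under the paper's results, `tree_a` (5 leaves) would be preferred over `tree_b` (6 leaves), in the probabilistic sense that its error rate is less likely to exceed an acceptable limit.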