We report on a series of experiments in which all decision trees consistent with the training data are constructed. These experiments were run to gain an understanding of the properties of the set of consistent decision trees and of the factors that affect the accuracy of individual trees. In particular, we investigated the relationship between the size of a decision tree consistent with some training data and the accuracy of that tree on test data. The experiments were performed on a massively parallel MasPar computer. The results on several artificial and two real-world problems indicate that, for many of the problems investigated, smaller consistent decision trees are, on average, less accurate than slightly larger consistent trees.
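The methodology can be illustrated at toy scale. The sketch below is not the authors' MasPar implementation; it is a minimal serial version under illustrative assumptions: a tiny synthetic XOR task with three binary features, trees bounded by depth 3, and a brute-force enumeration of every tree in that space. It keeps the trees that classify all training examples correctly (the "consistent" set), then reports average test accuracy grouped by tree size, which is the size-versus-accuracy relationship the experiments examine.

```python
from itertools import product

# Illustrative synthetic task (an assumption, not from the paper):
# target = x0 XOR x1, with x2 irrelevant.
def target(x):
    return x[0] ^ x[1]

train = [((0, 0, 0), 0), ((0, 1, 1), 1), ((1, 0, 0), 1), ((1, 1, 1), 0)]
test = [(x, target(x)) for x in product((0, 1), repeat=3)]

# A tree is either a leaf label 0/1, or a tuple ('split', feature, lo, hi).
def predict(tree, x):
    while isinstance(tree, tuple):
        _, f, lo, hi = tree
        tree = hi if x[f] else lo
    return tree

def size(tree):
    # Node count: leaves count 1, internal nodes 1 + both subtrees.
    if not isinstance(tree, tuple):
        return 1
    return 1 + size(tree[2]) + size(tree[3])

def all_trees(features, depth):
    # Enumerate every tree up to the depth bound; a feature is used
    # at most once per path (repeats would be redundant).
    yield 0
    yield 1
    if depth == 0 or not features:
        return
    for i, f in enumerate(features):
        rest = features[:i] + features[i + 1:]
        for lo in all_trees(rest, depth - 1):
            for hi in all_trees(rest, depth - 1):
                yield ('split', f, lo, hi)

def accuracy(tree, data):
    return sum(predict(tree, x) == y for x, y in data) / len(data)

# Keep only trees consistent with the training data.
consistent = [t for t in all_trees([0, 1, 2], 3)
              if accuracy(t, train) == 1.0]

# Group test accuracies by tree size and average within each group.
by_size = {}
for t in consistent:
    by_size.setdefault(size(t), []).append(accuracy(t, test))

for s in sorted(by_size):
    accs = by_size[s]
    print(f"size {s:2d}: {len(accs):4d} trees, "
          f"mean test accuracy {sum(accs) / len(accs):.3f}")
```

On this XOR task the smallest consistent tree must test both relevant features, so the minimum size is 7 nodes; larger consistent trees differ only in how they handle the irrelevant feature, giving a simple setting in which to compare accuracy across sizes. The paper's experiments do the analogous grouping over much larger spaces, which is why massively parallel hardware was used.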