Inferring decision trees using the minimum description length principle
Information and Computation
C4.5: programs for machine learning
Machine Learning
Bottom-up induction of oblivious read-once decision graphs: strengths and limitations
AAAI '94 Proceedings of the Twelfth National Conference on Artificial Intelligence (vol. 1)
MML clustering of multi-state, Poisson, von Mises circular and Gaussian distributions
Statistics and Computing
Boosting Using Branching Programs
COLT '00 Proceedings of the Thirteenth Annual Conference on Computational Learning Theory
Decision forests with oblique decision trees
MICAI'06 Proceedings of the 5th Mexican International Conference on Artificial Intelligence
MML inference of oblique decision trees
AI'04 Proceedings of the 17th Australian Joint Conference on Advances in Artificial Intelligence
A decision tree is a comprehensible representation that has been widely used in many machine-learning domains. But in the area of supervised learning, decision trees have their limitations; two notable problems are replication and fragmentation. One way of solving these problems is to introduce decision graphs, a generalization of the decision tree, which address these problems by allowing for disjunctions, or joins. While various decision graph systems are available, all of them impose some form of restriction on the representation, often leading either to a new redundancy or to the original redundancy not being removed. In this paper, we propose an unrestricted representation called the decision graph with multi-way joins, which has improved representational power and is able to use training data efficiently. An algorithm to infer these decision graphs with multi-way joins using the Minimum Message Length (MML) principle is also introduced. On both real-world and artificial data with only discrete attributes (including at least five UCI data-sets), and in terms of both "right"/"wrong" classification accuracy and I.J. Good's logarithm-of-probability "bit-costing" predictive accuracy, our novel multi-way join decision graph program significantly out-performs both C4.5 and C5.0. Our program also out-performs the Oliver and Wallace binary-join decision graph program on the only data-set available for comparison.
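The abstract's second evaluation metric, I.J. Good's logarithm-of-probability "bit-costing", charges a classifier the negative base-2 logarithm of the probability it assigned to the true class, so a confident wrong prediction is penalized far more heavily than a hedged one. A minimal sketch of this scoring rule (function names and the example probabilities are illustrative, not from the paper):

```python
import math

def bit_cost(probs, true_class):
    """I.J. Good's logarithmic score in bits: the cost of the
    probability the classifier assigned to the true class.
    `probs` maps class labels to predicted probabilities."""
    return -math.log2(probs[true_class])

def mean_bit_cost(predictions, labels):
    """Average bit-cost over a test set; lower is better.
    A perfect, fully confident classifier scores 0 bits."""
    total = sum(bit_cost(p, y) for p, y in zip(predictions, labels))
    return total / len(labels)

# Illustrative example: assigning P = 0.5 to the true class costs
# exactly 1 bit; P = 0.25 costs 2 bits.
print(bit_cost({"yes": 0.5, "no": 0.5}, "yes"))   # 1.0
print(bit_cost({"yes": 0.25, "no": 0.75}, "yes")) # 2.0
```

Unlike plain "right"/"wrong" accuracy, this metric rewards well-calibrated probability estimates, which is why the paper reports both.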