This paper introduces OC1, a new algorithm for generating multivariate decision trees. Multivariate trees classify examples by testing a linear combination of the features at each non-leaf node of the tree; each such test is equivalent to a hyperplane at an oblique orientation to the feature axes. Because finding an optimal orientation for these hyperplanes is computationally intractable, heuristic methods must be used to produce good trees. This paper explores a new method that combines deterministic and randomized procedures to search for a good tree. Experiments on several real-world data sets demonstrate that the method consistently finds much smaller trees than comparable methods using univariate tests, and that the accuracy of the trees it finds matches or exceeds the best results of other machine learning methods.
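To make the node test concrete, the following is a minimal sketch of how classification with an oblique (multivariate) tree works: each internal node holds a weight vector and bias defining a hyperplane, and an example is routed left or right depending on which side of the hyperplane it falls. The class and function names here are illustrative assumptions, not OC1's actual implementation.

```python
class Leaf:
    """Terminal node carrying a class label."""
    def __init__(self, label):
        self.label = label

class ObliqueNode:
    """Internal node testing a linear combination of the features:
    sum(w[i] * x[i]) + bias <= 0 defines a hyperplane at an oblique
    orientation to the feature axes (names are illustrative, not OC1's)."""
    def __init__(self, weights, bias, left, right):
        self.weights = weights
        self.bias = bias
        self.left = left    # subtree for examples on/below the hyperplane
        self.right = right  # subtree for examples above the hyperplane

def classify(node, x):
    # Descend from the root, applying each node's hyperplane test,
    # until a leaf is reached.
    while isinstance(node, ObliqueNode):
        value = sum(w * xi for w, xi in zip(node.weights, x)) + node.bias
        node = node.left if value <= 0 else node.right
    return node.label

# Example: a single oblique split, x0 + x1 <= 1, separating two classes.
tree = ObliqueNode([1.0, 1.0], -1.0, Leaf("A"), Leaf("B"))
print(classify(tree, [0.2, 0.3]))  # "A": 0.5 - 1.0 <= 0
print(classify(tree, [0.9, 0.8]))  # "B": 1.7 - 1.0 > 0
```

A univariate tree is the special case where each node's weight vector has a single nonzero entry, which is why oblique splits can represent the same decision boundary with far fewer nodes.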