We present methods for learning and pruning oblique decision trees. We propose a new function for evaluating different split rules at each node while growing the decision tree. Unlike the other evaluation functions currently used in the literature (which are all based on some notion of purity of a node), this new evaluation function is based on the concept of degree of linear separability. We adopt a correlation-based optimization technique called the Alopex algorithm (K.P. Unnikrishnan and K.P. Venugopal, 1994) for finding the split rule that optimizes our evaluation function at each node. The algorithm we present is applicable only to 2-class problems. Through empirical studies, we demonstrate that our algorithm learns good compact decision trees. We suggest a representation scheme for oblique decision trees that makes explicit the fact that an oblique decision tree represents each class as a union of convex sets bounded by hyperplanes in the feature space. Using this representation, we present a new pruning technique. Unlike other pruning techniques, which generally replace heuristically selected subtrees of the original tree by leaves, our method can radically restructure the decision tree. Through empirical investigation, we demonstrate the effectiveness of our method.
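For readers unfamiliar with Alopex, the following is a minimal sketch of the correlation-based update it relies on, applied here to a generic objective rather than the paper's split-rule criterion. Every parameter moves by a fixed step ±delta each iteration, and the sign is chosen stochastically from the correlation between the previous parameter change and the previous change in the objective. The annealing scheme (temperature tracked as a running average of the correlation magnitudes) and all parameter values below are illustrative assumptions, not the authors' exact variant.

```python
import numpy as np

def alopex_minimize(f, w0, delta=0.05, steps=2000, seed=0):
    """Sketch of Alopex-style correlation-based minimization of f(w).

    Each coordinate always steps by +/-delta; steps that were followed by a
    decrease in f are likely to be repeated, steps followed by an increase
    are likely to be reversed.
    """
    rng = np.random.default_rng(seed)
    w_prev = np.asarray(w0, dtype=float)
    # Initial move: a random +/-delta perturbation on every coordinate.
    w = w_prev + delta * rng.choice([-1.0, 1.0], size=w_prev.shape)
    f_prev, f_curr = f(w_prev), f(w)
    T = None
    for _ in range(steps):
        # Correlation between the last parameter change and the last
        # change in the objective (the core Alopex quantity).
        C = (w - w_prev) * (f_curr - f_prev)
        # "Temperature" annealed as a running average of |C| (assumed scheme).
        mC = float(np.mean(np.abs(C))) + 1e-12
        T = mC if T is None else 0.9 * T + 0.1 * mC
        # Probability of a +delta step: high when C < 0, i.e. when the
        # positive direction was last associated with a decrease in f.
        p = 1.0 / (1.0 + np.exp(C / T))
        step = np.where(rng.random(w.shape) < p, delta, -delta)
        w_prev, w = w, w + step
        f_prev, f_curr = f_curr, f(w)
    return w, f_curr
```

In the paper's setting, `w` would be the coefficients of a candidate split hyperplane at a node and `f` the (negated) linear-separability-based evaluation function; since Alopex uses only objective values, no gradient of the evaluation function is required.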