Boolean Feature Discovery in Empirical Learning
Machine Learning
An incremental method for finding multivariate splits for decision trees
Proceedings of the Seventh International Conference on Machine Learning (1990)
Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations
Symbolic and Neural Learning Algorithms: An Experimental Comparison
Machine Learning
C4.5: programs for machine learning
Extracting Refined Rules from Knowledge-Based Neural Networks
Machine Learning
Hypothesis-Driven Constructive Induction in AQ17-HCI: A Method and Experiments
Machine Learning - Special issue on evaluating and changing representation
A penalty-function approach for pruning feedforward neural networks
Neural Computation
Neural Networks in Computer Intelligence
Second Order Derivatives for Network Pruning: Optimal Brain Surgeon
Advances in Neural Information Processing Systems 5, [NIPS Conference]
Global Data Analysis and the Fragmentation Problem in Decision Tree Induction
ECML '97 Proceedings of the 9th European Conference on Machine Learning
Chi2: Feature Selection and Discretization of Numeric Attributes
TAI '95 Proceedings of the Seventh International Conference on Tools with Artificial Intelligence
Constructive induction on decision trees
IJCAI'89 Proceedings of the 11th international joint conference on Artificial intelligence - Volume 1
Understanding neural networks via rule extraction
IJCAI'95 Proceedings of the 14th international joint conference on Artificial intelligence - Volume 1
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 1
Feature Extraction for the k-Nearest Neighbour Classifier with Genetic Programming
EuroGP '01 Proceedings of the 4th European Conference on Genetic Programming
Flexible neural tree for pattern recognition
ISNN'06 Proceedings of the Third international conference on Advances in Neural Networks - Volume Part I
Univariate decision trees (UDTs) have inherent problems of replication, repetition, and fragmentation. Multivariate decision trees (MDTs) have been proposed to overcome some of these problems. Close examination of the conventional ways of building MDTs, however, reveals that the fragmentation problem still persists. A novel approach is suggested that minimizes fragmentation by separating hyperplane search from decision tree building. This is achieved through feature transformation: let the initial feature vector be x and the new feature vector after a feature transformation T be y, i.e., y = T(x). We can then obtain an MDT by (1) building a UDT on y, and (2) replacing the new features y at each node with the corresponding combinations of the initial features x. We elaborate on the advantages of this approach, the details of T, and why it is expected to perform well. Experiments are conducted to confirm the analysis, and results are compared with those of C4.5, OC1, and CART.
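The two-step idea above can be illustrated with a minimal sketch. The linear weights defining T below are hypothetical placeholders (the paper derives T from data), and the single-split search stands in for a full UDT builder, which would use an entropy-style criterion such as C4.5's gain ratio rather than raw error counts:

```python
def transform(x, weights):
    """y = T(x): each new feature is a linear combination of the original ones.
    The weight vectors here are illustrative, not the paper's learned T."""
    return [sum(w_i * x_i for w_i, x_i in zip(w, x)) for w in weights]

def best_univariate_split(data, labels):
    """Return (errors, feature index, threshold) for the best axis-parallel
    split, trying both label orientations at each threshold."""
    best = None
    for j in range(len(data[0])):
        for t in sorted({row[j] for row in data}):
            pred = [1 if row[j] > t else 0 for row in data]
            errors = min(
                sum(p != y for p, y in zip(pred, labels)),
                sum(p == y for p, y in zip(pred, labels)),  # flipped orientation
            )
            if best is None or errors < best[0]:
                best = (errors, j, t)
    return best

# Toy data: class 1 iff x0 + x1 > 1. No threshold on x0 or x1 alone separates
# the classes, so a UDT on x must fragment the data across several splits.
X = [[0.2, 0.9], [0.9, 0.2], [0.4, 0.4], [0.6, 0.6], [0.3, 0.5], [0.7, 0.2]]
labels = [1, 1, 0, 1, 0, 0]

weights = [[1.0, 1.0]]                      # hypothetical T: y0 = x0 + x1
Y = [transform(x, weights) for x in X]

errors, j, t = best_univariate_split(Y, labels)
print(errors, j, t)  # the transformed feature admits a perfect single split
```

Step (2) of the procedure would then rewrite the test "y0 > t" at that node back into the multivariate test "x0 + x1 > t" over the initial features, yielding an MDT without ever searching for hyperplanes during tree building.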