As opposed to trees that use a single type of decision node, an omnivariate decision tree contains nodes of different types. We propose using Structural Risk Minimization (SRM) to choose between node types during omnivariate decision tree construction, matching the complexity of each node to the complexity of the data reaching that node. Applying SRM for model selection requires the VC-dimension of the candidate models. In this paper, we first derive the VC-dimension of the univariate model, and estimate the VC-dimensions of all three models (univariate, linear multivariate, and quadratic multivariate) experimentally. Second, we compare SRM with other model selection techniques, including Akaike's Information Criterion (AIC), the Bayesian Information Criterion (BIC), and cross-validation (CV), on standard datasets from the UCI and Delve repositories. We find that SRM induces omnivariate trees with a small percentage of multivariate nodes close to the root, and that these trees generalize as accurately as, or more accurately than, those constructed using the other model selection techniques.
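The node-type selection described above can be sketched in a few lines. This is a minimal illustration, not the paper's exact procedure: it assumes the standard Vapnik-style guaranteed-risk bound (empirical error plus a VC confidence term), and the VC-dimension and error values in the usage example are illustrative placeholders, not the estimates reported in the paper.

```python
import math


def srm_bound(emp_err, h, n, delta=0.05):
    """Vapnik-style guaranteed risk: empirical error plus a VC confidence term.

    emp_err: empirical (training) error of the candidate node model
    h:       (estimated) VC-dimension of the node model
    n:       number of training instances reaching the node
    delta:   confidence parameter of the bound
    """
    confidence = math.sqrt(
        (h * (math.log(2 * n / h) + 1) + math.log(4 / delta)) / n
    )
    return emp_err + confidence


def pick_node_type(candidates, n):
    """Choose the node type minimizing the SRM bound.

    candidates: dict mapping node-type name -> (empirical_error, vc_dimension)
    """
    return min(
        candidates,
        key=lambda name: srm_bound(candidates[name][0], candidates[name][1], n),
    )


# Illustrative numbers only: a more complex node fits the training data
# slightly better, but its larger VC-dimension inflates the bound, so the
# simpler univariate split is preferred at this node.
candidates = {
    "univariate": (0.20, 6),
    "linear": (0.15, 21),
    "quadratic": (0.14, 231),
}
chosen = pick_node_type(candidates, n=1000)
```

With these placeholder values the univariate node wins despite its higher training error, which mirrors the paper's observation that multivariate nodes tend to be selected only near the root, where enough data reaches the node to justify the extra complexity.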