Model trees are tree-based regression models that associate leaves with linear regression models. A new method for the stepwise induction of model trees (SMOTI) has been developed. Its main characteristic is the construction of trees with two types of nodes: regression nodes, which perform only straight-line regression, and splitting nodes, which partition the feature space. In this way, internal regression nodes contribute to the definition of multiple linear models and have a "global" effect, while straight-line regressions at leaves have only "local" effects. This paper addresses the problem of simplifying model trees with both regression and splitting nodes. In particular, two methods, named Reduced Error Pruning (REP) and Reduced Error Grafting (REG), are proposed. Both are characterized by the use of an independent pruning set. The effect of the simplification on model trees induced with SMOTI is empirically investigated. Results are in favour of simplified trees in most cases.
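To make the pruning idea concrete, the following is a minimal sketch of reduced error pruning on an independent pruning set, applied bottom-up to a plain regression tree. All names here (`Node`, `rep`, `sse`) are illustrative, not SMOTI's actual data structures: SMOTI's regression/splitting node distinction and the REG grafting operator are not modelled, leaves predict a constant rather than a linear model, and, for simplicity, the candidate leaf predicts the pruning-set mean (a faithful REP implementation would keep the value fitted on the training data).

```python
# Hypothetical sketch of Reduced Error Pruning (REP) with an
# independent pruning set; not the SMOTI implementation.

class Node:
    def __init__(self, feature=None, threshold=None,
                 left=None, right=None, value=None):
        self.feature = feature      # split feature index (internal nodes)
        self.threshold = threshold  # split threshold (internal nodes)
        self.left, self.right = left, right
        self.value = value          # constant prediction (leaves); SMOTI
                                    # leaves would hold linear models instead

    def is_leaf(self):
        return self.left is None and self.right is None


def predict(node, x):
    while not node.is_leaf():
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.value


def sse(node, X, y):
    """Sum of squared errors of the (sub)tree on a data set."""
    return sum((predict(node, xi) - yi) ** 2 for xi, yi in zip(X, y))


def rep(node, X, y):
    """Bottom-up REP: replace a subtree with a leaf whenever the error
    on the independent pruning set (X, y) does not increase."""
    if node.is_leaf() or not y:
        return node
    # Partition the pruning set and simplify the children first.
    left_idx = [i for i, xi in enumerate(X) if xi[node.feature] <= node.threshold]
    right_idx = [i for i in range(len(X)) if i not in left_idx]
    node.left = rep(node.left, [X[i] for i in left_idx], [y[i] for i in left_idx])
    node.right = rep(node.right, [X[i] for i in right_idx], [y[i] for i in right_idx])
    # Candidate leaf (simplification: mean of pruning examples reaching here).
    leaf = Node(value=sum(y) / len(y))
    if sse(leaf, X, y) <= sse(node, X, y):
        return leaf
    return node
```

As a usage example, a subtree whose two leaves predict 2.0 and 2.1 is collapsed to a single leaf when the pruning examples reaching it all have target 2.0, since the replacement does not increase the pruning-set error.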