The design of algorithms that explore multiple representation languages and different search spaces has an intuitive appeal. In the context of classification problems, algorithms that generate multivariate trees explore multiple representation languages by using decision tests based on combinations of attributes. The same applies to model tree algorithms in regression domains, which use linear models at leaf nodes. In this paper we study where to use combinations of attributes in regression and classification tree learning. We present an algorithm for multivariate tree learning that combines a univariate decision tree with a linear function by means of constructive induction. The algorithm can use decision nodes with multivariate tests, as well as leaf nodes that make predictions using linear functions. Multivariate decision nodes are built while growing the tree, while functional leaves are built while pruning it. The algorithm has been implemented for both classification and regression problems. The experimental evaluation shows that it has clear advantages in generalization ability over its components and two simplified versions, and that it competes well with the state of the art in multivariate regression and classification trees.
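To make the tree structure described above concrete, the following is a minimal sketch (not the paper's implementation) of how prediction works in such a multivariate tree: internal nodes may test a linear combination of attributes, and leaves may predict with a linear function rather than a constant. All class names, weights, and coefficients are illustrative assumptions.

```python
# Hypothetical sketch of prediction in a functional (multivariate) tree:
# internal nodes test a linear combination of attributes against a threshold,
# and leaves predict with a fitted linear model instead of a constant value.
from dataclasses import dataclass
from typing import Sequence, Union


@dataclass
class Leaf:
    """Functional leaf: predicts with a linear model of the attributes."""
    intercept: float
    coefs: Sequence[float]

    def predict(self, x: Sequence[float]) -> float:
        return self.intercept + sum(c * v for c, v in zip(self.coefs, x))


@dataclass
class Node:
    """Multivariate decision node: routes on a linear test w . x <= threshold."""
    weights: Sequence[float]
    threshold: float
    left: Union["Node", Leaf]
    right: Union["Node", Leaf]

    def predict(self, x: Sequence[float]) -> float:
        score = sum(w * v for w, v in zip(self.weights, x))
        branch = self.left if score <= self.threshold else self.right
        return branch.predict(x)


# Toy tree: split on the combination x0 + 0.5*x1 <= 1.0,
# with a different linear model fitted at each leaf.
tree = Node(
    weights=[1.0, 0.5], threshold=1.0,
    left=Leaf(intercept=0.0, coefs=[2.0, 0.0]),
    right=Leaf(intercept=1.0, coefs=[0.0, 3.0]),
)

print(tree.predict([0.4, 0.2]))  # routed left:  0 + 2*0.4 = 0.8
print(tree.predict([2.0, 1.0]))  # routed right: 1 + 3*1.0 = 4.0
```

In a learned tree, the node weights would come from the multivariate tests built while growing the tree, and the leaf models from the linear functions fitted while pruning, as the abstract describes; here they are hard-coded only to show the prediction path.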