Methods for learning decision rules are successfully applied in many problem domains, in particular when understanding and interpretation of the learned model is necessary. In many real-life problems, we would like to predict multiple related (nominal or numeric) target attributes simultaneously. While several methods for learning rules that predict multiple targets at once exist, they are all based on the covering algorithm, which does not work well for regression problems. A better solution for regression is the rule ensemble approach, which transcribes an ensemble of decision trees into a large collection of rules. An optimization procedure is then used to select the best (and much smaller) subset of these rules and to determine their respective weights. We introduce the FIRE algorithm for solving multi-target regression problems, which employs the rule ensemble approach. We improve the accuracy of the algorithm by adding simple linear functions to the ensemble. We also extensively evaluate the algorithm with and without linear functions. The results show that the accuracy of multi-target regression rule ensembles is high. They are more accurate than, for instance, multi-target regression trees, but not quite as accurate as multi-target random forests. The rule ensembles are significantly more concise than random forests, and it is also possible to create compact rule sets that are smaller than a single regression tree but still comparable in accuracy.
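The core idea can be illustrated in miniature: treat each rule (a conjunction of attribute tests, as read off a tree path) as a binary feature, optionally append the raw attributes as the linear terms mentioned above, and fit sparse weights so that unhelpful rules are dropped. The hand-written rules, toy data, and lasso-style coordinate descent below are illustrative assumptions, not the exact optimization used by FIRE:

```python
import random

# A rule is a conjunction of simple tests on the input vector x.
# Each test is (attribute index, threshold, greater-than?).
def make_rule(tests):
    return lambda x: all((x[i] > t) if gt else (x[i] <= t)
                         for i, t, gt in tests)

# Hypothetical rules, as would be transcribed from shallow tree paths.
rules = [
    make_rule([(0, 0.5, True)]),                   # x0 > 0.5
    make_rule([(0, 0.5, False)]),                  # x0 <= 0.5
    make_rule([(1, 0.3, True), (0, 0.2, True)]),   # x1 > 0.3 and x0 > 0.2
]

def features(x):
    # Rule activations plus the raw attributes (the simple linear
    # functions added to the ensemble), all as candidate regressors.
    return [1.0 if r(x) else 0.0 for r in rules] + list(x)

def fit(X, y, lam=0.01, iters=200):
    # Lasso-style coordinate descent: soft-thresholding drives the
    # weights of unhelpful rules to exactly zero, leaving a small
    # weighted subset of the original rule collection.
    Z = [features(x) for x in X]
    p, n = len(Z[0]), len(y)
    w = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            resid = [y[i] - sum(w[k] * Z[i][k] for k in range(p) if k != j)
                     for i in range(n)]
            num = sum(Z[i][j] * resid[i] for i in range(n))
            den = sum(Z[i][j] ** 2 for i in range(n)) or 1.0
            raw = num / den
            w[j] = max(abs(raw) - lam, 0.0) * (1.0 if raw > 0 else -1.0)
    return w

def predict(w, x):
    return sum(wj * zj for wj, zj in zip(w, features(x)))
```

On toy data generated as `y = 2*(x0 > 0.5) + x1`, the fitted weights concentrate on the first rule and the second linear term, and the remaining coefficients shrink toward zero; this single-target sketch omits the multi-target aspect, where one rule set is optimized jointly for all targets.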