We consider learning tasks where multiple target variables need to be predicted. Two approaches have been used in this setting: (a) build a separate single-target model for each target variable, and (b) build a multi-target model that predicts all targets simultaneously; the latter may exploit potential dependencies among the targets. For a given target, either (a) or (b) can yield the more accurate model. Thus, exploiting the information available in other targets can help accuracy as well as hurt it. This raises the question of whether it is possible to find, for a given target (we call this the main target), the best subset of the other targets (the support targets) that, when combined with the main target in a multi-target model, results in the most accurate model for the main target. We propose Empirical Asymmetric Selective Transfer (EAST), a generally applicable algorithm that approximates such a subset. Applied to decision trees, EAST outperforms single-target decision trees, multi-target decision trees, and multi-target decision trees with target clustering.
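The abstract does not spell out how EAST searches the space of support-target subsets. One plausible realization of the idea, sketched here as an assumption rather than the authors' exact procedure, is a greedy forward search: starting from the main target alone, repeatedly add the support target whose inclusion in a multi-target tree most reduces the cross-validated error on the main target. The function name `east_like_selection` and all parameters below are illustrative, not from the paper.

```python
# Hedged sketch of an EAST-style greedy forward search over support targets.
# NOT the paper's exact algorithm: the paper only states that EAST
# approximates the best support-target subset for a given main target.
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.tree import DecisionTreeRegressor

def east_like_selection(X, Y, main, cv=5, random_state=0):
    """X: (n_samples, n_features); Y: (n_samples, n_targets).
    Returns (sorted list of chosen support-target indices, main-target MSE)."""

    def main_error(support):
        # Fit a multi-target tree on the main target plus the candidate
        # support targets; score only the main target's predictions.
        targets = [main] + sorted(support)
        model = DecisionTreeRegressor(random_state=random_state)
        pred = cross_val_predict(model, X, Y[:, targets], cv=cv)
        pred = pred.reshape(len(X), -1)  # single-target case returns 1-D
        return float(np.mean((pred[:, 0] - Y[:, main]) ** 2))

    chosen = set()
    best = main_error(chosen)  # baseline: single-target model
    candidates = [t for t in range(Y.shape[1]) if t != main]
    improved = True
    while improved:
        improved = False
        best_t = None
        for t in (c for c in candidates if c not in chosen):
            err = main_error(chosen | {t})
            if err < best:
                best, best_t, improved = err, t, True
        if improved:
            chosen.add(best_t)  # keep the single best addition this round
    return sorted(chosen), best
```

On data where one other target shares structure with the main target and a third is pure noise, such a search tends to admit the related target and reject the noise target, which mirrors the paper's observation that other targets can be beneficial or detrimental depending on the main target.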