Multi-domain learning: when do domains matter?
EMNLP-CoNLL '12 Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning
We learn multiple hypotheses for related tasks under a latent hierarchical relationship between the tasks. We exploit the intuition that domain adaptation calls for sharing classifier structure, whereas multitask learning calls for sharing covariance structure. The resulting hierarchical model subsumes several previously proposed multitask learning models and performs well on three distinct real-world data sets.
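The flavor of hierarchical sharing described above can be illustrated with a minimal sketch (not the paper's actual model): each task's regression weights are shrunk toward a shared latent parent, and the parent is re-estimated from the tasks. The data sizes, shrinkage strength `lam`, and the depth-one hierarchy are all illustrative assumptions; the paper's latent hierarchy is richer than this.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic setup: 3 related regression tasks whose true
# weight vectors are noisy copies of a shared parent vector (a latent
# hierarchy of depth one, for illustration only).
d, n = 5, 40
w_parent = rng.normal(size=d)
tasks = []
for _ in range(3):
    w_t = w_parent + 0.1 * rng.normal(size=d)
    X = rng.normal(size=(n, d))
    y = X @ w_t + 0.05 * rng.normal(size=n)
    tasks.append((X, y))

def fit_hierarchical(tasks, lam=1.0, iters=20):
    """Alternate between per-task ridge solutions shrunk toward a
    shared parent and re-estimating the parent as their mean."""
    d = tasks[0][0].shape[1]
    w0 = np.zeros(d)
    for _ in range(iters):
        ws = []
        for X, y in tasks:
            # Per-task ridge centered at the current parent w0:
            # minimize ||y - X w||^2 + lam * ||w - w0||^2
            A = X.T @ X + lam * np.eye(d)
            b = X.T @ y + lam * w0
            ws.append(np.linalg.solve(A, b))
        w0 = np.mean(ws, axis=0)  # parent = mean of child weights
    return w0, ws

w0_hat, ws_hat = fit_hierarchical(tasks)
recovery_error = np.linalg.norm(w0_hat - w_parent)
```

Because the per-task estimates are coupled only through the parent, each task still gets its own hypothesis while statistical strength is pooled across tasks, which is the intuition the abstract appeals to.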