Multi-task learning aims to improve the performance of one learning task with the help of other related tasks. It is particularly useful when each task has very limited labeled data. A central issue in multi-task learning is learning and exploiting the relationships between tasks. In this paper, we generalize boosting to the multi-task setting and propose a method called multi-task boosting (MTBoost). Different tasks in MTBoost share the same base learners but with different weights, which reflect the task relationships estimated in each iteration. Unlike ordinary boosting methods, MTBoost learns the base learners, weights, and task covariances together in an integrated fashion using an alternating optimization procedure. We provide a theoretical analysis of the convergence of MTBoost and an empirical comparison with several related methods.
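The abstract describes the overall scheme but not its exact objective. The following is a minimal illustrative sketch of the general idea — shared base learners (here, decision stumps), per-task combination weights, and a task-similarity matrix re-estimated between rounds — not the paper's actual MTBoost algorithm; all function names, the AdaBoost-style weight update, and the covariance estimate from stacked per-task coefficients are assumptions for illustration:

```python
import numpy as np

def fit_stump(X, y, sample_w):
    """Fit a weighted decision stump: best (feature, threshold, sign)."""
    best = (np.inf, 0, 0.0, 1)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] <= t, s, -s)
                err = np.sum(sample_w * (pred != y))
                if err < best[0]:
                    best = (err, j, t, s)
    return best[1:]

def stump_predict(stump, X):
    j, t, s = stump
    return np.where(X[:, j] <= t, s, -s)

def mtboost_sketch(tasks, n_rounds=10):
    """tasks: list of (X, y) pairs with y in {-1, +1}.
    Each round fits ONE stump shared by all tasks, then gives each task
    its own coefficient; a task-similarity matrix (a crude stand-in for
    the paper's task covariance) re-weights how tasks are pooled."""
    T = len(tasks)
    sim = np.ones((T, T)) / T                      # initial task similarity
    sample_w = [np.ones(len(y)) / len(y) for _, y in tasks]
    ensemble = []                                  # (stump, per-task alphas)
    for _ in range(n_rounds):
        # pool samples, weighting each task by its mean similarity to others
        Xp = np.vstack([X for X, _ in tasks])
        yp = np.concatenate([y for _, y in tasks])
        wp = np.concatenate([sim[t].mean() * sample_w[t] for t in range(T)])
        stump = fit_stump(Xp, yp, wp / wp.sum())
        alphas = np.zeros(T)
        for t, (X, y) in enumerate(tasks):
            pred = stump_predict(stump, X)
            err = np.clip(np.sum(sample_w[t] * (pred != y)), 1e-10, 1 - 1e-10)
            alphas[t] = 0.5 * np.log((1 - err) / err)   # AdaBoost-style weight
            sample_w[t] *= np.exp(-alphas[t] * y * pred)
            sample_w[t] /= sample_w[t].sum()
        ensemble.append((stump, alphas))
        # re-estimate task similarity from the per-task coefficient vectors
        A = np.array([a for _, a in ensemble])      # rounds x tasks
        G = A.T @ A
        sim = np.abs(G) / (np.sqrt(np.outer(np.diag(G), np.diag(G))) + 1e-12)
    return ensemble

def mtboost_predict(ensemble, X, task):
    """Task-specific prediction: same stumps, task's own coefficients."""
    score = sum(a[task] * stump_predict(s, X) for s, a in ensemble)
    return np.sign(score)
```

The alternation mirrors the abstract's description at a cartoon level: with task similarity fixed, base learners and coefficients are fit; with the ensemble fixed, similarity is re-estimated from how the tasks weight the shared learners.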