In multi-task learning, the goal is to design regression or classification models for each task while appropriately sharing information between tasks. A Dirichlet process (DP) prior can be used to encourage task clustering. However, the DP prior does not allow local clustering of tasks with respect to a subset of the feature vector without making independence assumptions. Motivated by this problem, we develop a new multi-task learning prior, termed the matrix stick-breaking process (MSBP), which encourages cross-task sharing of data while allowing separate clustering and borrowing of information for the different feature components. This is important when tasks are more closely related for some features than for others. Bayesian inference proceeds by a Gibbs sampling algorithm, and the approach is illustrated using a simulated example and a multi-national application.
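To illustrate the stick-breaking representation that the MSBP generalizes, the following is a minimal NumPy sketch of the truncated stick-breaking construction of DP mixture weights (the function name, truncation level, and concentration value are illustrative choices, not from the paper; the MSBP extends this idea to a matrix of feature-specific weights shared across tasks):

```python
import numpy as np

def stick_breaking_weights(alpha, num_sticks, seed=None):
    """Draw truncated stick-breaking weights for a DP prior.

    V_k ~ Beta(1, alpha) and pi_k = V_k * prod_{j<k} (1 - V_j),
    so each weight is the fraction V_k of the stick remaining
    after the first k-1 breaks.
    """
    rng = np.random.default_rng(seed)
    v = rng.beta(1.0, alpha, size=num_sticks)
    # Length of stick remaining before each break: 1, (1-V_1), (1-V_1)(1-V_2), ...
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

# Example: with a large truncation level the weights nearly sum to 1,
# and tasks assigned the same mixture component are clustered together.
weights = stick_breaking_weights(alpha=2.0, num_sticks=100, seed=0)
print(weights.shape, float(weights.sum()))
```

A smaller concentration parameter `alpha` puts more mass on the first few sticks, which in the multi-task setting corresponds to a stronger prior tendency for tasks to share a small number of clusters.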