Sparse Gaussian processes for multi-task learning
ECML PKDD '12: Proceedings of the 2012 European Conference on Machine Learning and Knowledge Discovery in Databases, Part I
Recently there has been increasing interest in regression methods that deal with multiple outputs, motivated in part by frameworks such as multitask learning, multisensor networks, and structured output data. From a Gaussian process perspective, the problem reduces to specifying an appropriate covariance function that, while remaining positive semi-definite, captures the dependencies between all the data points and across all the outputs. One approach to accounting for non-trivial correlations between outputs employs convolution processes: under a latent function interpretation of the convolution transform, dependencies between output variables arise through the shared latent process. The main drawbacks of this approach are its computational and storage demands, and in this paper we address these issues. We present efficient approximations for dependent-output Gaussian processes constructed through the convolution formalism, exploiting the conditional independencies present naturally in the model. This leads to a form of the covariance similar in spirit to the so-called PITC and FITC approximations for a single output. We show experimental results with synthetic and real data; in particular, we present results on school exam score prediction, pollution prediction, and gene expression data.
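The construction the abstract refers to can be made concrete with a short sketch. In the convolution formalism (the notation below is illustrative, following the standard multi-output Gaussian process literature rather than this paper's own notation), each output f_d is obtained by convolving a shared latent process u with an output-specific smoothing kernel G_d:

    f_d(x) = \int G_d(x - z) \, u(z) \, dz

Because every output shares the same latent process, the cross-covariance between any two outputs follows directly:

    \mathrm{cov}\left[ f_d(x), f_{d'}(x') \right] = \iint G_d(x - z) \, G_{d'}(x' - z') \, k_u(z, z') \, dz \, dz'

which is positive semi-definite by construction. Conditioning on the latent process at a finite set of inducing inputs renders the outputs approximately independent of one another, and that conditional independence is what PITC- and FITC-style approximations exploit.

As a hedged illustration of the single-output case, the NumPy sketch below builds a FITC-style approximate covariance with a squared-exponential kernel. The names (rbf, fitc_covariance) and all parameter values are invented for this example; the multi-output setting of the paper would replace the kernel with the convolution-induced cross-covariances above, and PITC would replace the diagonal correction with one block per output.

    import numpy as np

    def rbf(X, Z, lengthscale=1.0, variance=1.0):
        # Squared-exponential kernel matrix between the rows of X and Z.
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

    def fitc_covariance(X, Z, noise=1e-2):
        # Low-rank Nystrom term through the inducing inputs Z, plus a
        # diagonal correction that restores the exact prior variances.
        Kuu = rbf(Z, Z) + 1e-8 * np.eye(len(Z))  # inducing covariance (jittered)
        Kfu = rbf(X, Z)                          # data/inducing cross-covariance
        Qff = Kfu @ np.linalg.solve(Kuu, Kfu.T)  # Nystrom approximation of Kff
        kff = rbf(X, X).diagonal()               # exact marginal variances
        return Qff + np.diag(kff - Qff.diagonal()) + noise * np.eye(len(X))

    X = np.random.randn(100, 1)  # training inputs
    Z = X[::10].copy()           # 10 inducing inputs chosen from the data
    K = fitc_covariance(X, Z)    # low-rank-plus-diagonal approximate covariance

For clarity the sketch materialises the full n-by-n matrix; in practice one keeps only the low-rank and diagonal factors, so that inference via the Woodbury identity costs O(n m^2) for m inducing inputs rather than the O(n^3) of an exact Gaussian process.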