Bayesian Multitask Classification With Gaussian Process Priors

  • Authors:
  • Grigorios Skolidis; Guido Sanguinetti

  • Affiliations:
  • School of Informatics, University of Edinburgh, Edinburgh, U.K.; School of Informatics, University of Edinburgh, Edinburgh, U.K.

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2011


Abstract

We present a novel approach to multitask learning in classification problems based on Gaussian process (GP) classification. The method extends previous work on multitask GP regression by constraining the overall covariance (across tasks and data points) to factorize as a Kronecker product. Fully Bayesian inference is possible using sampling techniques, but it is time consuming. We propose approximations based on the popular variational Bayes and expectation propagation frameworks, showing that both achieve accuracy comparable to Gibbs sampling in a fraction of the time. We present results on a toy dataset and two real datasets, showing improved performance over the baseline results obtained by learning each task independently. We also compare with a recently proposed state-of-the-art approach based on support vector machines, obtaining comparable or better results.
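The central structural assumption described in the abstract is that the prior covariance over all latent function values factorizes as a Kronecker product of a task covariance and a data covariance. The sketch below illustrates only that prior structure; the squared-exponential kernel, toy dimensions, random task matrix, and probit link are illustrative assumptions, and none of the paper's variational Bayes or expectation propagation inference is implemented here.

```python
import numpy as np
from scipy.stats import norm

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel over data points (an assumed choice)."""
    sq_dists = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

# Toy setup: M tasks sharing N inputs (all sizes are illustrative).
rng = np.random.default_rng(0)
M, N = 3, 50
X = rng.standard_normal((N, 2))

# Data covariance K_x (N x N) and task covariance K_f (M x M).
K_x = rbf_kernel(X)
A = rng.standard_normal((M, M))
K_f = A @ A.T + 1e-3 * np.eye(M)       # any positive-definite task matrix

# Kronecker-factorized prior covariance over the latent functions of all
# tasks, stacked task-by-task: Cov(f) = K_f (x) K_x, an MN x MN matrix.
K = np.kron(K_f, K_x)

# Draw one sample of the correlated latent functions and pass it through a
# probit link to obtain per-task class probabilities, as in GP classification.
f = rng.multivariate_normal(np.zeros(M * N), K + 1e-6 * np.eye(M * N))
p = norm.cdf(f).reshape(M, N)          # p[m, n] = P(y_mn = 1 | f_mn)
```

In this sketch, information is shared across tasks because the off-diagonal entries of K_f correlate the latent functions of different tasks; learning each task independently would correspond to a diagonal K_f.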