Bayesian Classification With Gaussian Processes

  • Authors: Christopher K. I. Williams, David Barber
  • Affiliations: Univ. of Edinburgh, Edinburgh, Scotland, UK; Univ. of Nijmegen, Nijmegen, The Netherlands
  • Venue: IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year: 1998


Abstract

We consider the problem of assigning an input vector to one of $m$ classes by predicting $P(c\,|\,\mathbf{x})$ for $c = 1, \ldots, m$. For a two-class problem, the probability of class one given $\mathbf{x}$ is estimated by $\sigma(y(\mathbf{x}))$, where $\sigma(y) = 1/(1 + e^{-y})$. A Gaussian process prior is placed on $y(\mathbf{x})$ and is combined with the training data to obtain predictions for new $\mathbf{x}$ points. We provide a Bayesian treatment, integrating over uncertainty in $y$ and in the parameters that control the Gaussian process prior; the necessary integration over $y$ is carried out using Laplace's approximation. The method is generalized to multiclass problems ($m > 2$) using the softmax function. We demonstrate the effectiveness of the method on a number of datasets.
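The sketch below illustrates the two-class setting described in the abstract: a Gaussian process prior on the latent function $y(\mathbf{x})$, a logistic likelihood, and a Laplace (Newton) approximation to find the posterior mode. It is not the authors' code; the RBF kernel, its fixed hyperparameters, and the simple MAP prediction (which omits the full integration over latent uncertainty and hyperparameters discussed in the paper) are illustrative assumptions.

```python
# Minimal sketch of GP binary classification with a Laplace approximation.
# Kernel form and hyperparameters are assumptions; prediction is MAP-style.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of input vectors."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def sigmoid(y):
    return 1.0 / (1.0 + np.exp(-y))

def laplace_mode(K, t, n_iter=20):
    """Newton iterations for the mode f_hat of the latent posterior
    under a logistic likelihood with targets t in {0, 1}."""
    n = len(t)
    f = np.zeros(n)
    for _ in range(n_iter):
        pi = sigmoid(f)
        W = pi * (1.0 - pi)                  # negative log-likelihood Hessian (diagonal)
        b = W * f + (t - pi)
        # Newton step f_new = (K^{-1} + W)^{-1} b, computed as K (I + W K)^{-1} b
        A = np.eye(n) + W[:, None] * K
        f = K @ np.linalg.solve(A, b)
    return f

def predict_proba(X_train, t, X_test, lengthscale=1.0, variance=1.0):
    """Class-one probabilities sigma(y(x*)) at the Laplace mode (MAP prediction)."""
    K = rbf_kernel(X_train, X_train, lengthscale, variance) + 1e-8 * np.eye(len(t))
    f_hat = laplace_mode(K, t)
    pi_hat = sigmoid(f_hat)
    k_star = rbf_kernel(X_test, X_train, lengthscale, variance)
    f_star = k_star @ (t - pi_hat)           # latent mean at test inputs (mode identity)
    return sigmoid(f_star)

# Toy usage: two well-separated Gaussian clusters in 2-D
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(1, 0.5, (20, 2))])
t = np.concatenate([np.zeros(20), np.ones(20)])
print(predict_proba(X, t, np.array([[-1.0, -1.0], [1.0, 1.0]])))
```

A fuller treatment along the lines of the paper would add the latent predictive variance when squashing through the sigmoid and integrate over (or optimize) the kernel hyperparameters rather than fixing them.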