Sparse on-line Gaussian processes

  • Authors:
  • Lehel Csató; Manfred Opper

  • Affiliations:
  • Neural Computing Research Group, Department of Information Engineering, Aston University, B4 7ET Birmingham, U.K. (both authors)

  • Venue:
  • Neural Computation
  • Year:
  • 2002

Abstract

We develop an approach for sparse representations of Gaussian process (GP) models (which are Bayesian types of kernel machines) in order to overcome their limitations for large data sets. The method combines a Bayesian on-line algorithm with a sequential construction of a relevant subsample of the data that fully specifies the prediction of the GP model. By using an appealing parameterization and projection techniques in a reproducing kernel Hilbert space, recursions for the effective parameters and a sparse Gaussian approximation of the posterior process are obtained. This allows both the predictions of the model and Bayesian error measures to be propagated. The significance and robustness of our approach are demonstrated on a variety of experiments.
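
The abstract describes an on-line scheme whose posterior approximation is carried by a small set of basis vectors, with updates either extending that set or being projected onto its span. Below is a minimal sketch of that flavor of algorithm for Gaussian-noise regression; the class name `SparseOnlineGP`, the RBF kernel, the novelty threshold `tol`, and the basis-vector budget `max_bv` are illustrative assumptions, not the paper's exact recursions or notation.

```python
import numpy as np

def rbf(x, y, lengthscale=1.0):
    """Squared-exponential kernel (assumed here; any positive-definite kernel works)."""
    return np.exp(-0.5 * np.sum((np.atleast_1d(x) - np.atleast_1d(y)) ** 2) / lengthscale ** 2)

class SparseOnlineGP:
    """Sketch of a sparse on-line GP for regression with a basis-vector (BV) set.

    The Gaussian posterior approximation is parameterized by a coefficient
    vector `alpha` (mean) and a matrix `C` (covariance correction), both
    defined on the current BV set. A new input is added as a basis vector
    only if its kernel feature lies sufficiently far from the BV span;
    otherwise its contribution is projected onto the existing set.
    """

    def __init__(self, kernel=rbf, noise_var=0.1, tol=1e-3, max_bv=50):
        self.kernel = kernel
        self.noise_var = noise_var    # Gaussian likelihood noise variance (assumed)
        self.tol = tol                # novelty threshold for adding a BV (assumed)
        self.max_bv = max_bv          # hard budget on the BV set size (assumed)
        self.bv = []                  # basis-vector inputs
        self.alpha = np.zeros(0)      # posterior-mean coefficients
        self.C = np.zeros((0, 0))     # posterior-covariance coefficients
        self.Kinv = np.zeros((0, 0))  # inverse Gram matrix of the BV set

    def predict(self, x):
        """Posterior mean and variance of the latent function at x."""
        k = np.array([self.kernel(x, b) for b in self.bv])
        mean = k @ self.alpha
        var = self.kernel(x, x) + k @ self.C @ k
        return mean, var

    def update(self, x, y):
        """Process one observation (x, y) on-line."""
        k = np.array([self.kernel(x, b) for b in self.bv])
        kss = self.kernel(x, x)
        mean = k @ self.alpha
        var = kss + k @ self.C @ k

        # Scalar update coefficients for a Gaussian likelihood.
        q = (y - mean) / (var + self.noise_var)
        r = -1.0 / (var + self.noise_var)

        # Novelty: squared distance of k(x, .) from the span of the BV set.
        gamma = kss - k @ self.Kinv @ k

        if self.bv and (gamma < self.tol or len(self.bv) >= self.max_bv):
            # Reduced update: project the new point onto the existing BV set,
            # keeping the BV set (and the size of alpha, C) unchanged.
            s = self.C @ k + self.Kinv @ k
            self.alpha = self.alpha + q * s
            self.C = self.C + r * np.outer(s, s)
        else:
            # Full update: enlarge the BV set with x and extend alpha, C.
            e = np.zeros(len(self.bv) + 1)
            e[-1] = 1.0
            s = np.concatenate([self.C @ k, [0.0]]) + e
            self.alpha = np.concatenate([self.alpha, [0.0]]) + q * s
            C_new = np.zeros((len(self.bv) + 1, len(self.bv) + 1))
            C_new[:-1, :-1] = self.C
            self.C = C_new + r * np.outer(s, s)
            self.bv.append(np.atleast_1d(x))
            K = np.array([[self.kernel(a, b) for b in self.bv] for a in self.bv])
            self.Kinv = np.linalg.inv(K + 1e-10 * np.eye(len(self.bv)))

# Usage sketch: stream noisy samples of sin(x) through the model.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gp = SparseOnlineGP(noise_var=0.05, tol=1e-2, max_bv=20)
    for _ in range(200):
        x = rng.uniform(-3.0, 3.0)
        gp.update(x, np.sin(x) + 0.2 * rng.standard_normal())
    print(len(gp.bv), gp.predict(1.0))
```

The point of the sketch is the branch on `gamma`: the posterior is always represented by coefficients on the BV set alone, so memory and per-step cost depend on the BV set size rather than on the total number of data points seen.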