Variational Bayesian inference for a nonlinear forward model

  • Authors:
  • Michael A. Chappell; Adrian R. Groves; Brandon Whitcher; Mark W. Woolrich

  • Affiliations:
  • FMRIB Centre, University of Oxford, John Radcliffe Hospital, Headington, Oxford, U.K. (M. A. Chappell, A. R. Groves, M. W. Woolrich); Clinical Imaging Centre, GlaxoSmithKline, Imperial College London, Hammersmith Hospital, London, U.K. (B. Whitcher)

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2009

Abstract

Variational Bayes (VB) has been proposed as a fast method for Bayesian inference in linear models, in which the parameters of a factorized approximation to the posterior distribution are estimated. Here a VB method for nonlinear forward models with additive Gaussian noise is presented. In the case of noninformative priors, the parameter estimates obtained from this VB approach are identical to those found via nonlinear least squares. However, the advantage of the VB method lies in its Bayesian formulation, which permits prior information to be included in a hierarchical structure and measures of uncertainty for all parameter estimates to be obtained via the posterior distribution. Unlike sampling methods such as Markov chain Monte Carlo (MCMC), VB is only approximate; however, the VB results are found to be comparable to those from MCMC, and the assumptions made about the form of the posterior distribution prove reasonable. In practice, the VB approach is substantially faster than MCMC because far fewer calculations are required. Some of the advantages of the fully Bayesian nature of the method are demonstrated through the extension of the noise model and the inclusion of automatic relevance determination (ARD) within the VB algorithm.
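The core idea described in the abstract can be sketched numerically: linearize the nonlinear forward model about the current posterior mean and apply the conjugate Gaussian update, so that with a near-noninformative prior the fixed point coincides with nonlinear least squares. This is a minimal illustration, not the authors' implementation: the monoexponential forward model, parameter values, and fixed (known) noise precision are assumptions, whereas the full VB algorithm also infers the noise precision.

```python
import numpy as np

# Hypothetical forward model g(theta) = a * exp(-b * t)
# (chosen for illustration; the framework applies to generic nonlinear models).
def forward(theta, t):
    a, b = theta
    return a * np.exp(-b * t)

def jacobian(theta, t):
    a, b = theta
    e = np.exp(-b * t)
    # Columns: dg/da and dg/db evaluated at theta
    return np.column_stack([e, -a * t * e])

def linearized_gaussian_update(y, t, theta0, prior_mean, prior_prec,
                               noise_prec, n_iter=100, tol=1e-10):
    """Iterate: linearize the forward model about the current mean, then
    apply the exact Gaussian posterior update for the local linear model.
    With a near-noninformative prior this converges to the nonlinear
    least-squares estimate, as the abstract notes."""
    m = np.asarray(theta0, dtype=float)
    for _ in range(n_iter):
        J = jacobian(m, t)
        r = y - forward(m, t)
        # Posterior precision of the locally linearized model
        Lam = prior_prec + noise_prec * (J.T @ J)
        step = np.linalg.solve(
            Lam, noise_prec * (J.T @ r) - prior_prec @ (m - prior_mean))
        m = m + step
        if np.max(np.abs(step)) < tol:
            break
    # Posterior mean and covariance (uncertainty on the estimates)
    return m, np.linalg.inv(Lam)

# Synthetic data: true parameters a = 2.0, b = 0.7, low noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 50)
y = forward([2.0, 0.7], t) + 0.01 * rng.standard_normal(t.size)

m, cov = linearized_gaussian_update(
    y, t, theta0=[1.5, 0.5],
    prior_mean=np.zeros(2),
    prior_prec=1e-8 * np.eye(2),   # near-noninformative prior
    noise_prec=1.0 / 0.01**2)      # assumed known noise precision
```

The returned covariance is what distinguishes this from plain nonlinear least squares: it provides a measure of uncertainty for every parameter estimate, and an informative `prior_prec` lets prior knowledge shrink the solution toward `prior_mean`.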