Variational cumulant expansions for intractable distributions

  • Authors:
  • David Barber; Piërre van de Laar

  • Affiliations:
  • Real World Computing Partnership, Theoretical Foundation SNN, Foundation for Neural Networks, University of Nijmegen, Nijmegen, The Netherlands (both authors)

  • Venue:
  • Journal of Artificial Intelligence Research
  • Year:
  • 1999

Abstract

Intractable distributions present a common difficulty for inference within the probabilistic knowledge representation framework, and variational methods have recently become popular for providing approximate solutions. In this article, we describe a perturbational approach in the form of a cumulant expansion which, to lowest order, recovers the standard Kullback-Leibler variational bound. Higher-order terms describe corrections to the variational approximation without incurring much further computational cost. The relationship to other perturbational approaches, such as TAP, is also elucidated. We demonstrate the method on a particular class of undirected graphical models, Boltzmann machines, for which our simulation results confirm improved accuracy and enhanced stability during learning.
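
For orientation, the lowest-order statement in the abstract can be made concrete with a standard derivation of this kind (written here in conventional notation; the paper's own notation and details may differ). For a distribution p(x) ∝ e^{-E(x)} with partition function Z and a tractable approximating distribution q(x),

\ln Z \;=\; \ln \Big\langle e^{\,-E(x)-\ln q(x)} \Big\rangle_q \;=\; \sum_{n \ge 1} \frac{\kappa_n}{n!},

where \langle\cdot\rangle_q denotes expectation under q and \kappa_n is the n-th cumulant of -E(x)-\ln q(x) under q. Keeping only \kappa_1 gives, via Jensen's inequality,

\ln Z \;\ge\; \big\langle -E(x) \big\rangle_q + H[q],

with H[q] the entropy of q, which is the standard Kullback-Leibler variational bound; \kappa_2 and higher cumulants supply the corrections mentioned above. For a Boltzmann machine with a factorized q, these cumulants involve only moments of the quadratic energy under q, which is consistent with the abstract's claim that the corrections add little computational cost.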