Error bounds between marginal probabilities and beliefs of loopy belief propagation algorithm

  • Authors:
  • Nobuyuki Taga; Shigeru Mase

  • Affiliations:
  • Tokyo Institute of Technology, Tokyo, Japan; Tokyo Institute of Technology, Tokyo, Japan

  • Venue:
  • MICAI'06: Proceedings of the 5th Mexican International Conference on Artificial Intelligence
  • Year:
  • 2006

Abstract

The belief propagation (BP) algorithm has become an increasingly popular method for probabilistic inference on general graphical models. When a network contains loops, the algorithm is called loopy BP (LBP); it may fail to converge, and even when it does converge, the beliefs it produces may not equal the exact marginal probabilities. Tatikonda and Jordan applied the theory of Gibbs measures to the LBP algorithm and derived a sufficient condition for convergence. In this paper, we use Gibbs measure theory to investigate the discrepancy between a marginal probability and the corresponding belief. In particular, we obtain an error bound that holds whenever the algorithm converges under a certain condition, giving a general result on the accuracy of the algorithm. We also perform numerical experiments to assess the effectiveness of this bound.
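
To illustrate the discrepancy the paper analyzes, the following is a minimal sketch (not the authors' implementation, and not their bound) of sum-product loopy belief propagation on a small pairwise Markov random field with a single loop; the beliefs are compared against exact marginals obtained by brute-force enumeration. The 3-node loop, the potentials, and all variable names are illustrative assumptions.

import itertools
import numpy as np

# Pairwise MRF on a loop of 3 binary variables:
# p(x) proportional to prod over edges (i,j) of psi[(i,j)][x_i, x_j]
edges = [(0, 1), (1, 2), (0, 2)]
rng = np.random.default_rng(0)
psi = {e: np.exp(rng.normal(scale=0.5, size=(2, 2))) for e in edges}
n = 3

def neighbors(i):
    return [b if a == i else a for (a, b) in edges if i in (a, b)]

def potential(i, j):
    # psi is stored with the lower-indexed variable on axis 0
    return psi[(i, j)] if (i, j) in psi else psi[(j, i)].T

# Messages m[(i, j)]: message from node i to node j, initialized uniformly.
messages = {(i, j): np.ones(2) / 2 for i in range(n) for j in neighbors(i)}

for _ in range(100):  # fixed number of sweeps; convergence is not guaranteed on loopy graphs
    new = {}
    for (i, j) in messages:
        # m_{i->j}(x_j) proportional to sum_{x_i} psi_{ij}(x_i, x_j) * prod_{k in N(i)\{j}} m_{k->i}(x_i)
        prod = np.ones(2)
        for k in neighbors(i):
            if k != j:
                prod *= messages[(k, i)]
        msg = potential(i, j).T @ prod
        new[(i, j)] = msg / msg.sum()
    messages = new

# Beliefs: b_i(x_i) proportional to prod_{k in N(i)} m_{k->i}(x_i)
beliefs = []
for i in range(n):
    b = np.ones(2)
    for k in neighbors(i):
        b *= messages[(k, i)]
    beliefs.append(b / b.sum())

# Exact marginals by brute force over all 2^n configurations.
joint = np.zeros((2,) * n)
for x in itertools.product([0, 1], repeat=n):
    joint[x] = np.prod([potential(i, j)[x[i], x[j]] for (i, j) in edges])
joint /= joint.sum()
for i in range(n):
    exact = joint.sum(axis=tuple(a for a in range(n) if a != i))
    print(f"node {i}: belief {beliefs[i].round(4)}, exact {exact.round(4)}, "
          f"max error {np.abs(beliefs[i] - exact).max():.4f}")

The per-node maximum error printed at the end is exactly the quantity the paper's error bound is meant to control; on a single short loop the beliefs are typically close to, but not equal to, the true marginals.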