Belief Propagation and Revision in Networks with Loops
Belief propagation (BP) is effective for computing the marginal probabilities of a high-dimensional probability distribution. On graphs with loops, loopy belief propagation (LBP) neither computes the exact marginal probabilities nor is guaranteed to converge. The fixed points of LBP are known to coincide with the extrema of the Bethe free energy; hence, the fixed points can be analyzed by minimizing the Bethe free energy. In this paper, we consider the Bethe free energy for Gaussian distributions and analytically characterize its extrema, and equivalently the fixed points of LBP, for some particular cases. The analytical results give a necessary condition for the convergence of LBP and identify the quantities that determine the accuracy of LBP in Gaussian distributions. Based on these analytical results, we perform numerical experiments with LBP and compare the outcomes with the analytical solutions.
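The setting described in the abstract can be made concrete with a small numerical sketch. The Python snippet below is not from the paper; the function name `run_gabp`, the 3-node single-loop graph, and the parameter values are illustrative assumptions. It runs the standard Gaussian loopy BP updates in information form on a single loop and compares the resulting marginal means and variances with the exact ones obtained by inverting the precision matrix.

```python
import numpy as np

# Hypothetical demo (not the paper's code): Gaussian loopy BP for
# p(x) proportional to exp(-x^T A x / 2 + b^T x), A the precision matrix.

def run_gabp(A, b, n_iters=100):
    """Gaussian loopy BP in information form.

    Each directed edge i -> j carries a message precision P[i, j] and a
    message potential h[i, j]; marginals are read off after iterating.
    """
    n = len(b)
    P = np.zeros((n, n))
    h = np.zeros((n, n))
    edges = [(i, j) for i in range(n) for j in range(n)
             if i != j and A[i, j] != 0]
    for _ in range(n_iters):
        P_new, h_new = np.zeros_like(P), np.zeros_like(h)
        for i, j in edges:
            # Collect incoming messages at node i, excluding the one from j.
            P_in = A[i, i] + sum(P[k, i] for k, l in edges if l == i and k != j)
            h_in = b[i] + sum(h[k, i] for k, l in edges if l == i and k != j)
            P_new[i, j] = -A[i, j] ** 2 / P_in
            h_new[i, j] = -A[i, j] * h_in / P_in
        P, h = P_new, h_new
    prec = np.array([A[i, i] + sum(P[k, i] for k, l in edges if l == i)
                     for i in range(n)])
    mean = np.array([(b[i] + sum(h[k, i] for k, l in edges if l == i)) / prec[i]
                     for i in range(n)])
    return mean, 1.0 / prec  # approximate marginal means and variances

# A single loop (3-cycle) with a diagonally dominant precision matrix,
# so LBP converges on this example.
A = np.array([[3.0, 0.9, 0.9],
              [0.9, 3.0, 0.9],
              [0.9, 0.9, 3.0]])
b = np.array([1.0, 0.0, -1.0])

mean_bp, var_bp = run_gabp(A, b)
cov = np.linalg.inv(A)
print("LBP means    :", mean_bp, " exact:", cov @ b)
print("LBP variances:", var_bp, " exact:", np.diag(cov))
```

On this example the LBP means agree with the exact means while the LBP variances do not, consistent with the known result for Gaussian graphical models that, when LBP converges, the means are exact but the variances are generally incorrect on loopy graphs.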