In this paper, we pursue the application of Gibbs measure theory to LBP in two ways. First, we show that this theory can be applied directly to LBP on factor graphs, where higher-order potentials are allowed. As a consequence, beliefs are exactly the marginal probabilities of a certain Gibbs measure on a computation tree, and we derive a convergence criterion from this tree. Second, to demonstrate the usefulness of this approach, we apply to LBP two conditions developed in Gibbs measure theory: a well-known general condition and a specialized one. We compare these two criteria with a criterion derived from the best existing result. We show that the specialized condition is stronger than the others, and that the general condition is stronger than the best existing result when the influence of the one-body potentials is sufficiently large. These results encourage the use of Gibbs measure theory in this area.
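To make the object of study concrete, the following is a minimal sketch of sum-product loopy belief propagation on a small loopy factor graph: a cycle of three binary variables with pairwise and one-body potentials. The potentials and the damping-free parallel update schedule are illustrative assumptions, not taken from the paper; the beliefs computed at the fixed point are the approximate marginals that the paper characterizes as exact marginals of a Gibbs measure on the computation tree.

```python
# Minimal loopy belief propagation (sum-product) on a 3-variable binary cycle.
# Potentials are illustrative assumptions, not from the paper.
phi = {i: [1.0, 2.0] for i in range(3)}       # one-body potentials phi_i(x_i)
psi = lambda a, b: 2.0 if a == b else 1.0     # attractive pairwise potential
edges = [(0, 1), (1, 2), (2, 0)]
nbrs = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

# m[(i, j)][x_j]: message from variable i to variable j, initialized uniform
m = {(i, j): [1.0, 1.0] for i, j in edges + [(j, i) for i, j in edges]}

def update(m):
    """One parallel round of sum-product message updates, normalized."""
    new = {}
    for (i, j) in m:
        vals = []
        for xj in (0, 1):
            s = 0.0
            for xi in (0, 1):
                # product of incoming messages to i, excluding the one from j
                prod_in = 1.0
                for k in nbrs[i]:
                    if k != j:
                        prod_in *= m[(k, i)][xi]
                s += phi[i][xi] * psi(xi, xj) * prod_in
            vals.append(s)
        z = sum(vals)
        new[(i, j)] = [v / z for v in vals]
    return new

# Iterate until messages stop changing (convergence is what the paper's
# criteria certify in advance; here we just observe it empirically).
for _ in range(200):
    new = update(m)
    delta = max(abs(new[k][x] - m[k][x]) for k in m for x in (0, 1))
    m = new
    if delta < 1e-10:
        break

def belief(i):
    """Approximate marginal of variable i at the current messages."""
    b = [phi[i][x] for x in (0, 1)]
    for k in nbrs[i]:
        for x in (0, 1):
            b[x] *= m[(k, i)][x]
    z = sum(b)
    return [v / z for v in b]

print([belief(i) for i in range(3)])
```

By symmetry all three beliefs coincide, and each favors state 1 because of the one-body potentials; with stronger pairwise couplings on this cycle, the same updates can oscillate or converge to a poor fixed point, which is exactly the regime the convergence criteria discussed above address.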