Belief propagation (BP) on cyclic graphs is an efficient algorithm for computing approximate marginal probability distributions over single nodes and neighboring nodes in the graph. However, it does not prescribe a way to compute joint distributions over pairs of distant nodes. In this article, we propose two new algorithms for approximating these pairwise probabilities, based on the linear response theorem. The first is a propagation algorithm that is shown to converge if BP converges to a stable fixed point. The second algorithm is based on matrix inversion. Applying these ideas to Gaussian random fields, we derive a propagation algorithm for computing the inverse of a matrix.
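To illustrate the Gaussian-random-field connection, the sketch below runs standard Gaussian belief propagation on a small precision matrix and reads off the diagonal of its inverse from the marginal precisions. This is a minimal illustration under the usual textbook message updates, not a reproduction of the authors' algorithm, and the function name `gabp_inverse_diagonal` is our own; on tree-structured precision matrices (like the chain used here) the result is exact.

```python
import numpy as np

def gabp_inverse_diagonal(J, iters=50):
    """Approximate diag(inv(J)) for a symmetric precision matrix J
    via Gaussian belief propagation (exact when J is tree-structured)."""
    n = J.shape[0]
    # P[i, j] holds the precision message sent from node i to neighbor j
    P = np.zeros((n, n))
    edges = [(i, j) for i in range(n) for j in range(n)
             if i != j and J[i, j] != 0]
    for _ in range(iters):
        P_new = np.zeros((n, n))
        for i, j in edges:
            # combine incoming messages at i, excluding the one from j
            incoming = sum(P[k, i] for k in range(n)
                           if k != i and k != j and J[k, i] != 0)
            P_new[i, j] = -J[i, j] ** 2 / (J[i, i] + incoming)
        P = P_new
    # marginal precision at each node; its reciprocal is the marginal variance,
    # i.e. the corresponding diagonal entry of inv(J)
    marg_prec = np.array([J[i, i] + sum(P[k, i] for k in range(n)
                                        if k != i and J[k, i] != 0)
                          for i in range(n)])
    return 1.0 / marg_prec

# A tree-structured (chain) precision matrix, so GaBP is exact here.
J = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 2.]])
print(gabp_inverse_diagonal(J))   # -> [0.75 1.   0.75]
print(np.diag(np.linalg.inv(J)))  # -> [0.75 1.   0.75]
```

Note that plain Gaussian BP only recovers the diagonal of the inverse; extending it to the full inverse is where the linear response construction described in the abstract comes in.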