Inference problems in graphical models can be represented as constrained optimization of a free-energy function. In this paper we treat both forms of probabilistic inference, estimating marginal probabilities of the joint distribution and finding the most probable assignment, through a unified message-passing algorithm architecture. In particular, we generalize the sum-product and max-product belief propagation (BP) algorithms and the tree-reweighted (TRW) sum-product and max-product algorithms (TRBP), and introduce a new set of convergent algorithms based on "convex free energies" and linear-programming (LP) relaxation as a zero-temperature limit of a convex free energy. The main idea of this work arises from taking a general perspective on the existing BP and TRBP algorithms and observing that they are all reductions of the basic optimization problem of minimizing f + Σi hi, where the function f is extended-valued, strictly convex but nonsmooth, and the functions hi are extended-valued (not necessarily convex). We use tools from convex duality to present the "primal-dual ascent" algorithm, an extension of the Bregman successive-projection scheme designed to handle optimization problems of the general form f + Σi hi. We then map the fractional-free-energy variational principle for approximate inference onto this optimization problem and introduce the "norm-product" message-passing algorithm. Special cases of the norm-product include the sum-product and max-product (BP) algorithms, TRBP, and the NMPLP algorithm. When the fractional free energy is set to be convex (a convex free energy), the norm-product is globally convergent for estimating marginal probabilities and for approximating the LP relaxation. We also introduce another branch of the norm-product that arises as the "zero temperature" of the convex free energy, which we refer to as "convex-max-product". The convex-max-product is convergent (unlike max-product) and aims at solving the LP relaxation.
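To make the sum-product special case concrete, the sketch below runs sum-product BP on a three-variable chain MRF, the simplest setting in which the messages are exact. This is not the paper's norm-product algorithm, only the classic BP it generalizes; the pairwise potentials are arbitrary illustrative values chosen for this example.

```python
import numpy as np

# Chain MRF over binary variables: x0 - x1 - x2, with pairwise potentials.
# Illustrative values only (not from the paper).
psi01 = np.array([[1.0, 0.5], [0.5, 2.0]])  # potential on (x0, x1)
psi12 = np.array([[1.5, 1.0], [0.2, 1.0]])  # potential on (x1, x2)

# Sum-product messages into x1 from its two neighbors:
# m_{0->1}(x1) = sum_{x0} psi01(x0, x1)
m01 = psi01.sum(axis=0)
# m_{2->1}(x1) = sum_{x2} psi12(x1, x2)
m21 = psi12.sum(axis=1)

# Belief at x1: product of incoming messages, normalized.
b1 = m01 * m21
b1 /= b1.sum()

# Brute-force check: marginal of x1 from the full joint distribution.
joint = psi01[:, :, None] * psi12[None, :, :]   # joint(x0, x1, x2)
p1 = joint.sum(axis=(0, 2))
p1 /= p1.sum()
assert np.allclose(b1, p1)  # exact on a tree-structured graph
```

On a tree the beliefs coincide with the true marginals, as the assertion verifies; on graphs with cycles, BP becomes the approximate (and possibly non-convergent) scheme whose convergent replacements the paper develops.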