The problem of computing a maximum a posteriori (MAP) configuration is a central computational challenge associated with Markov random fields. A significant line of work has focused on "tree-based" linear programming (LP) relaxations of the MAP problem. This paper develops a family of super-linearly convergent algorithms for solving these LPs, based on proximal minimization schemes using Bregman divergences. As with standard message-passing on graphs, the algorithms are distributed and exploit the underlying graphical structure, and so scale well to large problems. Our algorithms have a double-loop character: the outer loop generates the proximal sequence, and an inner loop of cyclic Bregman projections computes each proximal update. We establish convergence guarantees for our algorithms and illustrate their performance via simulations. We also develop two classes of rounding schemes, deterministic and randomized, for obtaining integral configurations from the LP solutions. Our deterministic rounding schemes exploit a "re-parameterization" property of our algorithms, so that when the LP solution is integral, the MAP solution can be obtained even before the LP solver converges to the optimum. We also propose graph-structured randomized rounding schemes applicable to iterative LP-solving algorithms in general. We analyze the performance of these rounding schemes and report simulations comparing them.
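To make the rounding step concrete, the following is a minimal sketch of node-wise deterministic and randomized rounding of LP node pseudomarginals. This is a simplified illustration under the assumption that we are given only per-node pseudomarginals; the function name `round_pseudomarginals` is hypothetical, and the paper's graph-structured randomized schemes (which round over trees or other subgraphs) are more elaborate than this independent per-node sampling.

```python
import numpy as np

def round_pseudomarginals(mu, randomized=False, rng=None):
    """Round node pseudomarginals to an integral labeling.

    mu : array of shape (n_nodes, n_labels); each row sums to 1.
    Deterministic rounding: take the argmax label at each node.
    Randomized rounding: sample each node's label independently
    from its pseudomarginal distribution.
    """
    mu = np.asarray(mu, dtype=float)
    if not randomized:
        # Deterministic: highest-mass label per node.
        return mu.argmax(axis=1)
    rng = np.random.default_rng() if rng is None else rng
    # Randomized: one independent draw per node from its row of mu.
    return np.array([rng.choice(mu.shape[1], p=row / row.sum()) for row in mu])

# When the LP solution is integral (each row is a vertex of the simplex),
# both schemes recover the same labeling, consistent with the observation
# that an integral LP solution directly yields the MAP configuration.
mu_integral = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 0.0]])
labels_det = round_pseudomarginals(mu_integral)
labels_rand = round_pseudomarginals(mu_integral, randomized=True)
```

For fractional pseudomarginals the two schemes can disagree, which is why the paper analyzes the rounding schemes' performance separately from LP convergence.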