Markov random field (MRF) models, including conditional random field (CRF) models, are popular in computer vision. However, to remain computationally tractable they are limited to incorporating only local interactions and cannot model global properties such as connectedness, a potentially useful high-level prior for object segmentation. In this work, we overcome this limitation by deriving a potential function that forces the output labeling to be connected and that fits naturally into the framework of recent maximum a posteriori (MAP)-MRF linear programming (LP) relaxations. Using techniques from polyhedral combinatorics, we show that a provably strong approximation to the MAP solution of the resulting MRF can still be found efficiently by solving a sequence of max-flow problems. The efficiency of the inference procedure also allows us to learn the parameters of an MRF with global connectivity potentials by means of a cutting-plane algorithm. We evaluate our algorithm experimentally both on synthetic data and on the challenging image segmentation task of the PASCAL Visual Object Classes 2008 data set. In both cases, adding a connectedness prior significantly reduces the segmentation error.
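The max-flow subroutine mentioned above builds on the standard reduction of binary pairwise MRF energy minimization to an s-t minimum cut. The sketch below illustrates that reduction only: the 4-pixel chain, the unary potentials, and the Potts smoothness weight are made-up toy values (not the paper's data or method), and the solver is a plain Edmonds-Karp implementation rather than the specialized vision max-flow codes.

```python
from collections import defaultdict, deque

def max_flow(cap, s, t):
    """Edmonds-Karp max flow on a residual-capacity dict-of-dicts (mutated in place).

    Returns (flow value, set of nodes on the source side of a minimum cut)."""
    flow = 0.0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, c in cap[u].items():
                if c > 1e-12 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:              # no augmenting path left:
            return flow, set(parent)     # reached nodes form the source side of a min cut
        # Find the bottleneck capacity along the path, then push flow through it.
        bottleneck, v = float('inf'), t
        while parent[v] is not None:
            bottleneck = min(bottleneck, cap[parent[v]][v])
            v = parent[v]
        v = t
        while parent[v] is not None:
            u = parent[v]
            cap[u][v] -= bottleneck
            cap[v][u] = cap[v].get(u, 0.0) + bottleneck
            v = u
        flow += bottleneck

# Toy binary energy E(x) = sum_i theta_i(x_i) + lam * sum_{(i,j)} [x_i != x_j]
# on a 4-pixel chain; pixels 0,1 prefer label 0, pixels 2,3 prefer label 1.
unary = [(0.0, 2.0), (0.0, 2.0), (2.0, 0.0), (2.0, 0.0)]  # (theta_i(0), theta_i(1))
pairs = [(0, 1), (1, 2), (2, 3)]
lam = 0.5

cap = defaultdict(dict)
for i, (c0, c1) in enumerate(unary):
    cap['s'][i] = c1   # cutting s->i assigns label 1 to pixel i, paying theta_i(1)
    cap[i]['t'] = c0   # cutting i->t assigns label 0, paying theta_i(0)
for i, j in pairs:
    cap[i][j] = cap[i].get(j, 0.0) + lam   # cut when neighboring labels disagree
    cap[j][i] = cap[j].get(i, 0.0) + lam

energy, source_side = max_flow(cap, 's', 't')
labels = [0 if i in source_side else 1 for i in range(len(unary))]
print(energy, labels)  # -> 0.5 [0, 0, 1, 1]
```

By max-flow/min-cut duality, the cut value equals the minimum of the energy, and reading off which side of the cut each pixel lands on yields the MAP labeling: one boundary between pixels 1 and 2, costing exactly one smoothness penalty.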