The saddle point framework provides a convenient way to formulate many convex variational problems that occur in computer vision. The framework unifies a broad range of data and regularization terms, and is particularly suited for nonsmooth problems such as Total Variation-based approaches to image labeling. However, for many interesting problems the constraint sets involved are difficult to handle numerically. State-of-the-art methods rely on nested iterative projections, which induce both theoretical and practical convergence issues. We present a dual multiple-constraint Douglas-Rachford splitting approach that is globally convergent, avoids inner iterative loops, enforces the constraints exactly, and requires only basic operations that can be easily parallelized. The method outperforms existing methods by a factor of 4-20 while considerably improving numerical robustness.
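To illustrate the core mechanism, the following is a minimal sketch (not the paper's actual algorithm) of Douglas-Rachford splitting applied to a toy feasibility problem with two constraint sets: the probability simplex, as in simplex-constrained labeling, and a hypothetical extra halfspace constraint. Each set admits an exact closed-form projection, so the iteration needs no inner loops; the set choices, iteration count, and starting point are illustrative assumptions.

```python
import numpy as np

def proj_simplex(v):
    """Exact Euclidean projection onto the probability simplex
    {x : x >= 0, sum(x) = 1} via the standard sort-based scheme."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    k = np.arange(1, v.size + 1)
    rho = np.nonzero(u > css / k)[0][-1]
    theta = css[rho] / (rho + 1)
    return np.maximum(v - theta, 0.0)

def proj_halfspace(x, a, b):
    """Exact projection onto the halfspace {x : a.x <= b}."""
    v = a @ x - b
    return x if v <= 0 else x - v * a / (a @ a)

def douglas_rachford(y, proj_A, proj_B, iters=1000):
    """Douglas-Rachford iteration for the feasibility problem
    'find x in A intersect B', using only the two projections."""
    for _ in range(iters):
        xa = proj_A(y)
        xb = proj_B(2.0 * xa - y)   # reflect, then project onto B
        y = y + xb - xa             # averaged update of the driver sequence
    return proj_A(y)                # shadow sequence lies in A

# Toy instance: simplex in R^3 intersected with {x : x[0] <= 0.2}.
a = np.array([1.0, 0.0, 0.0])
x = douglas_rachford(
    np.array([1.0, 0.0, 0.0]),
    proj_simplex,
    lambda z: proj_halfspace(z, a, 0.2),
)
```

The returned point satisfies the simplex constraint exactly by construction (it is a projection output), and the halfspace constraint up to the convergence tolerance; this mirrors the abstract's point that constraints are enforced by exact projections rather than by nested inner solvers.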