Nonlinear total variation based noise removal algorithms. Proceedings of the Eleventh Annual International Conference of the Center for Nonlinear Studies on Experimental Mathematics: Computational Issues in Nonlinear Science.
Algorithm 813: SPG—Software for Convex-Constrained Optimization. ACM Transactions on Mathematical Software (TOMS).
Nonmonotone Spectral Projected Gradient Methods on Convex Sets. SIAM Journal on Optimization.
Richardson's non-stationary matrix iterative procedure.
Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming. Numerische Mathematik.
Smooth minimization of non-smooth functions. Mathematical Programming, Series A and B.
Gradient Methods with Adaptive Step-Sizes. Computational Optimization and Applications.
A New Active Set Algorithm for Box Constrained Optimization. SIAM Journal on Optimization.
Fast numerical algorithms for total variation based image restoration.
Probing the Pareto Frontier for Basis Pursuit Solutions. SIAM Journal on Scientific Computing.
IEEE Transactions on Image Processing
From box filtering to fast explicit diffusion. Proceedings of the 32nd DAGM Conference on Pattern Recognition.
SIAM Journal on Scientific Computing
A First-Order Primal-Dual Algorithm for Convex Problems with Applications to Imaging. Journal of Mathematical Imaging and Vision.
Convex Analysis and Monotone Operator Theory in Hilbert Spaces.
Total variation minimization and a class of binary MRF models. EMMCVPR'05: Proceedings of the 5th International Conference on Energy Minimization Methods in Computer Vision and Pattern Recognition.
IEEE Transactions on Information Theory
In recent years, convex optimization methods have been successfully applied to various image processing tasks, and a large number of first-order methods have been designed to minimize the corresponding functionals. Interestingly, Grewenig et al. (2010) recently showed that the simple idea of so-called "superstep cycles" leads to very efficient schemes for time-dependent (parabolic) image enhancement problems as well as for steady-state (elliptic) image compression tasks. The superstep cycles approach is similar to the nonstationary (cyclic) Richardson method, which has been around for over sixty years.

In this paper, we investigate the incorporation of superstep cycles into the projected gradient method. For two problems in compressive sensing and image processing, namely the LASSO approach and the Rudin-Osher-Fatemi model, we show that the resulting simple cyclic projected gradient algorithm competes numerically with various state-of-the-art first-order algorithms. However, due to the nonlinear projection within the algorithm, convergence proofs appear to be hard even under restrictive assumptions on the linear operators. We demonstrate the difficulties by studying the simplest case of a two-cycle algorithm in ℝ² with projections onto the Euclidean ball.
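The two-cycle toy setting mentioned at the end of the abstract can be sketched in a few lines: projected gradient descent on a convex quadratic over the Euclidean unit ball in ℝ², cycling between two fixed step sizes. This is only an illustrative sketch of the general idea, not the paper's algorithm; the problem data (`A`, `b`) and the step sizes are arbitrary choices made for the example.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Project x onto the Euclidean ball of the given radius."""
    n = np.linalg.norm(x)
    return x if n <= radius else radius * x / n

def cyclic_projected_gradient(grad, x0, steps, n_iter=200, radius=1.0):
    """Projected gradient descent that cycles through a fixed list of
    step sizes, projecting onto the Euclidean ball after each step."""
    x = np.asarray(x0, dtype=float)
    for k in range(n_iter):
        tau = steps[k % len(steps)]          # cycle through the step sizes
        x = project_ball(x - tau * grad(x), radius)
    return x

# Toy problem in R^2: minimize f(x) = 0.5 * ||A x - b||^2 over the unit ball.
# The unconstrained minimizer (1.5, 0) lies outside the ball, so the
# constrained solution sits on the boundary at (1, 0).
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([3.0, 0.0])
grad = lambda x: A.T @ (A @ x - b)

# Two-cycle of step sizes: one conservative step and one longer step
# beyond the usual 1/L bound (L = 4 is the largest eigenvalue of A^T A).
x = cyclic_projected_gradient(grad, np.zeros(2), steps=[0.1, 0.4])
# converges to the constrained minimizer (1, 0)
```

Even this simple quadratic example shows the interplay the abstract alludes to: whenever the long step overshoots the ball, the projection clips the iterate back to the boundary, which is exactly the nonlinearity that complicates convergence proofs.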