A nonmonotone line search technique for Newton's method
SIAM Journal on Numerical Analysis
A Globally Convergent Successive Approximation Method for Severely Nonsmooth Equations
SIAM Journal on Control and Optimization
A class of smoothing functions for nonlinear and mixed complementarity problems
Computational Optimization and Applications
International Journal of Computer Vision
On Homotopy-Smoothing Methods for Box-Constrained Variational Inequalities
SIAM Journal on Control and Optimization
A Fast Algorithm for Deblurring Models with Neumann Boundary Conditions
SIAM Journal on Scientific Computing
A modified BFGS method and its global convergence in nonconvex minimization
Journal of Computational and Applied Mathematics - Special issue on nonlinear programming and variational inequalities
SIAM Journal on Numerical Analysis
A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
SIAM Journal on Optimization
A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
SIAM Journal on Optimization
Efficient Minimization Methods of Mixed l2-l1 and l1-l1 Norms for Image Restoration
SIAM Journal on Scientific Computing
Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
SIAM Journal on Optimization
Efficient Reconstruction of Piecewise Constant Images Using Nonsmooth Nonconvex Minimization
SIAM Journal on Imaging Sciences
A New Alternating Minimization Algorithm for Total Variation Image Reconstruction
SIAM Journal on Imaging Sciences
Minimizing the Condition Number of a Gram Matrix
SIAM Journal on Optimization
A reweighted nuclear norm minimization algorithm for low rank matrix recovery
Journal of Computational and Applied Mathematics
Image restoration problems are often converted into large-scale, nonsmooth, and nonconvex optimization problems, and most existing minimization methods are not efficient at solving them. Nonlinear conjugate gradient methods are well suited to large-scale smooth optimization because of their simplicity, low storage requirements, computational efficiency, and strong convergence properties. In this paper, we propose a smoothing nonlinear conjugate gradient method in which the smoothing parameter is updated at each iteration by a scheme that guarantees every accumulation point of the generated sequence is a Clarke stationary point of the nonsmooth, nonconvex optimization problem. Moreover, we present a class of smoothing functions and establish their approximation properties. The method is easy to implement and introduces no new variables. Three image restoration problems with different numbers of pixels and different regularization terms are used in numerical tests. Experimental results and a comparison with the continuation method of [M. Nikolova et al., SIAM J. Imaging Sci., 1 (2008), pp. 2-25] demonstrate the efficiency of the proposed method.
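The overall idea — minimize a smoothed surrogate with a conjugate gradient direction and shrink the smoothing parameter as the iterates settle — can be sketched on a toy problem. The code below is an illustrative sketch only, not the paper's algorithm: the smoothing function sqrt(t^2 + mu^2) for |t|, the PRP+ direction, the Armijo line search, and the mu-update rule are all assumed choices standing in for the paper's specific scheme and class of smoothing functions.

```python
import numpy as np

# Toy nonsmooth, nonconvex-style model problem:
#   minimize f(x) = ||x||_1 + 0.5 * ||A x - b||^2
# The l1 term is smoothed coordinate-wise by sqrt(x_i^2 + mu^2),
# a standard smoothing of |x_i| (an assumed example, not the
# paper's smoothing class).

def smoothed_obj_grad(x, A, b, mu):
    """Objective value and gradient of the mu-smoothed problem."""
    r = A @ x - b
    f = np.sum(np.sqrt(x**2 + mu**2)) + 0.5 * (r @ r)
    g = x / np.sqrt(x**2 + mu**2) + A.T @ r
    return f, g

def smoothing_cg(A, b, mu=1.0, sigma=0.5, tol=1e-6, max_iter=2000):
    """Smoothing CG sketch: PRP+ directions, Armijo backtracking,
    and a heuristic rule that shrinks mu once the smoothed gradient
    is small (so iterates track increasingly accurate surrogates)."""
    x = np.zeros(A.shape[1])
    f, g = smoothed_obj_grad(x, A, b, mu)
    d = -g
    for _ in range(max_iter):
        # Armijo backtracking line search on the smoothed objective.
        t = 1.0
        while True:
            f_new, g_new = smoothed_obj_grad(x + t * d, A, b, mu)
            if f_new <= f + 1e-4 * t * (g @ d) or t < 1e-12:
                break
            t *= 0.5
        x = x + t * d
        # Heuristic update: shrink mu when the smoothed gradient norm
        # falls below mu, tightening the approximation.
        if np.linalg.norm(g_new) < mu:
            mu *= sigma
            f_new, g_new = smoothed_obj_grad(x, A, b, mu)
        if np.linalg.norm(g_new) < tol and mu < tol:
            return x
        # PRP+ conjugate gradient direction with a restart safeguard
        # that falls back to steepest descent when descent is lost.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        if g_new @ d >= 0:
            d = -g_new
        f, g = f_new, g_new
    return x
```

On a separable instance with A = 2I and b = (3, 0.1, -3), the l1 term soft-thresholds the middle coordinate toward zero while the outer two settle near ±1.25, matching the closed-form minimizer of each one-dimensional subproblem.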