In this paper we develop a new class of conjugate gradient methods for unconstrained optimization problems. A new nonmonotone line search technique is proposed that guarantees the global convergence of these conjugate gradient methods under mild conditions. In particular, the Polak-Ribiere-Polyak and Liu-Storey conjugate gradient methods are special cases of the new class. By estimating the local Lipschitz constant of the derivative of the objective function, we can choose an adequate step size and substantially reduce the number of function evaluations per iteration. Numerical results show that the new conjugate gradient methods are effective for minimizing large-scale nonconvex, nonquadratic functions.
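The abstract combines three ingredients: a conjugate gradient direction, a nonmonotone acceptance test, and an initial step size derived from a local Lipschitz estimate of the gradient. The sketch below illustrates this combination under stated assumptions; it is not the paper's method. It uses the classical PRP+ coefficient, a Grippo-Lampariello-Lucidi-style nonmonotone Armijo test (accept a step if it improves on the maximum of the last M function values), and the finite-difference estimate L ≈ ||g_{k+1} - g_k|| / ||x_{k+1} - x_k||, which gives the trial step as the minimizer of the quadratic model along the search direction. All names and parameter values (`M`, `delta`, the safeguards) are illustrative choices, not taken from the paper.

```python
import numpy as np

def nonmonotone_cg(f, grad, x0, M=5, delta=1e-4, max_iter=500, tol=1e-6):
    """Sketch: PRP+ conjugate gradient with a nonmonotone Armijo line
    search whose initial step comes from a local Lipschitz estimate of
    the gradient.  Illustrative only, not the paper's exact algorithm."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    fvals = [f(x)]          # recent function values for the nonmonotone test
    L_est = 1.0             # crude initial local Lipschitz estimate (assumption)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        gd = g @ d
        if gd >= 0:         # safeguard: restart with steepest descent
            d = -g
            gd = g @ d
        # trial step = minimizer of the model f(x) + a*gd + 0.5*L*a^2*||d||^2
        alpha = min(max(-gd / (L_est * (d @ d)), 1e-12), 1e4)
        fmax = max(fvals[-M:])          # nonmonotone reference value
        while f(x + alpha * d) > fmax + delta * alpha * gd and alpha > 1e-16:
            alpha *= 0.5                # backtrack
        x_new = x + alpha * d
        g_new = grad(x_new)
        # refresh the local Lipschitz estimate from the last gradient change
        s, y = x_new - x, g_new - g
        if s @ s > 0:
            L_est = max(np.linalg.norm(y) / np.linalg.norm(s), 1e-8)
        beta = max((g_new @ y) / (g @ g), 0.0)   # PRP+ coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
        fvals.append(f(x))
    return x, f(x)

# usage: minimize the nonconvex Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x_star, f_star = nonmonotone_cg(f, grad, np.array([-1.2, 1.0]))
```

Because the acceptance test compares against the maximum of recent function values rather than the previous one alone, early iterations may increase f temporarily; this is the intended behavior of a nonmonotone search, and the Lipschitz-based trial step is what lets most iterations accept the first step without backtracking.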