A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search

  • Authors:
  • William W. Hager; Hongchao Zhang

  • Venue:
  • SIAM Journal on Optimization
  • Year:
  • 2005

Abstract

A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes–Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition ${\bf g}_k^{\sf T} {\bf d}_k \le -\frac{7}{8} \|{\bf g}_k\|^2$. Moreover, a global convergence result is established when the line search fulfills the Wolfe conditions. A new line search scheme is developed that is efficient and highly accurate. Efficiency is achieved by exploiting properties of linear interpolants in a neighborhood of a local minimizer. High accuracy is achieved by using a convergence criterion, which we call the "approximate Wolfe" conditions, obtained by replacing the sufficient decrease criterion in the Wolfe conditions with an approximation that can be evaluated with greater precision in a neighborhood of a local minimum. Numerical comparisons are given with both L-BFGS and conjugate gradient methods using the unconstrained optimization problems in the CUTE library.
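
To make the two ingredients of the abstract concrete, the sketch below (a minimal NumPy illustration, not the authors' CG_DESCENT implementation) shows the direction update behind the descent bound and a line-search acceptance test combining the ordinary and approximate Wolfe conditions. The function names hz_direction and accept_step, the tolerance eps, and the defaults delta = 0.1 and sigma = 0.9 are illustrative assumptions.

```python
import numpy as np

def hz_direction(g_new, g_old, d_old):
    """One Hager-Zhang direction update (hypothetical helper, not the
    authors' CG_DESCENT code).  Implements

        beta  = (y - 2*d*||y||^2 / (d^T y))^T g_new / (d^T y),
        d_new = -g_new + beta * d_old,

    with y = g_new - g_old.  The truncation safeguard the paper applies
    to beta in the full scheme is omitted from this sketch.
    """
    y = g_new - g_old
    dy = float(d_old @ y)                       # d_k^T y_k
    beta = float((y - 2.0 * d_old * (y @ y) / dy) @ g_new) / dy
    return -g_new + beta * d_old                # d_{k+1}

def accept_step(alpha, phi0, dphi0, phi_a, dphi_a,
                delta=0.1, sigma=0.9, eps=1e-6):
    """Acceptance test for phi(a) = f(x + a*d), with dphi0 = phi'(0) < 0.

    Accepts alpha if it satisfies the ordinary Wolfe conditions
    (sufficient decrease + curvature), or the approximate Wolfe
    conditions
        sigma * phi'(0) <= phi'(alpha) <= (2*delta - 1) * phi'(0),
    the latter checked only when phi(alpha) is within eps of phi(0)
    (eps is an illustrative stand-in for the paper's epsilon_k rule).
    """
    curvature = dphi_a >= sigma * dphi0
    wolfe = (phi_a <= phi0 + delta * alpha * dphi0) and curvature
    approx = (sigma * dphi0 <= dphi_a <= (2.0 * delta - 1.0) * dphi0
              and phi_a <= phi0 + eps)
    return wolfe or approx
```

The point of the approximate test is numerical: near a local minimizer, the difference phi(alpha) - phi(0) in the sufficient decrease criterion is dominated by cancellation error, while the derivative phi'(alpha) can still be evaluated to full precision, so the two-sided derivative test remains checkable where the ordinary Wolfe test breaks down.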