Another nonlinear conjugate gradient algorithm for unconstrained optimization

  • Authors: Neculai Andrei

  • Affiliations: Research Institute for Informatics, Centre for Advanced Modelling and Optimization, Bucharest, Romania

  • Venue: Optimization Methods & Software

  • Year: 2009

Abstract

A nonlinear conjugate gradient algorithm is proposed that modifies the Dai and Yuan [A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim. 10 (1999), pp. 177-182] method so that it satisfies a parameterized sufficient descent condition with a parameter δk. The parameter δk is computed by means of the conjugacy condition, which yields an algorithm that is a positive multiplicative modification of the Hestenes and Stiefel [Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Standards Sec. B 49 (1952), pp. 409-436] algorithm. The algorithm can be viewed as an adaptive version of the Dai and Liao [New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim. 43 (2001), pp. 87-101] conjugate gradient algorithm. The computational scheme is closely related to the conjugate gradient algorithm of Hager and Zhang [A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim. 16 (2005), pp. 170-192]. Computational results on a set of 750 unconstrained optimization test problems show that the new conjugate gradient algorithm substantially outperforms the known conjugate gradient algorithms.
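
For context, the sketch below shows a generic nonlinear conjugate gradient loop of the kind the abstract builds on, using the classical Hestenes-Stiefel formula βHS = gₖ₊₁ᵀyₖ / (dₖᵀyₖ) with yₖ = gₖ₊₁ − gₖ, truncated at zero as a restart safeguard. This is a minimal illustration only: the paper's adaptive parameter δk derived from the conjugacy condition is not stated in the abstract and is therefore not implemented, and the simple Armijo backtracking stands in for the Wolfe line search the theory assumes. All function and variable names (nonlinear_cg, f, grad) are hypothetical.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=1000, tol=1e-6):
    """Generic nonlinear conjugate gradient method (illustrative sketch).

    Uses the Hestenes-Stiefel beta, truncated at zero; the adaptive
    parameter delta_k of Andrei's method is NOT reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        slope = g.dot(d)
        if slope >= 0.0:
            # Not a descent direction: restart with steepest descent.
            d, slope = -g, -g.dot(g)
        # Backtracking (Armijo) line search; a simplification of the
        # Wolfe conditions assumed in the convergence theory.
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * slope and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Hestenes-Stiefel beta, safeguarded against tiny denominators
        # and truncated at zero (restart safeguard).
        beta = max(g_new.dot(y) / max(d.dot(y), 1e-12), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example usage: minimize the Rosenbrock function.
if __name__ == "__main__":
    rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
    rosen_grad = lambda z: np.array([
        -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
        200 * (z[1] - z[0]**2),
    ])
    print(nonlinear_cg(rosen, rosen_grad, np.array([-1.2, 1.0])))
```

In this framing, methods such as Dai-Yuan, Dai-Liao, and the paper's algorithm differ only in how beta (and any auxiliary parameter such as δk) is chosen; the surrounding iteration is the same.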