Two new conjugate gradient methods based on modified secant equations

  • Authors:
  • Saman Babaie-Kafaki; Reza Ghanbari; Nezam Mahdavi-Amiri

  • Affiliations:
  • Faculty of Mathematical Sciences, Sharif University of Technology, Tehran, Iran
  • Department of Mathematics, Ferdowsi University of Mashhad, Mashhad, Iran
  • Faculty of Mathematical Sciences, Sharif University of Technology, Tehran, Iran

  • Venue:
  • Journal of Computational and Applied Mathematics
  • Year:
  • 2010


Abstract

Following the approach proposed by Dai and Liao, we introduce two nonlinear conjugate gradient methods for unconstrained optimization problems. One of the proposed methods is based on a modified form of the secant equation proposed by Zhang, Deng and Chen, and by Zhang and Xu; the other is based on the modified BFGS update proposed by Yuan. An interesting feature of our methods is that they take into account both gradient and function values. Under suitable conditions, we show that one of the proposed methods is globally convergent for general functions and that the other is globally convergent for uniformly convex functions. To enhance the performance of the line search procedure, we also propose a new approach for computing the initial steplength used to start the procedure. We compare implementations of our methods with the efficient conjugate gradient methods proposed by Dai and Liao and by Hestenes and Stiefel. Numerical results show the efficiency of the proposed methods.
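For orientation, the Dai–Liao family referenced above chooses the conjugate gradient parameter as beta_k = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k), where s_k is the step and y_k the gradient difference; the paper's variants replace y_k with modified secant terms that also incorporate function values. The following is a minimal sketch of a Dai–Liao-type iteration with a backtracking (Armijo) line search; the function names, safeguards, and parameter choices are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=500):
    """Sketch of a Dai-Liao-type nonlinear CG method (illustrative only).

    Direction update: d_{k+1} = -g_{k+1} + beta_k * d_k, with
    beta_k = g_{k+1}^T (y_k - t * s_k) / (d_k^T y_k).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (a simple stand-in for the
        # Wolfe-condition search typically used with CG methods).
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        s = alpha * d                 # step s_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                 # gradient difference y_k
        denom = d.dot(y)
        beta = g_new.dot(y - t * s) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new.dot(d) > 0:          # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic such as f(x) = (x1 - 1)^2 + 10(x2 + 2)^2, the iteration drives the gradient norm below the tolerance and recovers the minimizer (1, -2); the paper's modified secant equations would alter only the y_k term inside beta.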