Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization

  • Authors:
  • Yasushi Narushima; Hiroshi Yabe

  • Affiliations:
  • Department of Management System Science, Yokohama National University, 79-4 Tokiwadai, Hodogaya-ku, Yokohama 240-8501, Japan
  • Department of Mathematical Information Science, Tokyo University of Science, 1-3 Kagurazaka, Shinjuku-ku, Tokyo 162-8601, Japan

  • Venue:
  • Journal of Computational and Applied Mathematics
  • Year:
  • 2012

Abstract

Conjugate gradient methods have received considerable attention because they can be applied directly to large-scale unconstrained optimization problems. To incorporate second-order information of the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition. However, their method does not necessarily generate a descent search direction. On the other hand, Hager and Zhang (2005) proposed another conjugate gradient method that always generates a descent search direction. In this paper, combining the ideas of Dai-Liao and Hager-Zhang, we propose conjugate gradient methods based on secant conditions that generate descent search directions. In addition, we prove global convergence properties of the proposed methods. Finally, preliminary numerical results are given.
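
For context, the following is a minimal sketch of the two ingredients referenced in the abstract, written in the standard notation s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k (an assumption; these are the published Dai-Liao and Hager-Zhang formulas, not the combined method proposed in this paper, whose exact formula appears only in the full text):

    % Search direction update common to nonlinear conjugate gradient methods
    \[
      d_{k+1} = -g_{k+1} + \beta_k d_k .
    \]
    % Dai-Liao (2001) parameter, derived from the secant condition, with t > 0
    \[
      \beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}
        - t\,\frac{g_{k+1}^{\top} s_k}{d_k^{\top} y_k} .
    \]
    % Hager-Zhang (2005) parameter, which yields descent directions
    \[
      \beta_k^{\mathrm{HZ}} = \frac{1}{d_k^{\top} y_k}
        \left( y_k - 2 d_k \frac{\lVert y_k \rVert^{2}}{d_k^{\top} y_k} \right)^{\!\top} g_{k+1} .
    \]

The proposed methods combine these two ideas so that the direction satisfies a secant-based condition while remaining a descent direction.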