Global convergence of a two-parameter family of conjugate gradient methods without line search
Journal of Computational and Applied Mathematics - Special issue: Papers presented at the 1st Sino–Japan optimization meeting, 26-28 October 2000, Hong Kong, China
Conjugate gradient methods are widely used for unconstrained optimization, especially for large-scale problems. Analyses and implementations of conjugate gradient methods usually rely on the strong Wolfe conditions. This paper presents a new version of the conjugate gradient method that converges globally provided the line search satisfies only the standard Wolfe conditions. The conditions imposed on the objective function are also weak, similar to those required by the Zoutendijk condition.
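To illustrate the kind of iteration the abstract describes, the following is a minimal sketch of a nonlinear conjugate gradient method paired with a line search enforcing the standard (weak) Wolfe conditions. The β formula used here is Fletcher-Reeves, chosen only as a familiar placeholder; it is not the family proposed in the paper, and all function names and tolerances are illustrative.

```python
def f(x):
    """Test objective: a simple convex quadratic (not from the paper)."""
    return x[0] ** 2 + 10.0 * x[1] ** 2

def grad(x):
    return [2.0 * x[0], 20.0 * x[1]]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection search for a step length satisfying the standard
    Wolfe conditions: sufficient decrease and the curvature condition."""
    fx, g0 = f(x), dot(grad(x), d)
    lo, hi, t = 0.0, float('inf'), 1.0
    for _ in range(max_iter):
        xt = [xi + t * di for xi, di in zip(x, d)]
        if f(xt) > fx + c1 * t * g0:
            # Sufficient-decrease condition fails: shrink the step.
            hi = t
            t = 0.5 * (lo + hi)
        elif dot(grad(xt), d) < c2 * g0:
            # Curvature condition fails: grow the step.
            lo = t
            t = 2.0 * t if hi == float('inf') else 0.5 * (lo + hi)
        else:
            return t
    return t  # fall back to the last trial step

def cg_minimize(f, grad, x, tol=1e-8, max_iter=200):
    """Nonlinear CG with a Fletcher-Reeves beta (placeholder choice)."""
    g = grad(x)
    d = [-gi for gi in g]  # first direction: steepest descent
    for _ in range(max_iter):
        if dot(g, g) < tol ** 2:
            break
        t = wolfe_line_search(f, grad, x, d)
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = dot(g_new, g_new) / dot(g, g)  # Fletcher-Reeves
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x_star = cg_minimize(f, grad, [3.0, 1.0])
```

The line search here verifies both Wolfe conditions directly; production implementations typically use a more careful bracketing/zoom strategy, and the point of the paper is precisely which β choices keep such an iteration globally convergent under these weaker conditions.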