Practical Methods of Optimization (2nd ed.)
Efficient hybrid conjugate gradient techniques. Journal of Optimization Theory and Applications.
Efficient generalized conjugate gradient algorithms, Part 1: Theory. Journal of Optimization Theory and Applications.
Global convergence result for conjugate gradient methods. Journal of Optimization Theory and Applications.
CUTE: Constrained and unconstrained testing environment. ACM Transactions on Mathematical Software (TOMS).
Algorithm 500: Minimization of unconstrained multivariate functions [E4]. ACM Transactions on Mathematical Software (TOMS).
Convergence properties of nonlinear conjugate gradient methods. SIAM Journal on Optimization.
A nonlinear conjugate gradient method with a strong global convergence property. SIAM Journal on Optimization.
A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM Journal on Optimization.
Scaled conjugate gradient algorithms for unconstrained optimization. Computational Optimization and Applications.
Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optimization Methods & Software (issue dedicated to Professor Michael J.D. Powell on the occasion of his 70th birthday).
A nonlinear conjugate gradient algorithm is proposed that modifies the Dai and Yuan [A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim. 10 (1999), pp. 177-182] conjugate gradient algorithm so that it satisfies a parameterized sufficient descent condition with parameter δk. The parameter δk is computed by means of the conjugacy condition, which yields an algorithm that is a positive multiplicative modification of the Hestenes and Stiefel [Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Standards Sec. B 48 (1952), pp. 409-436] algorithm. The algorithm can also be viewed as an adaptive version of the Dai and Liao [New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim. 43 (2001), pp. 87-101] conjugate gradient algorithm. The conjugate gradient algorithm recently proposed by Hager and Zhang [A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim. 16 (2005), pp. 170-192] is close to this computational scheme. Computational results on a set of 750 unconstrained optimization test problems show that the new conjugate gradient algorithm substantially outperforms known conjugate gradient algorithms.
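For orientation, nonlinear conjugate gradient methods generate iterates x_{k+1} = x_k + αk dk, with αk obtained from a line search and search directions d_0 = -g_0, d_{k+1} = -g_{k+1} + βk dk; the algorithms cited above differ only in the choice of the scalar βk. The classical formulas are recalled below in standard notation (gk = ∇f(xk), sk = x_{k+1} - xk, yk = g_{k+1} - gk, t ≥ 0 a scalar). These are the well-known published expressions, given here only as background; the paper's own δk-dependent choice of βk is not reproduced in the abstract and is not shown.

% Classical conjugate gradient parameters referenced in the abstract (standard
% notation; not taken from the paper itself).
\begin{align*}
  \beta_k^{\mathrm{HS}} &= \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}
    && \text{(Hestenes and Stiefel)} \\
  \beta_k^{\mathrm{DY}} &= \frac{\|g_{k+1}\|^{2}}{d_k^{T} y_k}
    && \text{(Dai and Yuan)} \\
  \beta_k^{\mathrm{DL}} &= \frac{g_{k+1}^{T} y_k - t\, g_{k+1}^{T} s_k}{d_k^{T} y_k}
    && \text{(Dai and Liao)} \\
  \beta_k^{\mathrm{HZ}} &= \frac{1}{d_k^{T} y_k}
    \left( y_k - 2 d_k \frac{\|y_k\|^{2}}{d_k^{T} y_k} \right)^{T} g_{k+1}
    && \text{(Hager and Zhang)}
\end{align*}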