Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and they have received considerable attention in recent years. This paper proposes a three-parameter family of hybrid conjugate gradient methods. The family has two important features: (i) it avoids the propensity toward small steps, namely, if a small step is generated away from the solution point, the next search direction will be close to the negative gradient direction; and (ii) its descent property and global convergence can be established provided that the line search satisfies the Wolfe conditions. Numerical results obtained with the family are also presented.
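The general shape of such a method can be sketched as a standard nonlinear conjugate gradient loop. The sketch below does not implement the paper's three-parameter family; as a stand-in it uses the well-known hybrid choice beta = max(0, min(beta_PR, beta_FR)), a simple Armijo backtracking line search in place of a full Wolfe search, and a steepest-descent restart whenever the direction fails to be a descent direction (which mirrors feature (i): falling back toward the negative gradient). All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Illustrative hybrid nonlinear CG (NOT the paper's family).

    Uses beta = max(0, min(beta_PR, beta_FR)) and an Armijo
    backtracking line search as a stand-in for a Wolfe search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Safeguard: if d is not a descent direction, restart with
        # the negative gradient (echoes the "small step" remedy).
        if g.dot(d) >= 0.0:
            d = -g
        # Armijo backtracking (sufficient decrease only; a practical
        # code would enforce the full Wolfe conditions).
        alpha, c1, fx = 1.0, 1e-4, f(x)
        while f(x + alpha * d) > fx + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves and Polak-Ribiere parameters.
        beta_fr = g_new.dot(g_new) / g.dot(g)
        beta_pr = g_new.dot(g_new - g) / g.dot(g)
        beta = max(0.0, min(beta_pr, beta_fr))  # hybrid choice
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

As a usage example, minimizing the quadratic f(x) = (x1 - 1)^2 + 10 (x2 - 2)^2 from the origin drives the gradient norm below the tolerance in a handful of iterations.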