Efficient generalized conjugate gradient algorithms, part 1: Theory. Journal of Optimization Theory and Applications.
A nonlinear conjugate gradient method with a strong global convergence property. SIAM Journal on Optimization.
CUTEr and SifDec: A constrained and unconstrained testing environment, revisited. ACM Transactions on Mathematical Software.
A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM Journal on Optimization.
Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Transactions on Mathematical Software.
Scaled conjugate gradient algorithms for unconstrained optimization. Computational Optimization and Applications.
Applying Powell's symmetrical technique to conjugate gradient methods. Computational Optimization and Applications.
A family of new conjugate gradient methods is proposed based on Perry's idea; the methods satisfy the descent property or the sufficient descent property for any line search. In addition, by combining a scaling technique with a restarting strategy, a family of scaling symmetric Perry conjugate gradient methods with restarting procedures is presented. The memoryless BFGS method and the SCALCG method are special cases of the two new families, respectively, and several concrete new algorithms are suggested. Under Wolfe line searches, global convergence of the two families is proved via spectral analysis, for both uniformly convex and nonconvex functions. Preliminary numerical comparisons with the CG_DESCENT and SCALCG algorithms show that the new algorithms are effective for large-scale unconstrained optimization problems. Finally, a remark on further research is given.
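As a rough illustration of the class of methods the abstract describes (not the paper's exact family), the sketch below implements the memoryless BFGS direction, which the abstract identifies as a special case of the symmetric Perry family, with a strong Wolfe line search and a steepest-descent restart when the curvature condition fails. The function name perry_cg, the restart tolerance, and the quadratic test problem are illustrative assumptions; SciPy's line_search supplies the Wolfe conditions.

    import numpy as np
    from scipy.optimize import line_search

    def perry_cg(f, grad, x0, max_iter=1000, tol=1e-6):
        """Sketch of a memoryless BFGS (symmetric Perry) CG iteration."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                  # first direction: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g, np.inf) <= tol:
                break
            # Strong Wolfe line search (SciPy's implementation).
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:                   # search failed: restart along -g
                d = -g
                alpha = line_search(f, grad, x, d, gfk=g)[0]
                if alpha is None:
                    break
            s = alpha * d                       # step s_k = x_{k+1} - x_k
            x_new = x + s
            g_new = grad(x_new)
            y = g_new - g                       # gradient difference y_k
            ys = y @ s
            if ys <= 1e-10:                     # curvature too small: restart
                d = -g_new
            else:
                # d = -H g_new with the memoryless BFGS matrix
                # H = I - (s y^T + y s^T)/(y^T s)
                #       + (1 + y^T y/(y^T s)) s s^T/(y^T s).
                yg, sg = y @ g_new, s @ g_new
                d = (-g_new
                     + ((yg / ys) - (1.0 + (y @ y) / ys) * (sg / ys)) * s
                     + (sg / ys) * y)
            x, g = x_new, g_new
        return x

    # Usage on an ill-conditioned convex quadratic (hypothetical test problem).
    A = np.diag(np.arange(1.0, 101.0))
    x_star = perry_cg(lambda x: 0.5 * x @ A @ x, lambda x: A @ x, np.ones(100))

Under the Wolfe conditions, y^T s > 0 holds for smooth functions, so the update is well defined; the explicit restart above is a safeguard for the finite-precision case, in the spirit of the restarting procedures the abstract mentions.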