On the limited memory BFGS method for large scale optimization
Mathematical Programming: Series A and B
Line search algorithms with guaranteed sufficient decrease
ACM Transactions on Mathematical Software (TOMS)
CUTE: constrained and unconstrained testing environment
ACM Transactions on Mathematical Software (TOMS)
Rank-deficient and discrete ill-posed problems: numerical aspects of linear inversion
SIAM, Philadelphia
Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
ACM Transactions on Mathematical Software (TOMS)
Convergence Properties of Nonlinear Conjugate Gradient Methods
SIAM Journal on Optimization
A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
SIAM Journal on Optimization
A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
SIAM Journal on Optimization
Optimization Methods & Software
New accelerated nonlinear conjugate gradient algorithms for unconstrained optimization are proposed, obtained mainly as modifications of the Dai and Yuan method. Under an exact line search the algorithms reduce to the Dai and Yuan conjugate gradient computational scheme; under an inexact line search they satisfy the sufficient descent condition. Since the step lengths in conjugate gradient algorithms may differ from 1 by two orders of magnitude and tend to vary in a very unpredictable manner, the algorithms are equipped with an acceleration scheme that improves their efficiency. Computational results on a set of 750 unconstrained optimization test problems show that these new conjugate gradient algorithms substantially outperform the Dai-Yuan conjugate gradient algorithm and its hybrid variants, the Hestenes-Stiefel, Polak-Ribière-Polyak, and CONMIN conjugate gradient algorithms, and the limited-memory quasi-Newton algorithm LBFGS, and that they compare favorably with CG_DESCENT. Within this numerical study, the accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm ASCALCG proved to be more robust.
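To make the computational scheme the abstract refers to concrete, the sketch below implements a plain Dai-Yuan conjugate gradient iteration with a step-length acceleration. The Dai-Yuan coefficient beta = ||g_{k+1}||^2 / (d_k^T y_k) is the standard published formula; everything else is an assumption for illustration: the function name dy_cg_accelerated is hypothetical, the Armijo backtracking search stands in for the Wolfe-type search a production code would use, and the acceleration step is reconstructed from a generic one-dimensional quadratic model, not taken from the paper's ASCALCG implementation.

```python
import numpy as np

def dy_cg_accelerated(f, grad, x0, tol=1e-6, max_iter=5000):
    """Sketch: Dai-Yuan nonlinear CG with a quadratic-model step acceleration."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        gd = g @ d
        if gd >= 0:                           # lost descent: restart along -g
            d, gd = -g, -(g @ g)
        # Backtracking Armijo line search.  A Wolfe search, as in the cited
        # More-Thuente paper, is what actually guarantees d^T y > 0 for DY.
        alpha, c1, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c1 * alpha * gd:
            alpha *= rho
        # Acceleration (assumed form): one extra gradient gives a curvature
        # estimate along d; jump to the local quadratic model's minimizer,
        # keeping the step only if it still satisfies the Armijo condition.
        gz = grad(x + alpha * d)
        curv = (gz - g) @ d / alpha           # approximates d^T Hessian d
        if curv > 0:
            alpha_q = -gd / curv
            if f(x + alpha_q * d) <= fx + c1 * alpha_q * gd:
                alpha = alpha_q
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Dai-Yuan update: beta = ||g_{k+1}||^2 / (d_k^T y_k), y_k = g_{k+1} - g_k.
        y = g_new - g
        dy = d @ y
        beta = (g_new @ g_new) / dy if dy > 1e-12 else 0.0  # restart safeguard
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Quick check on the Rosenbrock function from the standard starting point.
    rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    rosen_grad = lambda x: np.array(
        [-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
         200 * (x[1] - x[0]**2)])
    print(dy_cg_accelerated(rosen, rosen_grad, np.array([-1.2, 1.0])))
```

The restart safeguard on d^T y_k matters here: with only an Armijo search the curvature condition can fail, in which case the sketch falls back to a steepest-descent step rather than dividing by a non-positive quantity.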