References:
- On the limited memory BFGS method for large scale optimization. Mathematical Programming, Series A and B.
- CUTE: constrained and unconstrained testing environment. ACM Transactions on Mathematical Software (TOMS).
- Algorithm 500: Minimization of unconstrained multivariate functions [E4]. ACM Transactions on Mathematical Software (TOMS).
- Convergence properties of nonlinear conjugate gradient methods. SIAM Journal on Optimization.
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. Journal of Computational and Applied Mathematics.
- Scaled conjugate gradient algorithms for unconstrained optimization. Computational Optimization and Applications.
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optimization Methods & Software.
In this paper we propose a fundamentally different conjugate gradient method, in which the well-known parameter β_k is computed by an approximation of the Hessian/vector product through finite differences. For the search direction computation, the method uses a forward difference approximation to the Hessian/vector product in combination with a careful choice of the finite difference interval. For the step length computation we suggest an acceleration scheme able to improve the efficiency of the algorithm. Under common assumptions, the method is proved to be globally convergent. It is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in function values is significantly improved. Numerical comparisons with conjugate gradient algorithms, including CONMIN by Shanno and Phua [D.F. Shanno, K.H. Phua, Algorithm 500, minimization of unconstrained multivariate functions, ACM Trans. Math. Softw. 2 (1976) 87-94], SCALCG by Andrei [N. Andrei, Scaled conjugate gradient algorithms for unconstrained optimization, Comput. Optim. Appl. 38 (2007) 401-416; N. Andrei, Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Optim. Methods Softw. 22 (2007) 561-571; N. Andrei, A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Appl. Math. Lett. 20 (2007) 645-650], the method based on the new conjugacy condition by Li, Tang and Wei [G. Li, C. Tang, Z. Wei, New conjugacy condition and related new conjugate gradient methods for unconstrained optimization, J. Comput. Appl. Math. 202 (2007) 523-539], and the truncated Newton method TN by Nash [S.G. Nash, Preconditioning of truncated-Newton methods, SIAM J. on Scientific and Statistical Computing 6 (1985) 599-616], using a set of 750 unconstrained optimization test problems, show that the suggested algorithm outperforms these conjugate gradient algorithms as well as TN.
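The forward difference approximation mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name and the particular finite difference interval rule (proportional to the square root of machine epsilon, scaled by the norms of x and d) are assumptions chosen for the example; the paper's own "careful choice" of the interval may differ.

```python
import numpy as np

def hessian_vector_product(grad, x, d):
    """Approximate the Hessian/vector product H(x) d by a forward difference
    of the gradient:  H(x) d ~= (grad(x + delta*d) - grad(x)) / delta.
    The interval delta below is a common heuristic (an assumption here),
    balancing truncation error against floating-point round-off."""
    norm_d = np.linalg.norm(d)
    if norm_d == 0.0:
        return np.zeros_like(x)
    eps = np.finfo(float).eps
    # delta ~ sqrt(machine epsilon), scaled by ||x|| and normalized by ||d||
    delta = 2.0 * np.sqrt(eps) * (1.0 + np.linalg.norm(x)) / norm_d
    return (grad(x + delta * d) - grad(x)) / delta

# Sanity check on a convex quadratic f(x) = 0.5 x^T A x, where H(x) d = A d
# exactly, so the forward difference is accurate up to round-off.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
grad = lambda x: A @ x
x = np.array([1.0, 2.0])
d = np.array([0.5, -1.0])
hv = hessian_vector_product(grad, x, d)
```

For a quadratic objective the gradient is linear, so the approximation agrees with the exact product A d to within round-off; for general nonlinear functions the error is O(delta), which is why the interval choice matters.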