This paper describes a class of optimization methods that interlace iterations of the limited memory BFGS method (L-BFGS) with iterations of a Hessian-free Newton method (HFN), so that the information collected by each type of iteration improves the performance of the other. Curvature information about the objective function is stored in a limited memory matrix that plays a dual role: it preconditions the inner conjugate gradient iteration of the HFN method, and it provides the initial matrix for the L-BFGS iterations. The lengths of the L-BFGS and HFN cycles are adjusted dynamically during the course of the optimization. Numerical experiments indicate that the new algorithms are effective and insensitive to the choice of parameters.
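
The following Python sketch illustrates the interlacing structure described above; it is not the authors' implementation. The names (`LBFGSMatrix`, `pcg`, `interlaced`), the fixed cycle lengths `l_cycle` and `n_cycle`, the memory size, and the finite-difference Hessian-vector products are all illustrative assumptions, and the dynamic adjustment of cycle lengths described in the abstract is omitted for brevity.

```python
# Minimal sketch of interlaced L-BFGS / Hessian-free Newton (HFN) cycles.
# The same limited-memory matrix H supplies the L-BFGS step and
# preconditions the inner CG iteration of the HFN method.
import numpy as np

class LBFGSMatrix:
    """Limited-memory BFGS matrix: stores (s, y) pairs and applies H*v
    via the standard two-loop recursion."""
    def __init__(self, m=5):
        self.m, self.pairs = m, []

    def update(self, s, y):
        # Skip pairs that fail a curvature safeguard.
        if s @ y > 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
            self.pairs.append((s, y))
            if len(self.pairs) > self.m:
                self.pairs.pop(0)

    def apply(self, v):
        q, alphas = v.copy(), []
        for s, y in reversed(self.pairs):
            a = (s @ q) / (y @ s)
            alphas.append(a)
            q -= a * y
        if self.pairs:                       # scaled initial matrix gamma*I
            s, y = self.pairs[-1]
            q *= (s @ y) / (y @ y)
        for (s, y), a in zip(self.pairs, reversed(alphas)):
            q += (a - (y @ q) / (y @ s)) * s
        return q

def armijo(f, x, fx, g, d, c1=1e-4):
    """Backtracking line search satisfying the Armijo condition."""
    t = 1.0
    while f(x + t * d) > fx + c1 * t * (g @ d) and t > 1e-12:
        t *= 0.5
    return t

def hessvec(grad, x, g, v, eps=1e-7):
    """Finite-difference Hessian-vector product built from gradients."""
    h = eps * max(1.0, np.linalg.norm(x)) / max(np.linalg.norm(v), 1e-16)
    return (grad(x + h * v) - g) / h

def pcg(Av, b, precond, maxiter=50, tol=1e-2):
    """Truncated preconditioned CG for A d = b."""
    d, r = np.zeros_like(b), b.copy()
    z = precond(r)
    p, rz = z.copy(), r @ z
    for _ in range(maxiter):
        Ap = Av(p)
        pAp = p @ Ap
        if pAp <= 0:                         # negative curvature: stop
            break
        a = rz / pAp
        d += a * p
        r -= a * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return d if d.any() else b               # fall back to steepest descent

def interlaced(f, grad, x, l_cycle=5, n_cycle=3, outer=20):
    H = LBFGSMatrix(m=5)
    for _ in range(outer):
        for _ in range(l_cycle):             # L-BFGS cycle: d = -H g
            g = grad(x)
            d = -H.apply(g)
            s = armijo(f, x, f(x), g, d) * d
            H.update(s, grad(x + s) - g)
            x = x + s
        for _ in range(n_cycle):             # HFN cycle: CG preconditioned by H
            g = grad(x)
            d = pcg(lambda v: hessvec(grad, x, g, v), -g, H.apply)
            s = armijo(f, x, f(x), g, d) * d
            H.update(s, grad(x + s) - g)     # keep H current for both roles
            x = x + s
    return x
```

Because the two-loop recursion supplies both the L-BFGS search direction and the CG preconditioner, every accepted step, regardless of which cycle produced it, feeds curvature pairs into the single limited-memory matrix; this is the information sharing between the two iteration types that the abstract describes.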