On the limited memory BFGS method for large scale optimization
Mathematical Programming
A multigrid tutorial (2nd ed.)
A nonlinear conjugate gradient method with a strong global convergence property
SIAM Journal on Optimization
A new conjugate gradient method with guaranteed descent and an efficient line search
SIAM Journal on Optimization
Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Classics in Applied Mathematics, 16)
Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
ACM Transactions on Mathematical Software (TOMS)
Recursive trust-region methods for multiscale nonlinear optimization
SIAM Journal on Optimization
Optimization Methods & Software (special issue: The 2nd Veszprem Optimization Conference: Advanced Algorithms (VOCAL), 13-15 December 2006, Veszprem, Hungary)
Approximate invariant subspaces and quasi-Newton optimization methods
Optimization Methods & Software
The properties of multilevel optimization problems defined on a hierarchy of discretization grids can be used to define approximate secant equations, which describe the second-order behavior of the objective function. Following earlier work by Gratton and Toint (2009), we introduce a quasi-Newton method (with a linesearch) and a nonlinear conjugate gradient method that both take advantage of this new second-order information. We then present numerical experiments with these methods and formulate recommendations for their practical use.
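The secant equations mentioned in the abstract generalize the classical secant condition B_{k+1} s_k = y_k, with s_k = x_{k+1} - x_k and y_k = ∇f(x_{k+1}) - ∇f(x_k), that quasi-Newton updates enforce. A minimal sketch of the standard BFGS update satisfying this condition is given below; it illustrates only the classical (single-level) secant equation, not the paper's grid-hierarchy construction, and the toy quadratic problem is purely illustrative.

```python
import numpy as np

def bfgs_update(B, s, y):
    """BFGS update of the Hessian approximation B.

    Returns B_new such that B_new @ s == y, i.e. the secant
    equation holds exactly for the pair (s, y).
    """
    Bs = B @ s
    return (B
            - np.outer(Bs, Bs) / (s @ Bs)   # remove old curvature along s
            + np.outer(y, y) / (y @ s))     # insert curvature observed in y

# Toy quadratic f(x) = 0.5 x^T A x, so grad f(x) = A x and y = A s.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.eye(2)                        # initial Hessian approximation
x0, x1 = np.zeros(2), np.array([1.0, 0.5])
s = x1 - x0
y = A @ s                            # gradient difference on the quadratic
B = bfgs_update(B, s, y)
print(np.allclose(B @ s, y))         # secant equation holds → True
```

In the multilevel setting of the paper, (s, y) pairs satisfying such an equation only approximately are extracted from coarser grids, which is where the "approximate" in approximate secant equations comes from.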