On the limited memory BFGS method for large scale optimization. Mathematical Programming: Series A and B.
TNPACK—A truncated Newton minimization package for large-scale problems: I. Algorithm and usage. ACM Transactions on Mathematical Software (TOMS).
TNPACK—A truncated Newton minimization package for large-scale problems: II. Implementation examples. ACM Transactions on Mathematical Software (TOMS).
Computational Optimization and Applications.
Matrix Computations (3rd ed.).
Remark on "Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]". ACM Transactions on Mathematical Software (TOMS).
Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Classics in Applied Mathematics, 16).
Nonlinear optimization and parallel computing. Parallel Computing, special issue: Parallel computing in numerical optimization.
Discrete second order adjoints in atmospheric chemical transport modeling. Journal of Computational Physics.
SpringSim '10: Proceedings of the 2010 Spring Simulation Multiconference.
A new algorithm is presented for carrying out the large-scale unconstrained optimization required in variational data assimilation using the Newton method. The algorithm is referred to as the adjoint Newton algorithm. It is based on first- and second-order adjoint techniques, which allow the Newton line-search direction to be obtained by integrating a tangent linear model backwards in time (starting from a final condition with negative time steps). The error incurred in quasi-Newton-type algorithms by approximating the Hessian (the matrix of second-order derivatives) of the cost function with respect to the control variables is thus completely eliminated, and the storage problem associated with the Hessian disappears, since the explicit Hessian is not required by this algorithm. The adjoint Newton algorithm is applied to three one-dimensional models and to a two-dimensional limited-area shallow water equations model with both model-generated and First Global Geophysical Experiment data. We compare the performance of the adjoint Newton algorithm with that of the truncated Newton, adjoint truncated Newton, and LBFGS methods. Our numerical tests indicate that the adjoint Newton algorithm is very efficient and finds the minima within three or four iterations for the problems tested here. In the case of the two-dimensional shallow water equations model, the adjoint Newton algorithm improves upon the efficiency of the truncated Newton and LBFGS methods by a factor of at least 14 in terms of the CPU time required to satisfy the same convergence criterion.

The Newton, truncated Newton, and LBFGS methods are general-purpose unconstrained minimization methods. The adjoint Newton algorithm is useful only for optimal control problems in which the model equations serve as strong constraints and the corresponding tangent linear model can be integrated backwards in time. When the backwards integration of the tangent linear model is ill-posed in the sense of Hadamard, the adjoint Newton algorithm may not work, so it must be used with some caution. A possible remedy for this weakness of the adjoint Newton algorithm is proposed.
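The matrix-free idea underlying the compared methods can be illustrated with a short sketch. The code below is not the paper's adjoint Newton algorithm; it is a generic truncated-Newton-style computation of the Newton direction by conjugate gradients, using only Hessian-vector products, which shows how the explicit Hessian is never formed or stored. The Rosenbrock gradient and the finite-difference Hessian-vector product are illustrative stand-ins (my assumptions, not from the paper) for a variational cost function and its second-order adjoint model.

```python
import numpy as np

def rosenbrock_grad(x):
    """Gradient of the 2-D Rosenbrock test function (illustrative only;
    stands in for the gradient of a variational cost function)."""
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def hvp(grad, x, v, eps=1e-6):
    """Matrix-free Hessian-vector product via a forward difference of the
    gradient. A second-order adjoint model would supply this product
    exactly; the finite difference here is a cheap stand-in."""
    return (grad(x + eps * v) - grad(x)) / eps

def newton_direction(grad, x, tol=1e-8, max_iter=None):
    """Solve H p = -g by conjugate gradients using only H*v products,
    so the Hessian matrix itself is never built (the storage advantage
    the abstract describes)."""
    g = grad(x)
    max_iter = 2 * x.size if max_iter is None else max_iter
    p = np.zeros_like(x)
    r = -g.copy()            # residual of H p = -g at p = 0
    d = r.copy()             # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Hd = hvp(grad, x, d)
        dHd = d @ Hd
        if dHd <= 0.0:       # negative curvature: stop, as truncated Newton does
            break
        alpha = rs / dHd
        p += alpha * d
        r -= alpha * Hd
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return p

x = np.array([1.2, 1.2])
p = newton_direction(rosenbrock_grad, x)   # Newton direction at x
```

For an n-dimensional problem this costs a handful of gradient evaluations per Newton step instead of the O(n^2) storage of an explicit Hessian; the paper's contribution is to obtain the exact Newton direction via backward integration of the tangent linear model rather than by an inner CG iteration.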