Adaptive sequencing of primal, dual, and design steps in simulation based optimization
Computational Optimization and Applications
We consider the task of design optimization where the constraint is a state equation that can only be solved by a typically rather slowly converging fixed point solver. This primal process can be augmented by a corresponding adjoint solver and, based on the resulting approximate reduced derivatives, by an optimization iteration that actually changes the design. To coordinate the three iterative processes, we use an exact penalty function of doubly augmented Lagrangian type. The main issue is how to derive a design space preconditioner for the approximate reduced gradient that ensures both a consistent reduction of the employed penalty function and significant design corrections. We present numerical experiments on a variant of the Bratu problem for an alternating approach in which any combination and sequencing of steps may be used to improve feasibility and optimality.
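The interleaving of primal, dual, and design steps described above can be sketched on a toy problem. The following is a minimal illustration, not the paper's algorithm or preconditioner: the state equation is a hypothetical scalar contraction y = G(y, u), the objective f, its partial derivatives, and the constant step size alpha (standing in for a scalar design space preconditioner) are all assumptions chosen so the reduced optimum can be checked by hand.

```python
# Minimal sketch (assumed toy problem, not the paper's method): alternating
# primal, dual (adjoint), and design steps for
#   min_u f(y, u)   subject to   y = G(y, u),
# where G is contractive in y, so the primal fixed point iteration converges.

def G(y, u):          # primal fixed-point map (contraction: |dG/dy| = 0.5 < 1)
    return 0.5 * y + u

def f(y, u):          # design objective
    return 0.5 * (y - 1.0) ** 2 + 0.5 * u ** 2

# Hand-coded partial derivatives; a real implementation would obtain
# these by algorithmic differentiation (e.g. with a tool like ADOL-C).
G_y, G_u = 0.5, 1.0
def f_y(y, u): return y - 1.0
def f_u(y, u): return u

y, ybar, u = 0.0, 0.0, 0.0   # state, adjoint, design
alpha = 0.5                   # design step size; plays the role of a
                              # (here scalar) design space preconditioner

for k in range(200):
    y = G(y, u)                        # primal step toward y = G(y, u)
    ybar = f_y(y, u) + G_y * ybar      # dual step toward ybar = f_y + G_y^T ybar
    g = f_u(y, u) + G_u * ybar         # approximate reduced gradient
    u -= alpha * g                     # design step

# For this toy problem the coupled iteration converges to u* = 0.4, y* = 0.8,
# which matches minimizing the reduced objective f(2u, u) analytically.
print(round(u, 3), round(y, 3))
```

At the coupled fixed point the adjoint equation and the reduced gradient are exactly consistent; away from it, each design step uses only the current, inexact primal and dual iterates, which is precisely the regime the penalty function in the abstract is meant to coordinate.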