ADMIT-1: automatic differentiation and MATLAB interface toolbox
ACM Transactions on Mathematical Software (TOMS)
The advent of robust automatic differentiation tools is an exciting and important development in scientific computing. It is particularly noteworthy that the gradient of a scalar-valued function of many variables can be computed with essentially the same time complexity as required to evaluate the function itself. This is true, in theory, when the "reverse mode" of automatic differentiation is used (whereas the "forward mode" introduces an additional factor proportional to the problem dimension). However, in practice, performance on large problems can be significantly (and unacceptably) worse than predicted. In this paper we illustrate that when natural structure is exploited, fast gradient computation can be recovered, even for high-dimensional problems.
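To make the reverse-mode claim concrete, the following is a minimal sketch of reverse-mode automatic differentiation, not the ADMIT-1 implementation: a hypothetical `Var` class records each operation's parents and local partial derivatives during evaluation, and a single reverse sweep then accumulates the chain rule, so the gradient with respect to all inputs costs a small constant multiple of one function evaluation, independent of the number of inputs.

```python
# Minimal reverse-mode AD sketch (hypothetical names; NOT the ADMIT-1 API).
import math

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs (parent Var, local partial d(self)/d(parent))
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def sin(x):
    return Var(math.sin(x.value), [(x, math.cos(x.value))])

def backward(out):
    # Order the recorded operations topologically, then sweep once in
    # reverse, pushing each node's adjoint onto its parents (chain rule).
    order, seen = [], set()
    def visit(v):
        if id(v) not in seen:
            seen.add(id(v))
            for p, _ in v.parents:
                visit(p)
            order.append(v)
    visit(out)
    out.grad = 1.0
    for v in reversed(order):
        for p, d in v.parents:
            p.grad += v.grad * d

# f(x, y) = x*y + sin(x); analytically df/dx = y + cos(x), df/dy = x.
x, y = Var(2.0), Var(3.0)
f = x * y + sin(x)
backward(f)
```

One evaluation of `f` builds the computation trace, and `backward` visits each recorded operation exactly once, which is the source of the dimension-independent gradient cost; the forward mode, by contrast, would propagate one derivative per input variable.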