A Parallel Levenberg-Marquardt Algorithm
Proceedings of the 23rd International Conference on Supercomputing
Least-squares problems occur often in practice, for example when a parametrized model is used to describe the behavior of a chemical, physical, or economic system. In this paper, we describe a method for solving least-squares problems that are given as a large system of equations. The solution combines commonly used optimization methods with algorithmic differentiation and shared-memory multiprocessing. The system of equations contains model functions that are independent of each other; this independence enables a multiprocessing approach. With the help of algorithmic differentiation by source transformation, we obtain the derivative code of the residual function. The advantage of using source transformation is that we can transform the OpenMP pragmas of the input code into corresponding counterparts in the derivative code. This transformation is, in particular in the adjoint case, not straightforward. We show the scaling properties of the derivative code and of the optimization process.