Third Workshop on
Automatic Differentiation: Tools and Applications
held within ICCS 2006, Reading, UK, May 28-31, 2006
Automatic differentiation (AD) is a methodology for computing derivatives from computer codes, ensuring accuracy and enabling efficient computation through judicious use of the chain rule of differential calculus. We solicit papers describing advances both in the algorithmic underpinnings and implementation of AD tools, as well as applications of such tools to advanced computational problems. Possible topics include: improved algorithms for computing Jacobians, Hessians, and gradients; AD tools for advanced programming languages such as Matlab or domain-specific languages; application studies of the use of AD tools and methodologies in sensitivity analysis, uncertainty analysis, optimization, parameter identification, or experimental design.
Many methods for sensitivity analysis, uncertainty analysis, optimization, parameter identification, and experimental design require the computation of derivatives df/dx from computer programs, whether expressed in high-level languages or in thousands of lines of Fortran. The computational complexity of computing such derivatives, both in floating-point operations and in memory, may well determine the feasibility of the computation, and the numerical accuracy of the derivatives may greatly affect the robustness of the method that employs them. Thus, AD can be viewed as an enabling technology for many such applications.
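To make the idea concrete, the chain-rule mechanism underlying AD can be sketched with forward-mode differentiation via dual numbers. This is a minimal illustrative example, not the implementation of any particular AD tool discussed at the workshop: a dual number carries a value together with its derivative, and overloaded arithmetic propagates both through the program, yielding derivatives that are exact up to floating-point rounding rather than finite-difference approximations.

```python
import math

class Dual:
    """A value paired with its derivative with respect to one input."""
    def __init__(self, value, deriv=0.0):
        self.value = value   # f(x)
        self.deriv = deriv   # df/dx

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (f + g)' = f' + g'
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (f g)' = f' g + f g'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def dsin(x):
    # Chain rule: (sin f)' = cos(f) * f'
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

def derivative(f, x):
    """Evaluate df/dx at x by seeding the derivative part with 1."""
    return f(Dual(x, 1.0)).deriv

# Example: d/dx [x^2 + sin(x)] = 2x + cos(x)
f = lambda x: x * x + dsin(x)
print(derivative(f, 1.5))
```

Reverse-mode AD, which computes gradients of scalar outputs at a cost independent of the number of inputs, uses the same chain-rule bookkeeping but propagates derivatives from outputs back to inputs.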