CALL FOR PAPERS
Derivatives, also known as sensitivities, are ubiquitous in computational science, and their accurate and efficient evaluation is indispensable for a wide variety of numerical algorithms. Automatic Differentiation (AD) is a technology for automatically augmenting computer programs, including arbitrarily complex simulations, with statements for the computation of derivatives. In contrast to traditional numerical differentiation based on divided differences, AD provides guaranteed accuracy, ease of use, and computational efficiency. Sophisticated software tools implementing AD rely on advanced compiler or operator overloading techniques to generate efficient code for the derivative computations. Moreover, computational efficiency can often be substantially increased by exploiting mathematical insights about the underlying algorithms or the dataflow structure of the underlying code.
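To illustrate the contrast drawn above, the following is a minimal sketch (in Python, chosen here for brevity) of forward-mode AD implemented via operator overloading with dual numbers, compared against a divided-difference approximation. Real AD tools are vastly more general; the class name `Dual` and the test function `f` are illustrative choices, not part of any particular tool.

```python
class Dual:
    """Pairs a value with its derivative; arithmetic propagates both."""

    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        # Product rule: (uv)' = u'v + uv'
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__


def f(x):
    # Example function; works on floats and Duals alike.
    return x * x * x + 2 * x          # f'(x) = 3x^2 + 2


# AD: seed dx/dx = 1 and read the derivative off the dual part.
ad = f(Dual(2.0, 1.0)).dot            # exact: 3*4 + 2 = 14.0

# Divided differences: accuracy limited by the step size h.
h = 1e-6
fd = (f(2.0 + h) - f(2.0)) / h        # carries truncation + round-off error

print(ad)
print(fd)
```

The dual part of every intermediate value carries the derivative alongside the function evaluation, so the result is exact to machine precision, whereas the divided-difference quotient trades truncation error against round-off error through the choice of h.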
The primary goal of this session is to provide a venue for computational scientists to report on recent advances in tool design as well as innovative uses of AD tools and AD-related methodologies.
Authors are invited to submit original and unpublished papers in all areas of AD, with an emphasis on successful applications. Papers reporting a particular strength of an AD tool, or a methodology going beyond the black-box AD approach, are also welcome. Special emphasis will be placed on applications where the use of AD has made a qualitative difference to the numerical robustness or computational feasibility of algorithms relying on derivatives.