Frequently Asked Questions

Questions:

  1. What is Automatic Differentiation (AD)?
  2. How does AD compare with ... ?
  3. What is an AD tool?
  4. What languages does AD work with?
  5. How do I find a suitable AD tool?
  6. How big a function can one differentiate using AD?
  7. What is the difference between the forward mode and the reverse mode?
  8. What about operator overloading vs source-to-source transformation?

Answers:

  1. What is Automatic Differentiation (AD)?

    Automatic Differentiation, also called Algorithmic Differentiation, is a technology for automatically augmenting computer programs, including arbitrarily complex simulations, with statements for the computation of derivatives, also known as sensitivities.
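
    As a minimal sketch (not taken from any particular AD tool), the statement-level augmentation can look as follows for y = sin(x) * x^2 in Python: each original statement is paired with a statement computing its derivative.

```python
import math

# Original program: y = sin(x) * x**2.
# Each statement is augmented with a statement for its derivative
# (forward mode, propagating d/dx of every intermediate value).
def f_with_derivative(x):
    v1 = math.sin(x)
    d1 = math.cos(x)            # derivative of sin(x)
    v2 = x ** 2
    d2 = 2.0 * x                # derivative of x**2
    y = v1 * v2
    dy = d1 * v2 + v1 * d2      # product rule
    return y, dy
```

    Calling `f_with_derivative(1.0)` returns both the function value sin(1) and the exact derivative cos(1) + 2 sin(1), up to floating-point rounding.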

  2. How does AD compare with ... ?

    • numerical differentiation?
      Divided differencing (DD) is based on a truncation of the Taylor series. It is easy to implement: one simply evaluates the underlying function with perturbed input parameters. However, a suitable perturbation is often hard to find, because a small perturbation that decreases the truncation error increases the cancellation error.
    • symbolic differentiation?
      Computer algebra packages manipulate expressions by repeatedly applying the chain rule so that there is no truncation error. However, the resulting expression for the derivative involves the parameters with respect to which one is differentiating. This can lead to an excessive growth of the length of the expression.
    • differentiation by hand?
      Manual implementation of analytic derivative formulae typically results in very efficient derivative code. However, the implementation is tedious and error-prone.
    More details are given in a survey paper.
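
    The truncation/cancellation trade-off of divided differencing is easy to observe numerically. The following sketch (an illustration, not from the survey) compares the forward-difference error for f(x) = exp(x) at x = 1, where the exact derivative is exp(1):

```python
import math

# Forward divided difference: (f(x+h) - f(x)) / h.
def forward_difference(f, x, h):
    return (f(x + h) - f(x)) / h

exact = math.exp(1.0)
for h in (1e-1, 1e-5, 1e-13):
    approx = forward_difference(math.exp, 1.0, h)
    print(f"h = {h:g}: error = {abs(approx - exact):.2e}")
# Large h: truncation error dominates; tiny h: cancellation error
# dominates. The error is smallest at an intermediate step size.
```

    AD avoids this trade-off entirely, since it applies the chain rule to the program itself and incurs no truncation error.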

  3. What is an AD tool?

    An AD tool is software implementing AD. See the current list in the Tools section.

  4. What languages does AD work with?

    In principle, any language.

  5. How do I find a suitable AD tool?

    A (still incomplete) list of AD tools is currently compiled in the Tools section. Choose a language to get a subset of relevant AD tools.

  6. How big a function can one differentiate using AD?

    In principle, arbitrarily complex functions can be differentiated. The largest application to date is a 1.6 million line FEM code written in Fortran 77. See this article for more details.

  7. What is the difference between the forward mode and the reverse mode?

    The derivatives of the dependent variables with respect to the independent variables computed by these two modes are mathematically equivalent, but the time and memory requirements of computing them may differ.

    The forward mode computes derivatives of intermediate variables with respect to the independent variables and propagates them from one statement to the next according to the chain rule. The reverse mode computes derivatives of the dependent variables with respect to intermediate variables and propagates them from one statement to the previous one according to the chain rule. Thus, the reverse mode requires a reversal of the program execution.

    The forward mode is sometimes referred to as direct or tangent linear mode whereas the reverse mode is also called backward, adjoint or cotangent linear mode.
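
    The two modes can be sketched by hand for f(x1, x2) = x1*x2 + sin(x1) (an illustrative example with hypothetical helper names):

```python
import math

# f(x1, x2) = x1 * x2 + sin(x1)

def forward_mode(x1, x2, dx1, dx2):
    # Propagate (value, derivative) pairs statement by statement,
    # seeded with a direction (dx1, dx2) in the input space.
    v1, dv1 = x1 * x2, dx1 * x2 + x1 * dx2
    v2, dv2 = math.sin(x1), math.cos(x1) * dx1
    y, dy = v1 + v2, dv1 + dv2
    return y, dy

def reverse_mode(x1, x2):
    # Forward sweep: evaluate and record intermediate values.
    v1 = x1 * x2
    v2 = math.sin(x1)
    y = v1 + v2
    # Reverse sweep: propagate adjoints from y back to the inputs,
    # visiting the statements in reverse order.
    ybar = 1.0
    v1bar = ybar                      # from y = v1 + v2
    v2bar = ybar
    x1bar = v1bar * x2 + v2bar * math.cos(x1)
    x2bar = v1bar * x1
    return y, (x1bar, x2bar)
```

    One forward-mode pass yields a single directional derivative (the full gradient takes one pass per input), while one reverse-mode pass yields the derivatives with respect to all inputs at once, at the cost of storing or recomputing the intermediate values for the reversal.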

  8. What about operator overloading vs source-to-source transformation?

    Implementations of AD can be broadly classified into two categories. AD tools based on operator overloading exploit the fact that modern programming languages allow redefining the semantics of elementary operators. AD tools based on source-to-source transformation change the semantics by explicitly rewriting the code. Each approach has its advantages and disadvantages. See the following paper for details. Some remarks concerning further implementation techniques are sketched in Section 3 of this paper.
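
    As a minimal sketch of the operator-overloading approach (a hypothetical `Dual` class, not any particular tool's API), forward-mode AD can be obtained by redefining `+` and `*` to act on value/derivative pairs, so that unchanged user code computes derivatives as a side effect:

```python
# A minimal dual-number class: overloading + and * makes existing
# arithmetic code propagate derivatives alongside values.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)  # product rule

def f(x):
    return x * x + x   # unchanged user code; works on floats and Duals

x = Dual(3.0, 1.0)     # seed the derivative dx/dx = 1
y = f(x)
# y.val == 12.0 and y.dot == 7.0, since d/dx (x^2 + x) = 2x + 1 = 7 at x = 3.
```

    A source-to-source tool would instead produce new code equivalent to the augmented statements, which typically compiles to faster derivative code but requires a far more complex implementation.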

  
