

Semiautomatic Differentiation for Efficient Gradient Computations
incollection
  

Author(s)
David M. Gay

Published in Automatic Differentiation: Applications, Theory, and Implementations

Editor(s) H. M. Bücker, G. Corliss, P. Hovland, U. Naumann, B. Norris 
Year 2005 
Publisher Springer 
Abstract Many large-scale computations involve a mesh and first (or sometimes higher) partial derivatives of functions of mesh elements. In principle, automatic differentiation (AD) can provide the requisite partials more efficiently and accurately than conventional finite-difference approximations. AD requires source-code modifications, which may be little more than changes to declarations. Such simple changes can easily give improved results, e.g., when Jacobian-vector products are used iteratively to solve nonlinear equations. When gradients are required (say, for optimization) and the problem involves many variables, "backward AD" in theory is very efficient, but when carried out automatically and straightforwardly, may use a prohibitive amount of memory. In this case, applying AD separately to each element function and manually assembling the gradient pieces -- semiautomatic differentiation -- can deliver gradients efficiently and accurately. This paper concerns ongoing work; it compares several implementations of backward AD, describes a simple operator-overloading implementation specialized for gradient computations, and compares the implementations on some mesh-optimization examples. Ideas from the specialized implementation could be used in fully general source-to-source translators for C and C++.
Cross-References Bucker2005ADA
AD Tools Rad 
BibTeX
@INCOLLECTION{
Gay2005SDf,
author = "David M. Gay",
title = "Semiautomatic Differentiation for Efficient Gradient Computations",
editor = "H. M. B{\"u}cker and G. Corliss and P. Hovland and U. Naumann and B.
Norris",
booktitle = "Automatic Differentiation: {A}pplications, Theory, and Implementations",
series = "Lecture Notes in Computational Science and Engineering",
publisher = "Springer",
year = "2005",
abstract = "Many large-scale computations involve a mesh and first (or sometimes higher)
partial derivatives of functions of mesh elements. In principle, automatic differentiation (AD) can
provide the requisite partials more efficiently and accurately than conventional finite-difference
approximations. AD requires source-code modifications, which may be little more than changes to
declarations. Such simple changes can easily give improved results, e.g., when Jacobian-vector
products are used iteratively to solve nonlinear equations. When gradients are required (say, for
optimization) and the problem involves many variables, ``backward AD'' in theory is very
efficient, but when carried out automatically and straightforwardly, may use a prohibitive amount of
memory. In this case, applying AD separately to each element function and manually assembling the
gradient pieces -- semiautomatic differentiation -- can deliver gradients efficiently and
accurately. This paper concerns ongoing work; it compares several implementations of backward AD,
describes a simple operator-overloading implementation specialized for gradient computations, and
compares the implementations on some mesh-optimization examples. Ideas from the specialized
implementation could be used in fully general source-to-source translators for C and C++.",
crossref = "Bucker2005ADA",
ad_tools = "Rad",
pages = "147--158",
doi = "10.1007/3-540-28438-9_13"
}

