

A Low Rank Approach to Automatic Differentiation
incollection
  

Author(s) Hany S. Abdel-Khalik, Paul D. Hovland, Andrew Lyons, Tracy E. Stover, Jean Utke

Published in Advances in Automatic Differentiation

Editor(s) Christian H. Bischof, H. Martin Bücker, Paul D. Hovland, Uwe Naumann, J. Utke 
Year 2008 
Publisher Springer 
Abstract This manuscript introduces a new approach for increasing the efficiency of automatic differentiation (AD) computations for estimating the first-order derivatives comprising the Jacobian matrix of a complex large-scale computational model. The objective is to approximate the entire Jacobian matrix with minimized computational and storage resources. This is achieved by finding low-rank approximations to a Jacobian matrix via the Efficient Subspace Method (ESM). Low-rank Jacobian matrices arise in many of today's important scientific and engineering problems, e.g. nuclear reactor calculations, weather and climate modeling, geophysical applications, etc. A low-rank approximation replaces the original Jacobian matrix J (whose size is dictated by the size of the input and output data streams) with matrices of much smaller dimensions (determined by the numerical rank of the Jacobian matrix). This process reveals the rank of the Jacobian matrix and is carried out by ESM via a series of r randomized matrix-vector products of the form Jq and J^T ω, which can be evaluated by the AD forward and reverse modes, respectively. 
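The two-sided randomized procedure described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the `jvp`/`vjp` closures below are placeholders for the AD forward-mode product Jq and reverse-mode product J^T ω, emulated here with an explicit synthetic rank-3 Jacobian, and the number of probes `r` is an illustrative choice.

```python
# Hedged sketch of randomized low-rank Jacobian approximation in the spirit
# of the Efficient Subspace Method (ESM).  Assumptions: jvp/vjp stand in for
# AD forward/reverse modes; the rank-3 test matrix replaces a real model.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic low-rank Jacobian J (m x n); ESM never forms J explicitly.
m, n, true_rank = 40, 30, 3
J = rng.standard_normal((m, true_rank)) @ rng.standard_normal((true_rank, n))

def jvp(q):
    # Forward-mode AD product: J @ q
    return J @ q

def vjp(w):
    # Reverse-mode AD product: J.T @ w
    return J.T @ w

# Step 1: r randomized forward-mode products Jq sample the column space.
r = 6  # a few more probes than the expected numerical rank
Y = np.column_stack([jvp(rng.standard_normal(n)) for _ in range(r)])
Q, _ = np.linalg.qr(Y)  # orthonormal basis for the range of J

# Step 2: reverse-mode products J^T w against the basis give B = Q^T J,
# so J is approximated by the much smaller factors Q (m x r) and B (r x n).
B = np.column_stack([vjp(Q[:, i]) for i in range(r)]).T
J_approx = Q @ B

# Relative residual is near machine precision when r exceeds the true rank.
print(np.linalg.norm(J - J_approx) / np.linalg.norm(J))
```

Only 2r matrix-vector products are needed in total, which is the source of the claimed savings when the numerical rank is far below min(m, n).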
Cross-References Bischof2008AiA 
AD Tools OpenAD 
AD Theory and Techniques Jacobian-vector product 
BibTeX
@INCOLLECTION{
AbdelKhalik2008ALR,
title = "A Low Rank Approach to Automatic Differentiation",
doi = "10.1007/978-3-540-68942-3_6",
author = "Hany S. Abdel-Khalik and Paul D. Hovland and Andrew Lyons and Tracy E. Stover and
Jean Utke",
abstract = "This manuscript introduces a new approach for increasing the efficiency of
automatic differentiation (AD) computations for estimating the first-order derivatives comprising
the Jacobian matrix of a complex large-scale computational model. The objective is to approximate
the entire Jacobian matrix with minimized computational and storage resources. This is achieved by
finding low-rank approximations to a Jacobian matrix via the Efficient Subspace Method (ESM). Low-rank
Jacobian matrices arise in many of today's important scientific and engineering problems,
e.g. nuclear reactor calculations, weather and climate modeling, geophysical applications, etc. A low-rank
approximation replaces the original Jacobian matrix $J$ (whose size is dictated by the size of
the input and output data streams) with matrices of much smaller dimensions (determined by the
numerical rank of the Jacobian matrix). This process reveals the rank of the Jacobian matrix and is
carried out by ESM via a series of $r$ randomized matrix-vector products of the form $Jq$ and $J^{T}
\omega$, which can be evaluated by the AD forward and reverse modes, respectively.",
crossref = "Bischof2008AiA",
pages = "55--65",
booktitle = "Advances in Automatic Differentiation",
publisher = "Springer",
editor = "Christian H. Bischof and H. Martin B{\"u}cker and Paul D. Hovland and Uwe
Naumann and J. Utke",
isbn = "978-3-540-68935-5",
issn = "1439-7358",
year = "2008",
ad_theotech = "Jacobian-vector product",
ad_tools = "OpenAD"
}

