

Using Automatic Differentiation for Second-order Matrix-free Methods in PDE-Constrained Optimization
incollection
  

Author(s) David E. Keyes, Paul D. Hovland, Lois C. McInnes, Widodo Samyono

Published in Automatic Differentiation of Algorithms: From Simulation to Optimization

Editor(s) George Corliss, Christèle Faure, Andreas Griewank, Laurent Hascoët, Uwe Naumann 
Year 2002 
Publisher Springer 
Abstract Classical methods of constrained optimization are often based on the assumptions that projection onto the constraint manifold is routine, but accessing second-derivative information is not. Both assumptions need revision for the application of optimization to systems constrained by partial differential equations, in the contemporary limit of millions of state variables and in the parallel setting. Large-scale PDE solvers are complex pieces of software that exploit detailed knowledge of architecture and application and cannot easily be modified to fit the interface requirements of a black-box optimizer. Furthermore, in view of the expense of PDE analyses, optimization methods not using second derivatives may require too many iterations to be practical. For general problems, automatic differentiation is likely to be the most convenient means of exploiting second derivatives. We delineate a role for automatic differentiation in matrix-free optimization formulations involving Newton's method, in which little more storage is required than that for the analysis code alone. 
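The central idea of the abstract — using automatic differentiation to apply the Hessian to a vector without ever forming or storing the Hessian matrix — can be sketched in a few lines. The sketch below uses JAX on a toy scalar objective; both the framework and the objective function are illustrative assumptions, not the chapter's actual PDE-solver setting.

```python
# Hedged sketch of a matrix-free Hessian-vector product via automatic
# differentiation. JAX and the toy objective f are illustrative choices,
# not from the chapter itself, which targets large-scale PDE solver codes.
import jax
import jax.numpy as jnp


def f(x):
    # Toy smooth objective standing in for a PDE-constrained functional.
    return jnp.sum(jnp.sin(x) * x ** 2)


def hvp(fun, x, v):
    # Forward-over-reverse mode: differentiate grad(fun) along direction v.
    # No Hessian matrix is ever materialized, so storage stays close to
    # that of the gradient ("analysis") code alone -- the matrix-free
    # property the abstract emphasizes for Newton-type methods.
    return jax.jvp(jax.grad(fun), (x,), (v,))[1]


x = jnp.array([0.1, 0.2, 0.3])
v = jnp.array([1.0, 0.0, 0.0])
print(hvp(f, x, v))  # H(x) @ v, computed without building H
```

In a Newton-Krylov optimization loop, a routine like `hvp` is exactly what an iterative linear solver (e.g. conjugate gradients) needs at each step, which is why AD-generated second-derivative products pair naturally with the matrix-free formulations the chapter discusses.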
Cross-References Corliss2002ADo 
BibTeX
@INCOLLECTION{
Keyes2002UAD,
author = "David E. Keyes and Paul D. Hovland and Lois C. McInnes and Widodo Samyono",
title = "Using Automatic Differentiation for Second-order Matrix-free Methods in
{PDE}-Constrained Optimization",
pages = "35--50",
chapter = "3",
crossref = "Corliss2002ADo",
booktitle = "Automatic Differentiation of Algorithms: From Simulation to Optimization",
year = "2002",
editor = "George Corliss and Christ{\`e}le Faure and Andreas Griewank and Laurent
Hasco{\"e}t and Uwe Naumann",
series = "Computer and Information Science",
publisher = "Springer",
address = "New York, NY",
abstract = "Classical methods of constrained optimization are often based on the assumptions
that projection onto the constraint manifold is routine, but accessing second-derivative information
is not. Both assumptions need revision for the application of optimization to systems constrained by
partial differential equations, in the contemporary limit of millions of state variables and in the
parallel setting. Large-scale PDE solvers are complex pieces of software that exploit detailed
knowledge of architecture and application and cannot easily be modified to fit the interface
requirements of a black-box optimizer. Furthermore, in view of the expense of PDE analyses,
optimization methods not using second derivatives may require too many iterations to be practical.
For general problems, automatic differentiation is likely to be the most convenient means of
exploiting second derivatives. We delineate a role for automatic differentiation in matrix-free
optimization formulations involving Newton's method, in which little more storage is required
than that for the analysis code alone.",
referred = "[More2002ADT]."
}

