Publication: An introduction to algorithmic differentiation

An introduction to algorithmic differentiation

- Article in a journal -
 

Author(s)
Assefaw H. Gebremedhin, Andrea Walther

Published in
WIREs Data Mining and Knowledge Discovery

Year
2020

Abstract
Algorithmic differentiation (AD), also known as automatic differentiation, is a technology for accurate and efficient evaluation of derivatives of a function given as a computer model. The evaluations of such models are essential building blocks in numerous scientific computing and data analysis applications, including optimization, parameter identification, sensitivity analysis, uncertainty quantification, nonlinear equation solving, and integration of differential equations. We provide an introduction to AD and present its basic ideas and techniques, some of its most important results, the implementation paradigms it relies on, the connection it has to other domains including machine learning and parallel computing, and a few of the major open problems in the area. Topics we discuss include: forward mode and reverse mode of AD, higher-order derivatives, operator overloading and source transformation, sparsity exploitation, checkpointing, cross-country mode, and differentiating iterative processes.
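The forward mode and operator overloading mentioned in the abstract can be illustrated with a small self-contained sketch (not code from the article): dual numbers carry a derivative alongside each value, and overloaded arithmetic propagates it by the usual differentiation rules.

```python
# Minimal sketch of forward-mode AD via operator overloading with dual
# numbers. Illustrative only; the class and function names are not from
# the article.

class Dual:
    """A value paired with its derivative (a 'dual number')."""

    def __init__(self, value, deriv=0.0):
        self.value = value  # primal value
        self.deriv = deriv  # derivative carried alongside

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u * v)' = u' v + u v'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__


def f(x):
    # f(x) = x^2 + 3x, so f'(x) = 2x + 3
    return x * x + 3 * x


# Seed the input derivative with 1.0 to obtain df/dx at x = 2.
x = Dual(2.0, 1.0)
y = f(x)
print(y.value, y.deriv)  # f(2) = 10.0, f'(2) = 7.0
```

Running any differentiable program built from these overloaded operations yields the exact derivative alongside the value, which is the basic idea behind operator-overloading AD tools.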

AD Theory and Techniques
Introduction

BibTeX
@ARTICLE{Gebremedhin2020Ait,
       author = "Gebremedhin, Assefaw H. and Walther, Andrea",
       title = "An introduction to algorithmic differentiation",
       journal = "WIREs Data Mining and Knowledge Discovery",
       volume = "10",
       number = "1",
       pages = "e1334",
       keywords = "adjoints, algorithmic differentiation, automatic differentiation, backpropagation,
         checkpointing, sensitivities",
       doi = "10.1002/widm.1334",
       url = "https://onlinelibrary.wiley.com/doi/abs/10.1002/widm.1334",
       eprint = "https://onlinelibrary.wiley.com/doi/pdf/10.1002/widm.1334",
       abstract = "Algorithmic differentiation (AD), also known as automatic differentiation, is a
         technology for accurate and efficient evaluation of derivatives of a function given as a computer
         model. The evaluations of such models are essential building blocks in numerous scientific computing
         and data analysis applications, including optimization, parameter identification, sensitivity
         analysis, uncertainty quantification, nonlinear equation solving, and integration of differential
         equations. We provide an introduction to AD and present its basic ideas and techniques, some of its
         most important results, the implementation paradigms it relies on, the connection it has to other
         domains including machine learning and parallel computing, and a few of the major open problems in
         the area. Topics we discuss include: forward mode and reverse mode of AD, higher-order derivatives,
         operator overloading and source transformation, sparsity exploitation, checkpointing, cross-country
         mode, and differentiating iterative processes.",
       year = "2020",
       ad_theotech = "Introduction"
}


Contact:
autodiff.org