Publication: A vector forward mode of automatic differentiation for generalized derivative evaluation


- Article in a journal -
 

Author(s)
Kamil A. Khan, Paul I. Barton

Published in
Optimization Methods and Software

Year
2015

Abstract
Numerical methods for non-smooth equation-solving and optimization often require generalized derivative information in the form of elements of the Clarke Jacobian or the B-subdifferential. It is shown here that piecewise differentiable functions are lexicographically smooth in the sense of Nesterov, and that lexicographic derivatives of these functions comprise a particular subset of both the B-subdifferential and the Clarke Jacobian. Several recently developed methods for generalized derivative evaluation of composite piecewise differentiable functions are shown to produce identical results, which are also lexicographic derivatives. A vector forward mode of automatic differentiation (AD) is presented for evaluation of these derivatives, generalizing established methods and combining their computational benefits. This forward AD mode may be applied to any finite composition of known smooth functions, piecewise differentiable functions such as the absolute value function, min, and max, and certain non-smooth functions which are not piecewise differentiable, such as the Euclidean norm. This forward AD mode may be implemented using operator overloading, does not require storage of a computational graph, and is computationally tractable relative to the cost of a function evaluation. An implementation in C is discussed.

AD Theory and Techniques
Generalized Jacobian
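
The abstract describes a vector forward AD mode in which each intermediate value carries a vector of directional derivatives, and nonsmooth primitives such as the absolute value resolve their kinks lexicographically (the sign at a kink is taken from the first nonzero tangent component). The following C fragment is a minimal illustrative sketch of that idea, not the authors' actual implementation; the type name `LDVar`, the fixed direction count `K`, and the function names are hypothetical.

```c
/* Sketch of a vector forward AD mode: each variable carries K directional
 * derivatives (tangents). At a kink of |x| (x == 0), the slope's sign is
 * taken from the first nonzero tangent component, in the spirit of
 * lexicographic differentiation. Illustrative only; names are invented. */
#include <assert.h>
#include <math.h>
#include <stddef.h>

#define K 2  /* number of tangent directions (hypothetical fixed size) */

typedef struct {
    double v;     /* function value */
    double d[K];  /* directional derivatives along K directions */
} LDVar;

/* Product: value and tangents via the product rule. */
static LDVar ld_mul(LDVar x, LDVar y) {
    LDVar r;
    r.v = x.v * y.v;
    for (size_t i = 0; i < K; ++i)
        r.d[i] = x.d[i] * y.v + x.v * y.d[i];
    return r;
}

/* Absolute value: away from 0 the slope is sign(x); at the kink the
 * slope's sign comes from the first nonzero tangent component. */
static LDVar ld_abs(LDVar x) {
    LDVar r;
    double s = 0.0;
    if (x.v > 0.0)      s = 1.0;
    else if (x.v < 0.0) s = -1.0;
    else {
        for (size_t i = 0; i < K; ++i) {
            if (x.d[i] != 0.0) { s = (x.d[i] > 0.0) ? 1.0 : -1.0; break; }
        }
    }
    r.v = s * x.v;  /* equals fabs(x.v) */
    for (size_t i = 0; i < K; ++i)
        r.d[i] = s * x.d[i];
    return r;
}
```

For example, seeding `x = 0` with tangent `-3` gives `ld_abs` a directional derivative of `3`, which matches the one-sided slope of `|x|` at the origin in that direction; no computational graph is stored, matching the propagation-only style the abstract describes.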

BibTeX
@ARTICLE{Khan2015Avf,
       author = "Kamil A. Khan and Paul I. Barton",
       title = "A vector forward mode of automatic differentiation for generalized derivative
         evaluation",
       journal = "Optimization Methods and Software",
       volume = "30",
       number = "6",
       pages = "1185--1212",
       year = "2015",
       doi = "10.1080/10556788.2015.1025400",
       url = "http://dx.doi.org/10.1080/10556788.2015.1025400",
       abstract = "Numerical methods for non-smooth equation-solving and optimization often require
         generalized derivative information in the form of elements of the Clarke Jacobian or the
         B-subdifferential. It is shown here that piecewise differentiable functions are lexicographically
         smooth in the sense of Nesterov, and that lexicographic derivatives of these functions comprise a
         particular subset of both the B-subdifferential and the Clarke Jacobian. Several recently developed
         methods for generalized derivative evaluation of composite piecewise differentiable functions are
         shown to produce identical results, which are also lexicographic derivatives. A vector forward mode
         of automatic differentiation (AD) is presented for evaluation of these derivatives, generalizing
         established methods and combining their computational benefits. This forward AD mode may be applied
         to any finite composition of known smooth functions, piecewise differentiable functions such as the
         absolute value function, min, and max, and certain non-smooth functions which are not piecewise
         differentiable, such as the Euclidean norm. This forward AD mode may be implemented using operator
         overloading, does not require storage of a computational graph, and is computationally tractable
         relative to the cost of a function evaluation. An implementation in C is discussed.",
       ad_theotech = "Generalized Jacobian"
}

