Publication: Automatic Differentiation Applied to Convex Optimization

- incollection -

Author(s)
Eric Hassold, André Galligo

Published in
Computational Differentiation: Techniques, Applications, and Tools

Editor(s)
Martin Berz, Christian Bischof, George Corliss, Andreas Griewank

Year
1996

Publisher
SIAM

Abstract
In several applications, one is led to minimize a nonlinear function f that is differentiable or at least admits gradients almost everywhere. In this paper, we outline optimization algorithms that rely on explicit computations of gradients or limits of gradients, using specific automatic differentiation techniques. We consider functions represented by Fortran programs. We suppose that singularities are consequences either of "branching" operations (absolute value, max, conditional structures with simple tests) or of classical elementary functions (square root when computing an Euclidean norm), which generate kinks where f admits a directional derivative in any direction. We present algorithms and implementations on top of the automatic differentiation system Odyssée to compute directional derivatives and limits of gradients that allow descriptions of normal cones. Together with the input Fortran code, this is used by our optimization library Odymin to minimize f. Finally, we discuss the capability, efficiency, and extensibility of our approach. We compare the number of calls required by different strategies for a classical example.
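The "kink" behavior the abstract refers to can be illustrated with a minimal forward-mode sketch (hypothetical Python, not the paper's Fortran/Odyssée implementation): propagating a (value, derivative) pair through a branching operation such as `abs` yields a directional derivative that exists in every direction, even at the point where the gradient itself is undefined.

```python
# Minimal forward-mode sketch (illustrative only; not the paper's Odyssée/Odymin code).
# A dual number (value, directional derivative) is propagated through the
# "branching" operation abs(). At the kink x = 0 the gradient of f(x) = |x|
# is undefined, but the directional derivative along any direction d exists
# and equals |d|.

class Dual:
    def __init__(self, val, dot):
        self.val = val   # function value
        self.dot = dot   # derivative along the chosen direction

    def __abs__(self):
        if self.val > 0:
            return Dual(self.val, self.dot)
        elif self.val < 0:
            return Dual(-self.val, -self.dot)
        else:
            # At the kink: take the one-sided limit in the perturbation direction.
            return Dual(0.0, abs(self.dot))

def dir_deriv_abs(x, d):
    """Directional derivative of f(x) = |x| at x along direction d."""
    return abs(Dual(x, d)).dot

print(dir_deriv_abs(2.0, 1.0))   # smooth region: 1.0
print(dir_deriv_abs(0.0, 1.0))   # at the kink, direction +1: 1.0
print(dir_deriv_abs(0.0, -1.0))  # at the kink, direction -1: 1.0
```

The paper's contribution goes further, computing limits of gradients to describe normal cones for use in bundle-type methods; the sketch above only shows the elementary directional-derivative rule at a single kink.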

Cross-References
Berz1996CDT

BibTeX
@INCOLLECTION{Hassold1996ADA,
       author = "Eric Hassold and Andr{\'e} Galligo",
       editor = "Martin Berz and Christian Bischof and George Corliss and Andreas Griewank",
       title = "Automatic Differentiation Applied to Convex Optimization",
       booktitle = "Computational Differentiation: Techniques, Applications, and Tools",
       pages = "287--297",
       publisher = "SIAM",
       address = "Philadelphia, PA",
       key = "Hassold1996ADA",
       crossref = "Berz1996CDT",
       abstract = "In several applications, one is led to minimize a nonlinear function $f$ that is
         differentiable or at least admits gradients almost everywhere. In this paper, we outline
         optimization algorithms that rely on explicit computations of gradients or limits of gradients,
         using specific automatic differentiation techniques. We consider functions represented by Fortran
         programs. We suppose that singularities are consequences either of ``branching''
         operations (absolute value, max, conditional structures with simple tests) or of classical
         elementary functions (square root when computing an Euclidean norm), which generate kinks where $f$
         admits a directional derivative in any direction. We present algorithms and implementations on top
         of the automatic differentiation system Odyss\'ee to compute directional derivatives and
         limits of gradients that allow descriptions of normal cones. Together with the input Fortran code,
         this is used by our optimization library Odymin to minimize $f$. Finally, we discuss the capability,
         efficiency, and extensibility of our approach. We compare the number of calls required by different
         strategies for a classical example.",
       keywords = "Optimization, nonsmooth optimization, directional Taylor expansion, convex
         optimization, bundle methods, Odyss\'ee.",
       referred = "[Berz2002TaU], [Dignath2002AAa].",
       year = "1996"
}

