AD Tools for All Languages

62 tools found

Alphabetical List of Tools

  • ad  ( python )
    Transparent, calculator-style first- and second-order derivatives.

  • AD Model Builder  ( C/C++ )
    AD Model Builder (ADMB) was specifically designed for complex highly-parameterized nonlinear models. ADMB uses automatic differentiation to provide the function optimizer with exact derivatives.

  • AD4CL  ( C/C++,OpenCL )
    Automatic Differentiation for GPU computing

  • ADC  ( C/C++ )
    The vivlabs ADC Automatic Differentiation Software for C/C++ delivers rapid integration of automatic differentiation capability into your new and existing applications on all operating system platforms. ADC automatically exploits the sparsity within your equation matrices, which leads to excellent performance for small, large, and extremely large applications.

  • ADEL  ( C/C++ )
    ADEL is an open-source C++ template library for Automatic Differentiation in forward mode. Works with CUDA out of the box.

  • Adept  ( C/C++ )
    Adept is an operator-overloading implementation of first-order forward- and reverse-mode automatic differentiation. It is very fast thanks to its use of expression templates and a very efficient tape structure: in reverse mode it is typically only 2.5-4 times slower than the original undifferentiated algorithm. It is released under the Apache License, Version 2.0.
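
    A rough reverse-mode usage sketch (our illustration, not taken from the Adept documentation; assumes Adept is installed and the program is linked against it):

      #include <adept.h>
      #include <iostream>

      int main() {
          adept::Stack stack;                     // tape for reverse-mode AD
          adept::adouble x[2] = {1.0, 2.0};
          stack.new_recording();                  // start recording from here

          adept::adouble y = x[0] * exp(-x[1]);   // function of interest

          y.set_gradient(1.0);                    // seed the output adjoint
          stack.compute_adjoint();                // reverse sweep over the tape

          std::cout << x[0].get_gradient() << " "    // dy/dx0 = exp(-x1)
                    << x[1].get_gradient() << "\n";  // dy/dx1 = -x0*exp(-x1)
          return 0;
      }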

  • ADF  ( Fortran77,Fortran95 )
    The vivlabs ADF Automatic Differentiation Software for FORTRAN delivers rapid integration of automatic differentiation capability into your new and existing applications on all operating system platforms. ADF automatically exploits the sparsity within your equation matrices, which leads to excellent performance for small, large, and extremely large applications.

  • ADG  ( Fortran 77/90,Fortran77,Fortran95 )
    The Adjoint Code Generator (ADG) is a source-to-source transformation tool for generating the adjoint model. Designed with the Least Program Behavior Decomposition Method, ADG supports global data-dependence analysis and statement-level code optimization.

  • ADIC  ( C/C++ )
    ADIC is a tool for the automatic differentiation (AD) of programs written in ANSI C. First derivatives are computed using forward mode with statement level preaccumulation. Second derivatives are computed using one of several forward mode strategies.

  • ADIFOR  ( Fortran77 )
    Given a Fortran 77 source code and a user's specification of dependent and independent variables, ADIFOR will generate an augmented derivative code that computes the partial derivatives of all of the specified dependent variables with respect to all of the specified independent variables in addition to the original result.

  • ADiGator  ( MATLAB )
    Given a user function program together with information pertaining to the inputs of the program, ADiGator performs source transformation via the overloaded CADA class to generate any order derivative code.

  • ADiJaC  ( Java )
    ADiJaC uses source code transformation to generate derivative codes in both the forward and the reverse modes of automatic differentiation.

  • ADiMat  ( MATLAB )
    ADiMat uses a hybrid approach of source transformation and object-oriented programming techniques to compute first- and second-order derivatives of MATLAB programs.

  • ADMAT / ADMIT  ( MATLAB )
    ADMAT 2.0 enables you to differentiate MATLAB functions, and allows you to compute gradients, Jacobian matrices and Hessian matrices of nonlinear maps defined via M-files. Both forward and reverse modes are included.

  • ADNumber  ( C/C++ )
    Automatic differentiation of arbitrary order to machine precision. Uses templates and expression trees.

  • ADOL-C  ( C/C++ )
    The package ADOL-C facilitates the evaluation of first and higher derivatives of vector functions that are defined by computer programs written in C or C++. The resulting derivative evaluation routines may be called from C/C++, Fortran, or any other language that can be linked with C. ADOL-C is distributed by the COIN-OR Foundation with the Common Public License CPL or the GNU General Public License GPL.
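
    A minimal usage sketch (our illustration, assuming a standard ADOL-C installation) showing taping with active variables and a reverse-mode gradient driver:

      #include <adolc/adolc.h>
      #include <iostream>

      int main() {
          const int n = 2;                  // number of independent variables
          double xp[n] = {1.0, 2.0}, yp;

          trace_on(1);                      // record f(x) = x0*x0 + 3*x1 on tape 1
          adouble x[n], y;
          x[0] <<= xp[0];                   // mark independents
          x[1] <<= xp[1];
          y = x[0] * x[0] + 3.0 * x[1];
          y >>= yp;                         // mark dependent
          trace_off();

          double g[n];
          gradient(1, n, xp, g);            // reverse-mode gradient from the tape
          std::cout << g[0] << " " << g[1] << "\n";   // 2*x0 = 2, and 3
          return 0;
      }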

  • ADOL-F  ( Fortran95 )
    The tool ADOL-F was an early attempt to use the overloading capabilities newly introduced in Fortran to create an execution trace. The idea was to replicate the format of the ADOL-C execution trace (a.k.a. the "tape") so that the ADOL-C drivers could be reused for the derivative computation. Because the active type that enables the execution trace lacked a "destructor", there was no means to curtail the growth of active locations (see ADOL-C). The tool is no longer maintained and is listed here only to keep the record complete.

  • APMonitor  ( Interpreted )
    The APMonitor Modeling Language is an interpreted language for algebraic and differential equations. As an interpreted language, it has the ability to provide analytic derivatives to almost any programming language.

  • April-ANN  ( Lua )
    April-ANN toolkit (A Pattern Recognizer In Lua with Artificial Neural Networks). This toolkit incorporates ANN algorithms together with other pattern recognition methods such as hidden Markov models (HMMs). Automatic differentiation is currently available in an experimental stage for advanced machine learning research. Feel free to contact us if you want to collaborate on the development of the autodiff package.

  • AuDi  ( C/C++,python )
    AuDi is an open-source, header-only C++ library for AUtomated DIfferentiation that implements a truncated Taylor polynomial algebra (a.k.a. differential algebra). Its core is also exposed as a Python module called pyaudi.

  • AUTODIF  ( C/C++ )
    A C++ library for automatic differentiation used as the building block for AD Model Builder

  • AutoDiff .NET  ( .NET )
    A simple .NET library for evaluating the value/gradient of a function using reverse-mode automatic differentiation.

  • AutoDiff_Library  ( C/C++ )
    This standalone AD library builds the computational graph and performs reverse gradient as well as reverse Hessian and Hessian-vector product algorithms on the graph. It is currently used in the parallel implementation of the Structured Modelling Language (http://www.maths.ed.ac.uk/ERGO/sml).

  • AUTO_DERIV  ( Fortran77,Fortran95 )
    AUTO_DERIV is a Fortran 90 module which can be used to evaluate the first and second derivatives of any continuous function with any number of independent variables. The function can be implicitly encoded in Fortran 77/90; only slight modifications in user code are required.

  • CasADi  ( C/C++,python )
    CasADi is a symbolic framework for numeric optimization implementing automatic differentiation in forward and reverse modes on sparse matrix-valued computational graphs.

  • CoDiPack  ( C/C++ )
    CoDiPack is a C++-library that enables the computation of gradients in computer programs using Algorithmic Differentiation. It is based on the Operator Overloading approach and uses static polymorphism and expression templates, resulting in an extremely fast evaluation of adjoints or forward derivatives. It is specifically designed with HPC applications in mind.

  • COJAC  ( Java )
    COJAC uses bytecode instrumentation to automatically enrich floats/doubles at runtime; the prototype offers both forward and reverse mode AD. The idea is presented in this short video: https://youtu.be/eAy71M34U_I?list=PLHLKWUtT0B7kNos1e48vKhFlGAXR1AAkF

  • ColPack  ( C/C++ )
    ColPack is a package consisting of implementations of various graph coloring and related algorithms for compression-based computation of sparse Jacobian and Hessian matrices using an Automatic Differentiation tool. ColPack is currently interfaced with ADOL-C. The coloring capabilities can be used for purposes other than derivative matrix computation.

  • COSY INFINITY  ( Fortran77,Fortran95,C/C++ )
    COSY is an open platform to support automatic differentiation, in particular to high order and in many variables. It also supports validated computation of Taylor models. The tools can be used as objects in F95 and C++ and through direct calls in F77 and C, as well as in the COSY scripting language which supports dynamic typing.

  • CppAD  ( C/C++ )
    CppAD uses operator overloading to compute derivatives of algorithms defined in C++. It is distributed by the COIN-OR Foundation with the Common Public License CPL or the GNU General Public License GPL. Installation procedures are provided for both Unix and Windows operating systems. The CppAD subversion repository can be used to view the source code. Extensive user and developer documentation is included.
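
    A minimal usage sketch (our illustration, assuming a standard CppAD installation) of recording an operation sequence and evaluating its Jacobian:

      #include <cppad/cppad.hpp>
      #include <iostream>
      #include <vector>

      int main() {
          using CppAD::AD;

          std::vector<AD<double>> x(2);
          x[0] = 1.0;  x[1] = 2.0;
          CppAD::Independent(x);                     // start recording

          std::vector<AD<double>> y(1);
          y[0] = x[0] * sin(x[1]);                   // f(x) = x0 * sin(x1)

          CppAD::ADFun<double> f(x, y);              // stop recording, f : R^2 -> R

          std::vector<double> x0 = {1.0, 2.0};
          std::vector<double> jac = f.Jacobian(x0);  // [df/dx0, df/dx1]
          std::cout << jac[0] << " " << jac[1] << "\n";
          return 0;
      }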

  • CppADCodeGen  ( C/C++ )
    CppADCodeGen aims to extend the CppAD library in order to perform hybrid automatic differentiation, that is, to use operator overloading and generate/compile source code. It provides easy-to-use drivers for the generation and use of dynamic libraries under Linux, and also allows JIT compilation through Clang/LLVM. It is distributed under the Eclipse Public License 1.0 or the GNU General Public License 3 (GPL).

  • CTaylor  ( C/C++ )
    High-performance library for computing with truncated Taylor series. It can use multiple independent variables and stores only potentially nonzero derivatives. The order of derivatives increases under nonlinear operations until a maximum (a parameter) is reached. Based on Google's libtaylor and makes heavy use of boost::mpl.

  • CurvFit, app from FC-Compiler  ( Language independent,FortranCalculus )
    CurvFit (tm) is a nonlinear curve-fitting program. Sine, damped sine, Lorentz, modified Lorentz, power (i.e., polynomial), and exponential series are presently available models for matching your data. We strongly suggest trying a Lorentz series for data with multiple peaks or valleys. A calculator is included for interpolation and/or extrapolation of given data. CurvFit has proven excellent for hard-to-fit data; such data may take more time, but it can be fitted given the right series and parameter values. To start, try curve fitting your data with a Lorentz series.

  • DFT  ( Fortran 77/90,Fortran77,Fortran95 )
    DFT is a source-to-source transformation tool for generating the tangent linear model; it supports global data-dependence analysis and statement-level code optimization.

  • DiffSharp  ( .NET,F#,C# )
    DiffSharp is an automatic differentiation (AD) library implemented in the F# language. It supports C# and the other Common Language Infrastructure languages. The library is under active development by Atılım Güneş Baydin and Barak A. Pearlmutter, mainly for research applications in machine learning, as part of their work at the Brain and Computation Lab, Hamilton Institute, National University of Ireland Maynooth. Please visit the project website for detailed documentation and usage examples.

  • FAD  ( C/C++ )
    An implementation of automatic differentiation for programs written in C++ using operator overloading and expression templates.

  • FADBAD/TADIFF  ( C/C++ )
    FADBAD is a C++ library implementing the forward and reverse mode of automatic differentiation by operator overloading for C++ programs. TADIFF is a C++ program package for performing Taylor expansions on functions implemented as C++ programs.
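
    A minimal forward-mode sketch with FADBAD's F<> type (our illustration; assumes the FADBAD++ headers are on the include path):

      #include "fadiff.h"                  // FADBAD++ forward-mode type F<>
      #include <iostream>

      int main() {
          fadbad::F<double> x = 2.0, y = 3.0;
          x.diff(0, 2);                    // x is independent variable 0 of 2
          y.diff(1, 2);                    // y is independent variable 1 of 2

          fadbad::F<double> f = x * y + sin(x);

          std::cout << "f     = " << f.x()  << "\n"   // value
                    << "df/dx = " << f.d(0) << "\n"   // y + cos(x)
                    << "df/dy = " << f.d(1) << "\n";  // x
          return 0;
      }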

  • FFADLib  ( C/C++ )
    FFADLib implements overloaded C++ arithmetic operators and elementary functions that employ fast automatic differentiation algorithms. Such algorithms use precomputed addresses of the derivatives in the data structure.

  • finmath-lib automatic differentiation extensions  ( Java )
    Implementation of stochastic automatic differentiation (AD/AAD) for Monte Carlo simulations.

  • FortranCalculus Compiler  ( Fortran77 )
    FortranCalculus (FC) language is for math modeling, simulation, optimization, and parameter tweaking. FortranCalculus is based on Automatic Differentiation (AD) and Operator Overloading, which simplify computer code to an absolute minimum; i.e., a mathematical model, constraints, and the objective (function) definition. Minimizing the amount of code allows the user to concentrate on the science or engineering problem at hand and not on the (numerical) process requirements to achieve an optimum solution. There are some 100+ example math problems from industry in the 'demo' section for users to browse, run, and copy for building their own math problems. At least for the next year or two, FC-Compiler is free! Try it, you'll like it.

  • ForwardDiff.jl  ( Julia )
    The ForwardDiff package provides an implementation of forward-mode automatic differentiation (FAD) in Julia.

  • FunG  ( C/C++ )
    A library for simple and efficient generation of nonlinear functions and their first-, second-, and third-order derivatives. The focus is on invariant-based models, such as in nonlinear elasticity, and functions that pass the assembly process in FE computations. Supports scalars, vectors, matrices, and more general types satisfying a (relaxed) vector space structure.

  • GRESS  ( Fortran77 )
    GRESS (Gradient-Enhanced Software System) reads an existing Fortran code as input and produces an enhanced Fortran code as output. The enhanced code has additional lines of coding for calculating derivative information analytically using the rules of calculus. The enhanced model reproduces the reference model calculations and has the additional capability to compute derivatives and sensitivities specified by the user. The user also specifies whether the direct or adjoint method is to be used in computing sensitivities.

  • HSL_AD02  ( Fortran95 )
    Provides automatic differentiation facilities for variables specified by Fortran code. Each active variable must be declared to be of a derived type defined by the package instead of real. The backward method is available for first and second derivatives. The forward method is available for derivatives of any order.

  • INTLAB  ( MATLAB )
    INTLAB is a MATLAB toolbox for self-validating algorithms.

  • NAGWare Fortran 95   ( Fortran77,Fortran95 )
    The NAGWare Fortran 95 Compiler is being extended to provide AD functionality. The first prototype will be distributed to beta testers by November 2002.

  • OpenAD  ( C/C++,Fortran77,Fortran95 )
    OpenAD is a source transformation tool that provides a language independent framework for the development and use of AD algorithms. It interfaces with language specific front-ends via an XML representation of the numerical core. Currently, Open64 is the front-end for FORTRAN and EDG/Sage3 the front-end for C/C++.

  • PCOMP  ( Fortran77 )
    PCOMP implements the forward and reverse mode for functions written in a FORTRAN-like modeling language, a subset of FORTRAN with a few extensions. First- and second-order derivatives are supported.

  • pyadolc  ( python )
    Python Wrapper of ADOL-C

  • pycppad  ( Interpreted,python )
    A boost::python interface to the C++ Algorithmic Differentiation package CppAD. The pycppad package is distributed under the BSD license.

  • Rapsodia  ( C/C++,Fortran95 )
    Rapsodia is a Python-based code generator that creates C++ or Fortran libraries to efficiently compute higher-order derivatives via operator overloading.

  • Sacado  ( C/C++ )
    The Sacado package provides automatic differentiation tools for C++ applications and is part of the larger Trilinos framework. It provides both forward and reverse modes, and leverages expression templates in the forward mode and a simplified tape data structure in the reverse mode for improved efficiency.
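
    A minimal forward-mode sketch using Sacado's dynamically sized Fad type (our illustration; assumes a Trilinos/Sacado installation):

      #include <Sacado.hpp>
      #include <iostream>

      int main() {
          typedef Sacado::Fad::DFad<double> FadType;

          const int n = 2;                   // number of independent variables
          FadType x(n, 0, 1.0);              // value 1.0, derivative component 0
          FadType y(n, 1, 2.0);              // value 2.0, derivative component 1

          FadType f = x * x * y + sin(x);    // derivatives propagate forward

          std::cout << "f     = " << f.val() << "\n"
                    << "df/dx = " << f.dx(0) << "\n"   // 2*x*y + cos(x)
                    << "df/dy = " << f.dx(1) << "\n";  // x*x
          return 0;
      }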

  • Stan Math Library  ( C/C++ )
    Forward- and reverse-mode implementations for probability, linear algebra, and ODE applications.
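
    A minimal reverse-mode sketch with the var type (our illustration; assumes the Stan Math headers and their dependencies are available):

      #include <stan/math.hpp>
      #include <iostream>

      int main() {
          using stan::math::var;

          var x = 1.5, z = 0.5;
          var y = x * x * z + stan::math::exp(z);

          y.grad();                          // reverse pass seeded at y
          std::cout << "dy/dx = " << x.adj()
                    << ", dy/dz = " << z.adj() << "\n";

          stan::math::recover_memory();      // release the autodiff stack
          return 0;
      }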

  • TAF  ( Fortran 77/90,Fortran2003,Fortran2008,Fortran77,Fortran95 )
    Transformation of Algorithms in Fortran (TAF) is a source-to-source AD-tool for Fortran-95 programs. TAF supports forward and reverse mode of AD and Automatic Sparsity Detection (ASD) for detection of the sparsity structure of Jacobians.

  • TAMC  ( Fortran77 )
    TAMC is a source-to-source AD-tool for FORTRAN-77 programs. The generated code propagates derivatives in forward (tangent linear) or reverse (adjoint) mode. TAMC is very flexible thanks to many options and user directives.

  • TAPENADE  ( C/C++,Fortran77,Fortran95 )
    TAPENADE is a source-to-source AD tool. Given a FORTRAN77, FORTRAN95, or C source program, it generates its derivative in forward (tangent) or reverse (adjoint) mode. TAPENADE is the successor of ODYSSEE. TAPENADE is directly accessible through a web servlet, or can be downloaded locally.

  • TaylUR  ( Fortran95 )
    TaylUR is a Fortran 95 module to automatically compute the numerical values of a complex-valued function's derivatives w.r.t. several variables up to an arbitrary order in each variable, but excluding mixed derivatives.

  • The Taylor Center  ( Delphi,Language independent )
    ODE Solver for Initial Value Problems (IVPs) given in the form of a system of 1st order explicit ODEs. The integration is based on Automatic Differentiation of the right hand sides entered in a conventional mathematical notation. The package is an All-In-One advanced GUI application with near real time animation of the solution in 2D or in 3D stereo with a conventional monitor and Red/Blue glasses.

  • TOMLAB /MAD  ( MATLAB )
    The TOMLAB /MAD package introduces automatic differentiation for MATLAB users via operator overloading. Together with the TOMLAB Base Module, TOMLAB /MAD provides complete integration for advanced optimization applications, with more than 100 algorithms available. MAD can also be used as a stand-alone package by the MATLAB user.

  • TOMLAB /TomSym  ( MATLAB )
    TomSym uses MATLAB objects and operator overloading to capture MATLAB procedures, and then generates source code for derivatives of any order. TomSym also integrates with the TOMLAB optimization environment to provide an easy-to-use interface for a broad range of optimization problems.

  • Treeverse / Revolve  ( C/C++,Fortran77,Fortran95 )
    Revolve implements an efficient checkpointing algorithm for the exact computation of the gradient of a functional whose evaluation involves a (pseudo) time-stepping procedure.

  • YAO  ( C/C++ )
    YAO is dedicated to the programming of numerical models and data assimilation. It is based on a modular graph methodology in which each module represents a function. YAO facilitates and generates the coding of the tangent linear and adjoint models.

  
