
Programme of the Seventh Euro AD Workshop
Monday, November 24, 2008
 10:00–10:30 Coffee and Welcome
 10:30–12:30 AD Tools, Theory and Applications
 Sebastian Walter (HU Berlin)
Higher Order Forward and Reverse Mode on Matrices with Application to Optimal Experimental Design
 John Pryce (Cranfield University)
A standard for interval arithmetic: IEEE working group P1788
 Johannes Willkomm (RWTH Aachen University)
Comparison of continuous and discrete adjoints in the IHCP
We apply ADOL-C in reverse mode to the C++ finite-element code DROPS. The derivative is needed for an optimization in the context of parameter estimation. We report on results such as the computational efficiency and the convergence behaviour of the objective function.
 Paul Hovland (Argonne National Laboratory)
Discussion on differentiated BLAS
To follow
 12:30–14:00 Lunch
 14:00–15:30 AD Tools, Theory and Applications
 Andrea Walther (TU Dresden)
Parallel Computation of Greeks Using ADOL-C
After discussing the recent features integrated into ADOL-C version 2.0, the talk presents the handling of OpenMP-parallelized function evaluations by ADOL-C, allowing the derivative calculation to be performed in parallel as well. As a numerical example, we employ the LIBOR code of Mike Giles, extended to a parallel evaluation of the path sensitivities required in a real-world application. Preliminary timing results are given.
 Laurent Hascoet (INRIA Sophia-Antipolis, France)
Data-Flow reversal for Pass-by-Value parameters
The Automatic Differentiation tool Tapenade shares a
common kernel for Fortran and C source programs. This
kernel must therefore handle the pass-by-value behavior
in addition to the pass-by-value-return and pass-by-reference
behaviors it already supported. This has interesting
consequences in the context of the reverse mode of AD.
In the general case, we show that call-by-value arguments
require the creation of two differentiated variables. In
some cases, however, a single differentiated variable is
enough. We are able to specify and detect these cases in
terms of simple data-flow properties that the AD tool
analyses automatically.
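One way to picture why an overwritten by-value parameter involves two differentiated variables is a hand-written reverse-mode sketch. The function `f` and its adjoint below are invented for illustration (in Python rather than Tapenade's Fortran/C setting), not Tapenade's generated code:

```python
def f(x):
    # Call-by-value: the callee may freely overwrite its private copy of x.
    x = x * x
    return x + 1.0

def f_adj(x, y_bar):
    # Hand-written reverse-mode adjoint of f (illustrative only).
    # Because f overwrites its by-value parameter, the adjoint code in
    # general needs two differentiated variables: one for the overwritten
    # local x, and one for the incoming value, whose adjoint is handed
    # back so the caller can accumulate it into its own adjoint variable.
    x0 = x                       # value of x before the overwrite
    x_bar = y_bar                # adjoint of the (overwritten) local x
    x0_bar = 2.0 * x0 * x_bar    # adjoint of the incoming by-value argument
    return x0_bar

# Caller side: a_bar += f_adj(a, y_bar) accumulates into the caller's adjoint.
```

For f(x) = x**2 + 1, the returned adjoint at x = 3 with y_bar = 1 is the derivative 2x = 6.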
 Jan Riehme (RWTH Aachen University)
CompAD II & III: The AD-enabled NAGWare Fortran 95 compiler
We report on the results of the recently completed CompAD-II
project, which aimed to incorporate basic source-transformation
capabilities for adjoint code generation into an
industrial-strength Fortran 95 compiler.
We will show a number of successful applications of the
AD-enabled compiler and discuss the further development
towards efficient adjoints in the upcoming CompAD-III
project.
 15:30–16:00 Coffee
 16:00–18:00 Special Session: Applications and Challenges in Economics and Finance
 Luca Guerrieri (Federal Reserve)
Use of AD for maximum likelihood estimation
A key application of automatic differentiation (AD) is to facilitate numerical optimization. Such problems are at the core of many estimation techniques, including maximum likelihood. In one of the first applications of AD in the field of economics, we used Tapenade to construct derivatives for the likelihood function of any linear or linearized general equilibrium model solved under the assumption of rational expectations.
We view our main contribution as providing an important check on finite-difference numerical derivatives. We also construct Monte Carlo experiments to compare maximum-likelihood estimates obtained with and without the aid of automatic derivatives. We find that the convergence rate of our optimization algorithm can increase substantially when we use AD derivatives.
 Luca Capriotti (Credit Suisse)
No more bumping: the promise and challenges of Adjoint Algorithmic Differentiation
Hedging complex derivative securities typically requires estimating the sensitivity of their price with respect to a large number of risk factors. In the common case in which a portfolio of derivatives is priced by numerical techniques, such as partial differential equations or Monte Carlo simulations, these sensitivities are often obtained by perturbing (bumping) each risk factor in turn and repeating the price calculation in order to form standard finite-difference estimators. As a result, estimating the risk of such securities is a computationally demanding task that often requires the use of large parallel computers. Adjoint Algorithmic Differentiation (AAD), recently introduced to the computational finance community, represents a much more efficient approach, often resulting in computational savings of orders of magnitude. In this talk, I present an overview of some applications of AAD to risk management and discuss some of the challenges of its practical implementation in a financial-industry setting.
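The cost gap between the two approaches can be seen on a toy example. The pricing function below is invented (a real portfolio valuation would involve PDE or Monte Carlo solvers): bumping costs one extra model evaluation per risk factor, while a hand-coded reverse sweep delivers all sensitivities at a small constant multiple of one evaluation.

```python
import math

def price(s, sigma, r, t):
    # Stand-in for an expensive pricing model (hypothetical closed form).
    return s * math.exp(-r * t) * sigma ** 2

def greeks_bumping(s, sigma, r, t, h=1e-7):
    # Finite-difference "bumping": one extra evaluation per risk factor.
    base = price(s, sigma, r, t)
    return [(price(s + h, sigma, r, t) - base) / h,
            (price(s, sigma + h, r, t) - base) / h,
            (price(s, sigma, r + h, t) - base) / h]

def greeks_adjoint(s, sigma, r, t):
    # Reverse sweep written by hand: all sensitivities from a single
    # backward pass, regardless of the number of risk factors.
    d = math.exp(-r * t)                 # forward intermediate
    s_bar = d * sigma ** 2               # d(price)/ds
    sigma_bar = 2.0 * s * d * sigma      # d(price)/dsigma
    r_bar = -t * s * d * sigma ** 2      # d(price)/dr
    return [s_bar, sigma_bar, r_bar]
```

With three risk factors the gap is negligible; with thousands, the single-sweep adjoint is what makes the orders-of-magnitude savings possible.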
 Thomas Kaminski (FastOpt)
Efficient Computation of Hedge Sensitivities via Automatic Differentiation
 Benjamin Skrainka (University College London)
Automating the Implicit Function Theorem in Economics
Many economic models reduce to the solution of a
finite-dimensional system of nonlinear but smooth equations.
Furthermore, there are often special cases in which the
solution can be computed directly. The implicit function
theorem can be used to compute Taylor series expansions for
the solutions in terms of a parameter representing the
distance from the solvable cases, and these Taylor series
have nontrivial radii of convergence when the nonlinear
equations are analytic functions. We illustrate the general
idea with some simple examples coded in Mathematica and
argue that user-friendly tools based on automatic
differentiation would find many users in economics.
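The mechanics can be sketched in a few lines (a scalar illustration with an invented equilibrium condition, not one of the talk's Mathematica examples):

```python
import math

def g(x, eps):
    # Invented equilibrium condition: x = 1 + eps * x**2,
    # exactly solvable at the special case eps = 0 (where x = 1).
    return x - 1.0 - eps * x * x

def dx_deps(x, eps):
    # Implicit function theorem: dx/deps = -(dg/deps) / (dg/dx),
    # valid as long as dg/dx is nonzero.
    g_x = 1.0 - 2.0 * eps * x
    g_eps = -x * x
    return -g_eps / g_x

# First-order Taylor expansion around the solvable case (eps = 0, x = 1):
x0, slope = 1.0, dx_deps(1.0, 0.0)
def x_approx(eps):
    return x0 + slope * eps

# Exact branch for comparison: x = (1 - sqrt(1 - 4*eps)) / (2*eps).
```

Higher-order terms would be obtained the same way, by differentiating the implicit relation repeatedly, which is where automatic differentiation replaces hand or symbolic derivatives.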
 18:00 Dinner (on your own)

Tuesday, November 25, 2008
 9:00–11:00 Special Session: Applications and Challenges in Economics and Finance
 Hans Skaug (University of Bergen)
Automated Likelihood Based Inference for Stochastic Volatility Models using AD Model Builder
We use the Laplace approximation to fit stochastic volatility (SV) models.
The models are implemented in the software package AD Model Builder,
whose computational engine relies on third-order automatic differentiation (AD).
The algorithms will be described, and comparisons to other estimation
methods will be made on both real and simulated data. It is found that
the approach substantially reduces computation time
relative to Markov chain Monte Carlo (MCMC) algorithms.
 Tore Selland Kleppe (University of Bergen)
Fitting general stochastic volatility models using simulated maximum likelihood and AD
Stochastic volatility (SV) models are used to model the time-varying volatility in financial price-return data. They typically consist of an observed price process driven by a latent volatility process, where the two processes have a joint Markov structure. Likelihood calculations are made hard by the fact that the latent volatility process must be integrated out of the joint likelihood, resulting in integrals over high-dimensional spaces.
Laplace importance samplers have successfully been applied to likelihood calculations in the log-normal SV model. In the work presented here, we provide a framework that extends the class of SV models that can be fitted efficiently, using a change of integration variable and Laplace importance samplers.
Calculations in this framework involve the maximization and Hessian evaluation of a highly nontrivial function. To implement this, we use several techniques from the AD catalogue, including mixed forward and reverse AD, local preaccumulation, and AD applied to iterative solvers.
In the talk, we give an outline of the computational algorithm with emphasis on AD issues and briefly discuss an application to real data.
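The scalar skeleton of such a calculation is small (a minimal sketch; in the SV setting the mode and Hessian are high-dimensional and would be supplied by an AD tool rather than by hand):

```python
import math

def laplace_integral(h, dh, d2h, v0, tol=1e-10):
    # Laplace approximation of the integral of exp(-h(v)) over v:
    # locate the mode of exp(-h) by Newton's method on h, then match
    # a Gaussian there. dh and d2h are the first and second derivatives
    # of h -- in practice computed by AD, hand-coded in this sketch.
    v = v0
    for _ in range(100):
        step = dh(v) / d2h(v)
        v -= step
        if abs(step) < tol:
            break
    return math.exp(-h(v)) * math.sqrt(2.0 * math.pi / d2h(v))
```

For a Gaussian integrand the approximation is exact; importance sampling then corrects the remaining error for non-Gaussian integrands.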
 Olivier Pironneau (UPMC  Paris VI)
Experience with A.D. in Finance
I have been using my own C++ toolbox and my student's toolbox, called fad, which uses traits and expression templates. Comparisons with the reverse mode and with Odyssée will be given, along with applications to Greeks and calibration.
 Alex Prideaux (Oxford University)
Use of adjoint methods with computational finance PDEs
 11:00–11:30 Coffee
 11:30–13:00 AD for MATLAB
 Cosmin Bocaniala (Cranfield University)
AD for Collision Avoidance Trajectory Optimisation
Providing Unmanned Aerial Vehicles (UAVs) with collision-avoidance capabilities that ensure at least the level of safety currently provided by human pilots is one of the main requirements set by national and international aviation regulatory bodies before UAVs can be allowed to operate routinely in non-segregated airspace.
This talk discusses the use of AD tools within a novel technique, Gradient-aided Swarm Optimization (GRASP), employed to adjust ideal collision-avoidance trajectory solutions, provided by the KB3D geometric approach, to a realistic UAV kinematic model and a realistic radar sensor model.
 Andre Vehreschild (RWTH Aachen University)
ADiMat: On the way to compute derivatives in MATLAB efficiently.
ADiMat enables the computation of derivatives for MATLAB codes in two ways: (1) use a MATLAB class to propagate a number of desired directional derivatives during a single execution of the differentiated program, or (2) execute the differentiated program multiple times, seeding one directional derivative per execution. In my talk I will describe the two approaches and present some performance results.
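The difference between the two approaches can be sketched with a minimal forward-mode value class (a Python illustration of the general idea, not ADiMat's actual MATLAB implementation):

```python
class Dual:
    # Forward-mode value carrying several directional derivatives at once.
    def __init__(self, val, dots):
        self.val, self.dots = val, list(dots)
    def __add__(self, other):
        return Dual(self.val + other.val,
                    [a + b for a, b in zip(self.dots, other.dots)])
    def __mul__(self, other):
        # Product rule, applied to each seeded direction.
        return Dual(self.val * other.val,
                    [a * other.val + self.val * b
                     for a, b in zip(self.dots, other.dots)])

def f(x, y):
    return x * y + x * x

# Approach 1: one execution, both directions seeded simultaneously;
# z.dots then holds the full gradient [df/dx, df/dy].
z = f(Dual(3.0, [1.0, 0.0]), Dual(2.0, [0.0, 1.0]))

# Approach 2 ("strip mining"): one execution per seeded direction.
zx = f(Dual(3.0, [1.0]), Dual(2.0, [0.0]))
zy = f(Dual(3.0, [0.0]), Dual(2.0, [1.0]))
```

Approach 1 amortizes the function evaluation over all directions at the cost of wider derivative objects; approach 2 keeps memory flat but re-executes the program per direction.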
 Marina Menshikova (Cranfield University)
Software for uncertainty estimation using AD in Matlab
Computational models have long been used to predict the performance of
some baseline design given its design parameters. Owing to
inconsistencies in manufacturing, the manufactured product always
deviates from the baseline design. There is currently much interest
both in evaluating the effects of variability in design parameters on a
design's performance (uncertainty estimation) and in robust optimization
of the baseline design, such that near-optimal performance is obtained
despite variability in design parameters. Traditionally, uncertainty
analysis is performed by expensive Monte Carlo methods. This work
presents a software package for computing statistical moments in
Matlab using an automatic differentiation tool. It explains
the mathematical background, introduces the interface, and
demonstrates the performance on several examples.
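The simplest instance of such a moment computation is first-order ("delta method") propagation, sketched below. The derivative is hand-coded here; in a package of this kind it would come from an AD tool:

```python
def propagate_moments(f, df, mu, var):
    # First-order propagation of input uncertainty through f:
    #   mean[f(X)] ~ f(mu),  var[f(X)] ~ f'(mu)**2 * var
    # for an input X with mean mu and variance var.
    # Exact for linear f; a cheap alternative to Monte Carlo otherwise.
    return f(mu), df(mu) ** 2 * var

# Example: f(x) = x**2 with an input of mean 3.0 and variance 0.01.
mean_f, var_f = propagate_moments(lambda x: x * x,
                                  lambda x: 2.0 * x,
                                  mu=3.0, var=0.01)
```

Higher-order moments follow the same pattern but require higher derivatives, which is what makes AD attractive over hand differentiation.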
 13:00–14:00 Lunch
 14:00 Close

