
"Fast Greeks through Adjoint Algorithmic Differentiation"

This is a free seminar, open to anyone from business or academia interested in financial engineering.

The Wilmott Forum, in partnership with NAG and the Certificate in Quantitative Finance (CQF) Program, will be hosting a Finance Focus Event on Thursday, July 26th, live in NYC and also via webcast.

Keynote Speakers:

Professor Uwe Naumann - RWTH Aachen University

"Fast Greeks through Adjoint Algorithmic Differentiation - and Further Speed-up through Mathematical and Structural Insight"

Derivatives of various objectives with respect to potentially large numbers of free parameters are crucial ingredients of many modern numerical algorithms.

Parameter calibration methods based on implementations of highly sophisticated mathematical models as computer programs are of fundamental interest in Computational Finance. Algorithmic Differentiation (AD) transforms the given computer programs into code for the computation of first (gradients, Jacobians) as well as second (Hessians) and higher derivatives.
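To make the idea concrete, here is a minimal, hypothetical sketch of forward-mode AD using dual numbers: each value carries its derivative alongside it, and the arithmetic operators are the "transformed" versions of `+` and `*` that propagate exact first derivatives through the computation. (This is purely illustrative; the production tools discussed in the talk handle full C/C++ programs, control flow, and higher derivatives.)

```python
# Minimal forward-mode AD via dual numbers (illustrative sketch only).
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value = value  # primal value
        self.deriv = deriv  # derivative carried alongside it

    def __add__(self, other):
        # Sum rule: derivatives add
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other):
        # Product rule applied automatically
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

# Differentiate f(x) = x*x + x at x = 3.0; exact answer is 2*3 + 1 = 7.
x = Dual(3.0, 1.0)   # seed derivative 1 for the variable of interest
f = x * x + x
# f.value == 12.0, f.deriv == 7.0 (exact, to machine accuracy)
```

Unlike finite differences, the derivative here is exact up to floating-point rounding, because the chain rule is applied symbolically at each elementary operation.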

Adjoint AD allows gradients to be computed at a cost that is independent of the number of parameters. For example, let the evaluation of a given function of one hundred parameters take one minute on your favorite computer. A sequential first-order finite difference approximation of the one hundred gradient entries takes about one hundred minutes. Adjoint AD delivers the same gradient with machine accuracy in less than ten minutes. Similar complexity results hold for second and higher derivatives.
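The cost independence can be illustrated with a toy reverse-mode (adjoint) sketch, assumed here for illustration only: every operation records its parents and local partial derivatives on a tape, and a single backward sweep then yields all one hundred gradient entries at once, where finite differences would need 101 function evaluations.

```python
# Toy reverse-mode (adjoint) AD: a global tape plus one backward sweep.
tape = []

class Var:
    def __init__(self, value, parents=()):
        self.value = value      # primal value
        self.parents = parents  # ((parent, local partial derivative), ...)
        self.adjoint = 0.0      # d(output)/d(self), filled in by gradient()
        tape.append(self)       # record in evaluation (topological) order

    def __add__(self, other):
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

def gradient(output, inputs):
    # One reverse sweep over the tape propagates adjoints to all inputs.
    output.adjoint = 1.0
    for node in reversed(tape):
        for parent, partial in node.parents:
            parent.adjoint += node.adjoint * partial
    return [x.adjoint for x in inputs]

# f(x) = sum of x[i] * x[i+1] over 100 parameters, with x[i] = i + 1.
xs = [Var(float(i + 1)) for i in range(100)]
y = xs[0] * xs[1]
for i in range(1, 99):
    y = y + xs[i] * xs[i + 1]

# One evaluation plus one backward sweep gives all 100 entries:
# d f / d x[0] = x[1] = 2.0, d f / d x[99] = x[98] = 99.0, etc.
grad = gradient(y, xs)
```

The backward sweep costs a small constant multiple of one function evaluation, regardless of the number of inputs, which is the source of the roughly tenfold speed-up quoted above.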

In this talk we review the fundamental ideas behind (adjoint) AD and we present software tools that support the semi-automatic generation of derivative code with special focus on C/C++. Further gains in robustness and computational complexity result from the exploitation of additional mathematical and structural insight. In particular, we discuss AD of numerical methods that are likely to be embedded into the given simulation code and concurrency in the context of adjoint AD.

Professor Uwe Naumann has been a professor of Computer Science at RWTH Aachen University, Germany, since 2004 and a member of NAG since 2009. After obtaining a Ph.D. in Applied Mathematics from the Technical University Dresden, Germany, in 1999 and prior to joining RWTH, he held post-doctoral positions at INRIA, France, the University of Hertfordshire, UK, and Argonne National Laboratory, USA.

Uwe's research and development activities are motivated by derivative-based numerical simulation and optimization in Computational Science and Engineering. His main focus lies on first- and higher-order discrete adjoint methods with major applications in the physical and engineering sciences and in computational finance. He is the author of "The Art of Differentiating Computer Programs: An Introduction to Algorithmic Differentiation", published by SIAM in 2012.

Uwe has worked as a NAG consultant delivering training and software solutions to customers for a variety of problems. He has contributed to the development of the NAG Libraries and NAG Fortran Compiler.

Uwe has given many public talks over his career; most recently he headlined at the 2012 Global Derivatives Trading & Risk Management conference in Barcelona.

John Holden - The Numerical Algorithms Group (NAG)

"Latest releases and news from NAG"

Key new mathematical functionality and technology developments will be shared. Highlights include the launch of the latest releases of the NAG C Library and the NAG Toolbox for MATLAB®.

A senior quant from a Tier 1 investment bank, commenting on the Mark 23 releases of the NAG Toolbox for MATLAB and the NAG C Library:

"We deploy production code in C++ embedding NAG C Library functions wherever we can, but often prototype new models in MATLAB before writing our C++ code; having the same NAG algorithms in MATLAB via the NAG Toolbox for MATLAB is a real win for us. Thank you, thank you.

"Obvious finance highlights from my point of view: i) matrix functions (especially the exponential), ii) additional nearest correlation matrix functions, iii) skip-ahead for the Mersenne Twister, as well as new local and global optimisation functions."

John Holden is NAG's Global Lead for the Financial Services Industry.

To register your interest in attending this event, please email Liz Galliford at [masked] before Wednesday, 25th July, with either "Webcast" or "Classroom" in the subject line of your email. You will receive a confirmation email next week containing the details of the lecture (including the lecture timings) and, in the case of the webcast, joining instructions for the session.
