
Automatic differentiation (AD) is a family of techniques that evaluate derivatives at machine precision with only a small constant factor of overhead, by systematically applying the chain rule of calculus at the elementary operator level. This talk will cover the DiffSharp and Hype libraries, both of which build on AD and are implemented in F#.
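The idea of applying the chain rule at each elementary operation can be sketched with forward-mode AD over dual numbers. This is an illustrative Python sketch, not DiffSharp's implementation; the `Dual` class and `diff` function are hypothetical names for the purpose of the example:

```python
import math

class Dual:
    """A value paired with its derivative (the tangent)."""
    def __init__(self, primal, tangent=0.0):
        self.primal, self.tangent = primal, tangent

    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._lift(other)
        # sum rule: (u + v)' = u' + v'
        return Dual(self.primal + other.primal, self.tangent + other.tangent)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._lift(other)
        # product rule: (u v)' = u' v + u v'
        return Dual(self.primal * other.primal,
                    self.tangent * other.primal + self.primal * other.tangent)
    __rmul__ = __mul__

def sin(x):
    # chain rule at an elementary op: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.primal), math.cos(x.primal) * x.tangent)

def diff(f, x):
    """Exact derivative of f at x, computed in one forward pass."""
    return f(Dual(x, 1.0)).tangent

d = diff(lambda x: x * sin(x), 2.0)   # d/dx [x sin x] = sin x + x cos x
print(d)
```

Because every elementary operation propagates both a value and a derivative, the result is exact to machine precision, with roughly constant overhead per operation, unlike finite differences.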

DiffSharp is an AD library for the .NET ecosystem. The library has been designed with machine learning applications in mind, allowing very succinct implementations of models and optimization routines. DiffSharp is implemented in F# and exposes forward and reverse AD operators as general nestable higher-order functions. It provides high-performance linear algebra primitives (scalars, vectors, and matrices, with a generalization to tensors underway) that are fully supported by all the AD operators, and run using the highly optimized OpenBLAS library. DiffSharp currently uses operator overloading, but we are developing a transformation-based version of the library using F#'s "code quotations" metaprogramming facility. Work on a CUDA-based GPU backend is also underway.
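Reverse-mode AD, the other operator family mentioned above, records the computation and then propagates adjoints backwards, which makes gradients of many-input functions cheap. The following is a minimal illustrative sketch in Python, with hypothetical names (`Var`, `backward`), not DiffSharp's API:

```python
class Var:
    """A scalar node in a computation graph for reverse-mode AD."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents          # (parent node, local derivative) pairs

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self):
        # topologically sort the graph, then propagate adjoints in reverse
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for parent, _ in node._parents:
                    visit(parent)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            for parent, local in node._parents:
                parent.grad += local * node.grad

x, y = Var(3.0), Var(4.0)
z = x * y + x                  # z = xy + x
z.backward()
print(x.grad, y.grad)          # dz/dx = y + 1, dz/dy = x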

DiffSharp will be maintained as a basis library providing an AD infrastructure to .NET languages, independent of the application domain. In addition to setting up this infrastructure, we are interested in using generalized nested AD for implementing machine learning models and algorithms. For this purpose, we started developing the Hype library on top of DiffSharp. Hype is in the early stages of its development and is currently shared as a proof-of-concept for using generalized AD in machine learning and hyperparameter optimization (hence the name, "hype"). It showcases how the combination of nested AD and functional programming allows succinct implementations of optimization routines (e.g., stochastic gradient descent, AdaGrad, RMSProp), and feedforward and recurrent neural networks. Upcoming GPU and tensor support in DiffSharp is particularly relevant in this application domain, as these are essential to modern deep learning models.
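The claim that optimizers become succinct once gradients come from AD can be sketched as follows. This is a hedged Python illustration, not Hype's actual API: `sgd` is a hypothetical higher-order function, and an analytic gradient stands in for an AD-produced one:

```python
def sgd(grad, w, lr=0.1, steps=100):
    """Stochastic gradient descent as a higher-order function:
    `grad` maps a parameter list to its gradient list."""
    for _ in range(steps):
        w = [wi - lr * gi for wi, gi in zip(w, grad(w))]
    return w

# Minimize f(w) = (w0 - 3)^2 + (w1 + 1)^2; its gradient is
# (2(w0 - 3), 2(w1 + 1)). With AD, grad_f would be derived automatically.
grad_f = lambda w: [2 * (w[0] - 3), 2 * (w[1] + 1)]
w = sgd(grad_f, [0.0, 0.0])
print(w)   # converges toward [3, -1]
```

Because the optimizer only needs a gradient function, swapping in AdaGrad or RMSProp, or a different model, changes nothing about this interface; that separation is what functional-style AD buys.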

Learn about the Hype library here: https://github.com/hypelib/Hype

Learn about the DiffSharp library here: http://diffsharp.github.io/DiffSharp/

Please register with our hosts Skills Matter: https://skillsmatter.com/meetups/8121-hype-library


Sponsors

F# Software Foundation

We are an Affiliated User Group of fsharp.org.

.NET Foundation

The .NET Foundation supports our group with its Meetup subscription.
