Deep Learning for Julia


Details
We start 2019 with an exploration of deep learning in Julia. We will present two deep learning tools available in Julia -- TensorFlow.jl, a wrapper around Google's TensorFlow, and Flux.jl, an elegant pure-Julia deep learning library. We will see how they were built, how to use them, and discuss some recent advances in the area, including our papers at NeurIPS.
Schedule:
18:30 Pizza and Networking
19:00 Talks Start
21:00 Talks End
7 PM: Lyndon White – TensorFlow.jl and other tools for ML in Julia
Lyndon is a core contributor and maintainer of numerous Julia packages, including DataStructures.jl, TensorFlow.jl, and DataDeps.jl. He used Julia throughout his PhD on NLP at the University of Western Australia. His key research areas are natural language processing and computational linguistics, with a focus on capturing sentence meaning. Lyndon's overarching goal is to develop methods for better computational processing of the vast amounts of data that were prepared for manual (human) processing, e.g., books, blogs, and newspapers. He currently works as a Research Engineer at Invenia Labs in Cambridge, whose advisors include Doyne Farmer and Zoubin Ghahramani. Lyndon is a coauthor of the recent book "Neural Representations of Natural Language", published by Springer.
Abstract:
Julia is a fantastic language for machine learning; this talk will explain why and how. It will have something for everyone, from ML practitioners to package developers:
- How TensorFlow.jl was created and how it works
- How to use TensorFlow.jl within the Julia ecosystem to solve problems with ML (see the sketch after this abstract)
- Why Julia is a great language for implementing or wrapping a data science tool
- The language features, the ecosystem and the community.
This talk assumes little prior knowledge of TensorFlow; a passing familiarity with neural networks and with the Julia programming language will be helpful. Further information on TensorFlow.jl can be found in the recently published paper "TensorFlow.jl: An Idiomatic Julia Front End for TensorFlow" (JOSS, 1 Nov 2018, http://joss.theoj.org/papers/10.21105/joss.01002) and on GitHub: https://github.com/malmaud/TensorFlow.jl
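To give a flavour of the package ahead of the talk, here is a minimal sketch of building and running a small graph with TensorFlow.jl, adapted from the style of the package README; the exact exported names (Session, Graph, constant, placeholder, run) may differ slightly between package versions.

```julia
using TensorFlow

sess = Session(Graph())

x = constant(Float64[1.0, 2.0])   # a constant tensor baked into the graph
z = placeholder(Float64)          # a value supplied when the graph is run

w = exp(x + z)                    # ordinary Julia functions build graph ops

# Run the graph, feeding a value for the placeholder
res = run(sess, w, Dict(z => Float64[3.0, 4.0]))
println(res)                      # approximately [exp(4.0), exp(6.0)]
```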
8 PM: Mike Innes – Flux.jl
Mike Innes is a software engineer at Julia Computing, where he created and develops the Flux machine learning library, as well as several other Julia packages. Previously he worked at MIT, where he built the Juno IDE, now widely used by Julia programmers.
Abstract:
Deep learning is a rapidly evolving field, and models are increasingly complex. Recently, researchers have begun to explore "differentiable programming", a powerful way to combine neural networks with traditional programming. Differentiable programs may include control flow, functions, and data structures, and can even incorporate ray tracers, simulations, and scientific models, giving us unprecedented power to find subtle patterns in our data.
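As a small illustration of the idea (not code from the talk), an ordinary Julia function containing a loop can be differentiated directly; recent Flux versions export a gradient function for this, though the exact entry point has varied between releases (earlier via Tracker, later via Zygote).

```julia
using Flux   # recent Flux versions export `gradient` (backed by Zygote)

# An ordinary Julia function with control flow...
function pow(x, n)
    r = one(x)
    for _ in 1:n
        r *= x
    end
    return r
end

# ...can be differentiated like any other program:
dpow = gradient(x -> pow(x, 3), 2.0)[1]
println(dpow)   # d/dx x^3 at x = 2 is 3 * 2^2 = 12.0
```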
This talk will explore how this technique, and particularly Flux – a state-of-the-art deep learning library – is impacting the machine learning world, including discussion of our recent papers from NeurIPS 2018. We'll outline important recent work and show how Flux lets us easily combine neural networks with tools like differential equation solvers.
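For context, here is a minimal sketch of defining and training a small Flux model; the layer sizes and data are made up for illustration, and the API shown matches Flux releases from around the time of the talk (later versions renamed some of these, e.g. ADAM to Adam).

```julia
using Flux

# A small two-layer classifier (sizes are illustrative)
model = Chain(Dense(10, 32, relu), Dense(32, 2), softmax)

loss(x, y) = Flux.crossentropy(model(x), y)

# Toy data: 100 random 10-dimensional examples with random labels
xs = rand(Float32, 10, 100)
ys = Flux.onehotbatch(rand(1:2, 100), 1:2)

# One pass of gradient descent over the (single-batch) dataset
opt = ADAM()
Flux.train!(loss, Flux.params(model), [(xs, ys)], opt)
```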
This event is co-hosted with the London Deep Learning Lab (https://www.meetup.com/Deep-Learning-Lab/events/257404837/)
Image credit: Packed butterflies by @cormullion : https://flic.kr/p/2dTeeqM