Deep Learning for Julia

Hosted by Peter M.

Details

This month's meetup is a collaboration with the London Julia User Group, taking place at the Microsoft Reactor. We will cover two of Julia's deep learning libraries, Flux.jl and TensorFlow.jl, presented by their creators and core developers. We will see how these frameworks were built and how to use them, and discuss some of the recent advances in the area, including papers presented at NeurIPS.

6:30pm: Pizza

7pm: Lyndon White
Lyndon is a core contributor to and maintainer of numerous Julia packages, including DataStructures.jl, TensorFlow.jl, and DataDeps.jl. He used Julia throughout his PhD on NLP at the University of Western Australia. His key research areas are natural language processing and computational linguistics, with a focus on capturing sentence meaning. Lyndon's overarching goal is to develop methods for better computational processing of the vast amounts of data that have been prepared for manual (human) processing, e.g., books, blogs, and newspapers. He currently works as a Research Engineer at Invenia Labs in Cambridge, whose advisors include Doyne Farmer and Zoubin Ghahramani. Lyndon is a coauthor of the recent book "Neural Representations of Natural Language", published by Springer.

Abstract – TensorFlow.jl and other tools for ML in Julia
Julia is a fantastic language for machine learning; this talk will explain why and how. It will have something for everyone, from ML practitioners to package developers:

  • How TensorFlow.jl was created and how it works
  • How to use TensorFlow.jl within the Julia ecosystem to solve problems with ML (see the sketch after this list)
  • Why Julia is a great language in which to implement or wrap a data science tool
  • The language features, the ecosystem, and the community
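As a taste of the second point above, here is a minimal sketch of what TensorFlow.jl code looks like. It is illustrative only, not taken from the talk; the toy linear-fit problem and the names xs and ys are our own assumptions, following the style of the examples in the JOSS paper and README linked below.

```julia
# A minimal, illustrative sketch (not from the talk) of TensorFlow.jl's
# graph-based API: fitting y = 3x + 1 by gradient descent.
using TensorFlow

sess = Session(Graph())

x = placeholder(Float64)   # input data
y = placeholder(Float64)   # observed targets
W = Variable(0.0)          # slope, to be learned
b = Variable(0.0)          # intercept, to be learned

err  = W .* x .+ b .- y
loss = reduce_mean(err .* err)   # mean squared error

train_step = train.minimize(train.GradientDescentOptimizer(0.1), loss)
run(sess, global_variables_initializer())

xs = rand(100)           # toy data
ys = 3 .* xs .+ 1
for _ in 1:500
    run(sess, train_step, Dict(x => xs, y => ys))
end

@show run(sess, [W, b])   # should approach [3.0, 1.0]
```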

This talk assumes little prior knowledge of TensorFlow itself. A passing familiarity with neural networks and with the Julia programming language will be helpful. Further information on TensorFlow.jl can be found in the recently published paper, "TensorFlow.jl: An Idiomatic Julia Front End for TensorFlow", JOSS, 1 Nov 2018, http://joss.theoj.org/papers/10.21105/joss.01002
and also on GitHub: https://github.com/malmaud/TensorFlow.jl .

8pm: Mike Innes
Mike Innes is a software developer at Julia Computing, where he created and develops the Flux machine learning library, as well as several other Julia packages. He previously worked at MIT, where he created the Juno IDE (http://junolab.org), which is widely used by Julia programmers.

Abstract – Flux.jl
Deep learning is a rapidly evolving field, and models are increasingly complex. Recently, researchers have begun to explore "differentiable programming", a powerful way to combine neural networks with traditional programming. Differentiable programs may include control flow, functions, and data structures, and can even incorporate ray tracers, simulations, and scientific models, giving us unprecedented power to find subtle patterns in our data. This talk will explore how this technique, and in particular Flux, a state-of-the-art deep learning library, is impacting the machine learning world, including discussion of our recent papers from NeurIPS 2018. We'll outline important recent work and show how Flux allows us to easily combine neural networks with tools like differential equation solvers.
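To give a flavour of what differentiable programming means in practice, here is a minimal, illustrative sketch (not from the talk): Flux takes gradients of ordinary Julia code, loops and branches included, with the same machinery it uses for neural-network layers. The function f and the toy model below are our own assumptions.

```julia
# Illustrative sketch (not from the talk): Flux differentiates plain
# Julia code, control flow included.
using Flux

# An ordinary Julia function with a loop and a branch.
function f(x)
    total = zero(x)
    for i in 1:3
        total += x > 0 ? x^i : -x
    end
    return total
end

# Gradient of a plain function: for x > 0, f(x) = x + x^2 + x^3,
# so f'(2) = 1 + 2*2 + 3*2^2 = 17.
@show Flux.gradient(f, 2.0)

# The same machinery handles models built from layers.
model = Chain(Dense(10, 5, relu), Dense(5, 1))
grads = Flux.gradient(m -> sum(m(rand(Float32, 10))), model)
```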

Further information on Flux.jl can be found in the paper "Flux: Elegant machine learning with Julia", JOSS, 3 May 2018, http://joss.theoj.org/papers/10.21105/joss.00602
and also on the website: https://fluxml.ai.

AI Performance Engineering Meetup (UK)