Deep Learning With Sets & Minimal Newton Solvers For Deep Learning


Details
Please note that photo ID will be required. Attendees should ensure their Meetup profile name includes their full name to guarantee entry.
Agenda:
- 18:30: Doors open, pizza, beer, networking
- 19:00: First talk
- 19:45: Break & networking
- 20:00: Second talk
- 20:45: Close
Sponsors
Man AHL: At Man AHL, we mix machine learning, computer science and engineering with terabytes of data to invest billions of dollars every day.
Evolution AI: Machines that Read - get answers from your text data.
- Title: Deep Learning With Sets (Yan Zhang - University of Southampton)
Abstract: Sets – unordered collections of things – are useful for describing various kinds of data such as point clouds, the set of objects in an image, the set of nodes in a graph, and the set of people who have read this sentence. But how can we build deep neural networks to work with these sets while properly taking their unordered nature into account? In this talk, we cover the fundamentals and some recent advances on how to do this, either with the set as input (set-to-vector) or set as output (vector-to-set). The resulting neural networks can be used for tasks such as graph classification (set of nodes as input) and object detection (set of objects as output).
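The key idea behind set-to-vector networks, as in the DeepSets line of work, is to encode each element independently and then pool with an order-independent operation such as a sum. A minimal sketch (weights and sizes here are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights for a tiny set-to-vector network: encode each element
# independently, then pool with a permutation-invariant sum.
W_enc = rng.normal(size=(3, 8))   # per-element encoder (3-d points -> 8-d features)
W_out = rng.normal(size=(8, 4))   # decoder applied after pooling

def set_to_vector(points):
    """Map an unordered set of 3-d points to a 4-d vector."""
    h = np.maximum(points @ W_enc, 0.0)  # elementwise encoder with ReLU
    pooled = h.sum(axis=0)               # sum pooling: invariant to element order
    return pooled @ W_out

points = rng.normal(size=(5, 3))       # a "set" of five 3-d points
shuffled = points[rng.permutation(5)]  # the same set, presented in a different order

# Shuffling the elements leaves the output unchanged.
assert np.allclose(set_to_vector(points), set_to_vector(shuffled))
```

Because the sum ignores element order, the network treats the input as a true set; the vector-to-set direction (e.g. for object detection) is harder and is part of what the talk covers.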
Bio: Yan Zhang is a PhD student at the University of Southampton who just submitted his thesis titled "Learning to Represent and Predict Sets with Deep Neural Networks". He also contributes to Leela Chess Zero, the main open-source effort to improve on AlphaZero in chess.
- Title: Small steps and giant leaps: Minimal Newton solvers for deep learning (Dr. Joao Henriques - University of Oxford)
Abstract: Despite many exciting advances, modern deep learning is still powered by modest first-order optimisers. The unique combination of large-scale and non-convex problems conspires to render ineffective many decades of research into more sophisticated optimisers, such as Newton methods. In this talk, I will provide an overview of these problems, and how to overcome them to create a minimal, very practical second-order optimiser for large-scale deep learning, inspired by Newton methods. Unlike first-order methods, it is possible to find the optimal hyper-parameters by closed-form solutions, so in practice there is no need to try different learning rates.
I will showcase results of the resulting method, called CurveBall, on several settings. This includes large-scale ImageNet training, ResNet and VGG-f networks, as well as the analysis of small, interpretable problems. The generality of the optimiser is also demonstrated on large sets of random architectures.
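The core of a CurveBall-like iteration is cheap to sketch: maintain a search direction that is updated with curvature information via a Hessian-vector product, then step the weights. The following is a rough illustration on a toy quadratic, not the paper's full algorithm (which also derives the hyper-parameters rho and beta in closed form each step and adds damping; here they are fixed by hand):

```python
import numpy as np

# Toy convex quadratic f(w) = 0.5 w^T A w - b^T w, so grad(w) = A w - b and
# the Hessian-vector product H z is simply A z (no full Hessian is formed).
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])

def grad(w):
    return A @ w - b

def hvp(z):
    return A @ z  # in deep learning this would be one extra backward pass

# CurveBall-style updates: the direction z accumulates curvature-corrected
# gradient information, then the weights take a step along z.
w = np.zeros(2)
z = np.zeros(2)
rho, beta = 0.9, 0.2  # fixed here for simplicity
for _ in range(100):
    z = rho * z - beta * (hvp(z) + grad(w))
    w = w + z

w_star = np.linalg.solve(A, b)  # exact minimiser, for comparison
```

On this problem the iterate converges to the exact minimiser; the point of the method is that each step costs only one gradient and one Hessian-vector product, which keeps it practical at deep-learning scale.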
Bio: Dr. Joao Henriques is a Research Fellow of the Royal Academy of Engineering, working at the Visual Geometry Group (VGG) at the University of Oxford. He created the KCF and SiameseFC visual object trackers, which are widely deployed in robotics and consumer hardware, and twice won the highly competitive VOT Challenge. His research focuses on optimisation for deep learning, robotics, and meta-learning.
