Details

Virtual London Machine Learning Meetup - 06.08.20 @ 18:30

We would like to invite you to our next Virtual Machine Learning Meetup. We are taking the opportunity to change the format slightly and devote more time to Q&A. Please read the papers below and help us create a vibrant discussion.

The discussion will be facilitated by Brandon Amos, a research scientist at Facebook AI Research who studies foundational topics in machine learning, optimization, control, and reinforcement learning. He holds a Ph.D. from Carnegie Mellon University with a thesis on "Differentiable Optimization-based Modeling for Machine Learning".

Agenda:

  • 18:25: Virtual doors open
  • 18:30: Talk
  • 19:00: Q&A session
  • 19:35: Close

Sponsors
Evolution AI: Machines that Read - get answers from your text data.

Man AHL: At Man AHL, we mix machine learning, computer science and engineering with terabytes of data to invest billions of dollars every day.

Abstract: Continual lifelong learning requires an agent or model to learn many sequentially ordered tasks, building on previous knowledge without catastrophically forgetting it. Much work has gone towards preventing the default tendency of machine learning models to catastrophically forget, yet virtually all such work involves manually designed solutions to the problem. We instead advocate meta-learning a solution to catastrophic forgetting, allowing AI to learn to continually learn. Inspired by neuromodulatory processes in the brain, we propose A Neuromodulated Meta-Learning Algorithm (ANML). It differentiates through a sequential learning process to meta-learn an activation-gating function that enables context-dependent selective activation within a deep neural network. Specifically, a neuromodulatory (NM) neural network gates the forward pass of another (otherwise normal) neural network called the prediction learning network (PLN). The NM network thus also indirectly controls selective plasticity (i.e., the backward pass) of the PLN. ANML enables continual learning without catastrophic forgetting at scale: sequentially learning as many as 600 classes (over 9,000 SGD updates).
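The gating mechanism described in the abstract can be sketched in a few lines. In this toy sketch (all weights and dimensions are hypothetical, chosen only for illustration), the NM network emits a per-unit gate in (0, 1) that multiplies the PLN's activations; in the real ANML, the NM weights are meta-learned by differentiating through the sequential learning process, which is omitted here.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def matvec(W, x):
    # Plain matrix-vector product for a 2x2 toy layer.
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

# Hypothetical toy weights, for illustration only. In ANML the NM
# weights are meta-learned; here they are fixed constants.
W_nm  = [[2.0, -1.0], [-3.0, 0.5]]   # neuromodulatory (NM) network
W_pln = [[1.0,  0.2], [0.3, -1.0]]   # prediction learning network (PLN)

def anml_forward(x):
    # NM network emits a per-unit gate in (0, 1) for the current input,
    # giving context-dependent selective activation.
    gate = [sigmoid(z) for z in matvec(W_nm, x)]
    # PLN computes its usual activations...
    act = matvec(W_pln, x)
    # ...which the gate modulates multiplicatively. Because the gate
    # also scales the gradient flowing back into each PLN unit, a
    # near-zero gate effectively freezes that unit's weights for this
    # input: selective plasticity via the same gating signal.
    return [g * a for g, a in zip(gate, act)]

out = anml_forward([1.0, -1.0])
```

With this input, the second NM unit produces a gate near zero (sigmoid of -3.5), so the second PLN activation is almost entirely suppressed while the first passes through nearly unchanged.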

Bio: Nick Cheney is an Assistant Professor of Computer Science at the University of Vermont, where he directs the UVM Neurobotics Lab, and is a core member of Vermont's Center in Complex Systems and Data Science. Prior to Vermont, Nick received a PhD in Computational Biology from Cornell, co-advised by Hod Lipson and Steve Strogatz, then spent a year at the University of Wyoming with Jeff Clune. He has also served as a visiting researcher at the Santa Fe Institute, NASA Ames, and Columbia University. Nick's research interests span many subfields of machine learning, but he is particularly fascinated by the interactions of coupled and multiscale optimization processes.
