What we're about

Deep learning is a rapidly growing field with dozens (editor: hah) of new publications each week on arXiv. This group is a time set aside to go over interesting research from the previous week. We'll pick one or a few papers to read and discuss.

Upcoming events (1)

A Simple Baseline for Bayesian Uncertainty in Deep Learning

This session we will discuss "A Simple Baseline for Bayesian Uncertainty in Deep Learning". Everyone should take time to read the paper in detail several days in advance of the meetup and, to the greatest extent possible, read the key references from the paper.

Main paper: https://arxiv.org/abs/1902.02476

Abstract: We propose SWA-Gaussian (SWAG), a simple, scalable, and general-purpose approach for uncertainty representation and calibration in deep learning. Stochastic Weight Averaging (SWA), which computes the first moment of stochastic gradient descent (SGD) iterates with a modified learning rate schedule, has recently been shown to improve generalization in deep learning. With SWAG, we fit a Gaussian using the SWA solution as the first moment and a low-rank plus diagonal covariance also derived from the SGD iterates, forming an approximate posterior distribution over neural network weights; we then sample from this Gaussian distribution to perform Bayesian model averaging. We empirically find that SWAG approximates the shape of the true posterior, in accordance with results describing the stationary distribution of SGD iterates. Moreover, we demonstrate that SWAG performs well on a wide variety of computer vision tasks, including out-of-sample detection, calibration, and transfer learning, in comparison to many popular alternatives including MC dropout, KFAC Laplace, and temperature scaling.
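To make the discussion concrete, here is a minimal NumPy sketch of the idea described in the abstract: collect running first and second moments of the SGD iterates (plus a small set of deviation vectors for the low-rank part), then sample weights from the resulting Gaussian. This is an illustrative sketch, not the authors' released implementation; the class name, the flat weight-vector representation, and the `max_rank` default are assumptions for the example.

```python
import numpy as np

class SWAG:
    """Sketch of SWAG moment collection and posterior sampling.

    Assumes the network's weights are handled as one flat vector;
    names and defaults are illustrative, not taken from the paper's code.
    """

    def __init__(self, dim, max_rank=20):
        self.n = 0                     # number of collected SGD iterates
        self.mean = np.zeros(dim)      # running first moment (the SWA solution)
        self.sq_mean = np.zeros(dim)   # running second moment
        self.deviations = []           # columns of the low-rank deviation matrix D
        self.max_rank = max_rank

    def collect(self, weights):
        """Update running moments with one SGD iterate (e.g. once per epoch)."""
        self.n += 1
        self.mean += (weights - self.mean) / self.n
        self.sq_mean += (weights ** 2 - self.sq_mean) / self.n
        self.deviations.append(weights - self.mean)
        if len(self.deviations) > self.max_rank:
            self.deviations.pop(0)     # keep only the most recent deviations

    def sample(self, rng):
        """Draw one weight vector from the SWAG Gaussian approximate posterior."""
        var = np.clip(self.sq_mean - self.mean ** 2, 1e-30, None)  # diagonal covariance
        z1 = rng.standard_normal(self.mean.shape)
        sample = self.mean + np.sqrt(var / 2.0) * z1
        K = len(self.deviations)
        if K > 1:
            D = np.stack(self.deviations, axis=1)                  # dim x K deviation matrix
            z2 = rng.standard_normal(K)
            sample += D @ z2 / np.sqrt(2.0 * (K - 1))              # low-rank contribution
        return sample
```

In use, `collect` would be called on the flattened weights at the end of each epoch under a modified (e.g. constant or cyclical) learning rate, and Bayesian model averaging would average predictions over several `sample` draws.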
