
Details

We are happy to welcome Dmitry Mittov (Revolut) for a presentation of the paper "Variational Dropout Sparsifies Deep Neural Networks" (Molchanov et al., 2017).

Link to the paper: https://proceedings.mlr.press/v70/molchanov17a.html

Special thanks to HelloFresh for hosting this event!

Speakers:

Dmitry Mittov, Data Scientist @ Revolut

Timetable:

18:45 – doors open / socializing
19:00 – welcome
19:15 – talk
20:15 – Q&A
20:30 – socializing
21:15 – end

Title:
Paper presentation: Variational Dropout Sparsifies Deep Neural Networks (Molchanov et al., 2017)

Abstract (from the paper):
We explore a recently proposed Variational Dropout technique that provided an elegant Bayesian interpretation to Gaussian Dropout. We extend Variational Dropout to the case when dropout rates are unbounded, propose a way to reduce the variance of the gradient estimator and report first experimental results with individual dropout rates per weight. Interestingly, it leads to extremely sparse solutions both in fully-connected and convolutional layers. This effect is similar to automatic relevance determination effect in empirical Bayes but has a number of advantages. We reduce the number of parameters up to 280 times on LeNet architectures and up to 68 times on VGG-like networks with a negligible decrease of accuracy.
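As a rough illustration of the sparsification idea described above (a sketch, not the authors' code): the method learns a dropout rate alpha per weight, approximates the negative KL term in the variational objective with a sigmoid-polynomial fit, and prunes weights whose log alpha exceeds a threshold. The constants and the threshold of 3 below are taken from my reading of Molchanov et al. (2017) and should be checked against the paper.

```python
import math

# Fitted constants for the -KL approximation (assumption: as reported
# in Molchanov et al., 2017; verify against the paper before reuse).
K1, K2, K3 = 0.63576, 1.87320, 1.48695

def neg_kl_approx(log_alpha: float) -> float:
    """Approximate -KL(q(w) || p(w)) for a single weight, up to a constant.

    log_alpha = log(sigma^2 / theta^2) is the learned per-weight
    log dropout rate. Larger log_alpha means noisier, less relevant weight.
    """
    sigmoid = 1.0 / (1.0 + math.exp(-(K2 + K3 * log_alpha)))
    # -0.5 * log(1 + alpha^{-1}) written stably via log1p
    return K1 * sigmoid - 0.5 * math.log1p(math.exp(-log_alpha)) - K1

def keep_weight(log_alpha: float, threshold: float = 3.0) -> bool:
    """Pruning rule: drop a weight once its dropout rate grows large."""
    return log_alpha < threshold
```

For example, a weight with `log_alpha = 0.0` (alpha = 1, i.e. binary dropout rate 0.5) is kept, while one with `log_alpha = 5.0` is pruned; sparsity emerges because the KL term rewards pushing log alpha up wherever a weight is not needed.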

Bio:
Dmitry studied mathematics in Russia and worked for a few years as a software and data engineer. He then decided to return to his roots and transitioned into a data science role. He has lived in Berlin for the last five years. He focuses on predicting customer behaviour and has extensive expertise in learning on sparse data. He currently works at Revolut: https://www.revolut.com/

Machine Learning
Data Science
Applied Statistics
Programming Languages
Statistical Computing
