Bayesian Machine and Deep Learning (part I)

What's this Session About?

Bayesian ML is now widely established as one of the most important foundations for machine and deep learning.

Topics that bridge the gap between Bayesian Machine Learning and Deep Learning will be discussed in some detail.

Agenda (Tentative):

18:00–18:30: Gathering (no food)
18:30–21:30: Talks

Talk 1: Shlomo Kashani (Chief Data Scientist)

"Introduction to Bayesian Machine Learning"

The session is an introduction to Bayesian ML topics such as prior and posterior distributions, likelihood, Bayesian Logistic Regression, Bayesian Bandits, and the Beta-Binomial model. (A commonly used paradigm in numerous ML models is to assume that an observed set of binary response variables is generated from a Bernoulli distribution with probabilities varying according to a Beta distribution.)
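
To make that parenthetical concrete, here is a minimal sketch of the conjugate Beta-Bernoulli update in Python; the counts are made up for illustration and are not from the session:

# Beta(a, b) prior + k successes in n Bernoulli trials
# yields a Beta(a + k, b + n - k) posterior, by conjugacy.
from scipy import stats

a, b = 1.0, 1.0   # uniform Beta prior over the success probability
n, k = 20, 14     # illustrative data: 14 successes in 20 binary responses

posterior = stats.beta(a + k, b + (n - k))
print(posterior.mean())          # posterior mean of the success probability
print(posterior.interval(0.95))  # central 95% credible interval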

The focus, however, will be on how and why to use Bayesian tools for tasks such as classification. All examples will be illustrated with Python code (PyMC3, Edward, and emcee, probabilistic programming frameworks written in Python) and made available on github.com.
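
As a taste of the classification material, here is a minimal sketch of Bayesian logistic regression in PyMC3 on synthetic data; this is an illustrative sketch only, not the session's notebook (the actual code lives in the GitHub repository linked below):

import numpy as np
import pymc3 as pm

rng = np.random.RandomState(0)
X = rng.randn(200, 2)            # synthetic features
true_w = np.array([1.5, -2.0])   # weights used to simulate labels
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X.dot(true_w))))

with pm.Model():
    w = pm.Normal("w", mu=0.0, sd=1.0, shape=2)  # Gaussian priors on the weights
    p = pm.math.sigmoid(pm.math.dot(X, w))       # class-1 probabilities
    pm.Bernoulli("obs", p=p, observed=y)         # Bernoulli likelihood
    trace = pm.sample(1000, tune=1000)

print(trace["w"].mean(axis=0))  # posterior mean weights, close to true_w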

Talk 2: Tal Rozen (Data Scientist, Saips)

"SSD: Single Shot MultiBox Detector"

SSD is the first deep-network-based object detector that does not resample pixels or features for bounding box hypotheses, yet is as accurate as approaches that do. It is a single-shot detector for multiple categories that is faster than the state-of-the-art single-shot detector (YOLO) and as accurate as Faster R-CNN. It predicts category scores and box offsets for a fixed set of default bounding boxes using small convolutional filters applied to feature maps.

The paper was published on October 28th, 2016, and I've found it has high potential to replace existing architectures, delivering faster performance without losing accuracy.
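
To make the "small convolutional filters applied to feature maps" idea concrete, below is a minimal, hypothetical sketch of an SSD-style prediction head. It uses PyTorch purely for illustration (an assumption on my part; this is not the paper's or the talk's reference implementation):

import torch
import torch.nn as nn

class SSDHead(nn.Module):
    """For one feature map: predict class scores and box offsets
    for a fixed set of default boxes at every spatial location."""
    def __init__(self, in_channels, num_defaults, num_classes):
        super().__init__()
        # Small 3x3 convolutional filters, one output channel per
        # (default box, class) and per (default box, offset) pair.
        self.cls = nn.Conv2d(in_channels, num_defaults * num_classes,
                             kernel_size=3, padding=1)
        self.loc = nn.Conv2d(in_channels, num_defaults * 4,
                             kernel_size=3, padding=1)

    def forward(self, feature_map):
        scores = self.cls(feature_map)   # (N, defaults * classes, H, W)
        offsets = self.loc(feature_map)  # (N, defaults * 4, H, W)
        return scores, offsets

# Example: a 38x38 feature map with 512 channels, 4 default boxes per
# cell, and 21 classes (PASCAL VOC's 20 classes + background).
head = SSDHead(512, num_defaults=4, num_classes=21)
scores, offsets = head(torch.randn(1, 512, 38, 38))
print(scores.shape, offsets.shape)  # (1, 84, 38, 38) and (1, 16, 38, 38)

In the full detector, heads like this are attached to several feature maps of different resolutions, which is what lets SSD detect objects at multiple scales in a single forward pass.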

Talk 3: Natan Katz (Machine Learning Architect, NICE)

"Variational Bayes"

Bios:

Shlomo holds a bachelor's degree in Engineering and an M.Sc. in Digital Signal Processing (QMUL, London). By day, Shlomo works as a Chief Data Scientist. By night, he composes music, codes in Python, and writes chapters for his forthcoming “Handbook of Data Science Interview Questions”.

Tal

Natan

Important Points:

1. There is no registration fee.

GitHub code:

https://github.com/QuantScientist/deep-ml-meetups

Background Reading:

Computational Methods in Bayesian Analysis (https://plot.ly/ipython-notebooks/computational-bayesian-analysis/). Chris Fonnesbeck.

Think Bayes (http://greenteapress.com/thinkbayes/). Version 1.0.6. Allen Downey.

Computational Statistics II (https://www.youtube.com/watch?v=heFaYLKVZY4) (code (https://github.com/fonnesbeck/scipy2015_tutorial)). Chris Fonnesbeck. SciPy 2015.

Bayesian Statistical Analysis (https://github.com/fonnesbeck/scipy2014_tutorial). Chris Fonnesbeck. SciPy 2014.

Probabilistic Programming and Bayesian Methods for Hackers (https://github.com/CamDavidsonPilon/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers). Cam Davidson-Pilon.

Frequentism and Bayesianism V: Model Selection (https://jakevdp.github.io/blog/2015/08/07/frequentism-and-bayesianism-5-model-selection/). Jake Vanderplas.

Understanding Bayes: A Look at the Likelihood (http://alexanderetz.com/2015/04/15/understanding-bayes-a-look-at-the-likelihood/). Alex Etz.

High-Level Explanation of Variational Inference (https://www.cs.jhu.edu/%7Ejason/tutorials/variational.html). Jason Eisner.

Bayesian Deep Learning (http://twiecki.github.io/blog/2016/06/01/bayesian-deep-learning/). Thomas Wiecki.

A Tutorial on Variational Bayesian Inference (http://www.orchid.ac.uk/eprints/40/1/fox_vbtut.pdf). Charles Fox, Stephen Roberts.

Variational Inference (https://www.cs.princeton.edu/courses/archive/fall11/cos597C/lectures/variational-inference-i.pdf). David M. Blei.

Probabilistic Programming Data Science with PyMC3 (https://youtu.be/LlzVlqVzeD8). Thomas Wiecki.

emcee (http://dan.iel.fm/emcee/current/)

PyMC3 (https://github.com/pymc-devs/pymc3)

Edward (http://edwardlib.org/)

Bayes Blocks (http://research.ics.aalto.fi/bayes/software/) is a C++/Python implementation of the variational building block framework. The framework allows easy learning of a wide variety of models using variational Bayesian learning. It is available as free software under the GNU General Public License.

VIBES (http://vibes.sourceforge.net/) allows variational inference to be performed automatically on a Bayesian network. It is implemented in Java and released under a revised BSD license.

PyMC (https://github.com/pymc-devs/pymc) provides MCMC methods in Python. It is released under the Academic Free License.

Stan (http://mc-stan.org/) provides inference using MCMC, with interfaces for R and Python. It is released under the New BSD License.

Variational Bayesian Inference (http://people.inf.ethz.ch/bkay/talks/Brodersen_2013_03_22.pdf).

Variational Inference in Machine Learning Tutorial (http://shakirm.com/papers/VITutorial.pdf). Shakir Mohamed.