January 7, 2013 · 7:00 PM
I have had quite a few people ask me about deep belief networks. These are basically stacked neural networks whose layers are first pre-trained one at a time with unsupervised learning, followed by supervised training to fine-tune the whole model. Rather than starting from a random initialization of the network's weights, it goes through a process to find a "reasonable starting point" for the weights, then trains them as it normally would until they converge. This was the big breakthrough out of Hinton's lab in 2006.
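To make the two-phase schedule concrete, here's a minimal toy sketch in plain Python. The `Layer` class and its methods are stand-ins I made up for illustration (a real DBN layer is an RBM trained with contrastive divergence); the point is just the shape of the loop: pre-train each layer on the representation from the layers below, then use those weights to initialize the supervised fine-tuning pass.

```python
import random

random.seed(0)

class Layer:
    """Toy stand-in for one RBM layer of a DBN (illustrative only)."""

    def __init__(self, n_in, n_out):
        # small random weights, like a "random initialization"
        self.w = [[random.uniform(-0.1, 0.1) for _ in range(n_in)]
                  for _ in range(n_out)]

    def transform(self, x):
        # forward pass: simple linear map (real DBN units are sigmoidal)
        return [sum(wi * xi for wi, xi in zip(row, x)) for row in self.w]

    def pretrain(self, data):
        # stand-in for unsupervised RBM training; note that only the
        # inputs and this layer's own weights are involved -- no labels
        pass

def pretrain_stack(layers, data):
    # greedy layer-wise pre-training: each layer trains on the
    # representation produced by the already-trained layers below it
    rep = data
    for layer in layers:
        layer.pretrain(rep)
        rep = [layer.transform(x) for x in rep]
    return rep

layers = [Layer(4, 3), Layer(3, 2)]
data = [[1.0, 0.0, 1.0, 0.0], [0.0, 1.0, 0.0, 1.0]]
top_rep = pretrain_stack(layers, data)
# after pre-training, these weights initialize an ordinary feed-forward
# net, which is then fine-tuned with labeled data (supervised phase)
print(len(top_rep), len(top_rep[0]))
```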
I will show you how to use Python and the Theano library to build one of these. I will steal most of my material from http://www.deeplearning.net/tutorial/DBN.html with one nice twist: I will show you how to use your own data rather than just the MNIST set described in the online tutorial. So you will be able to take this code home and apply it to your own datasets for fun and profit.
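As a preview of the "your own data" twist: the tutorial's `load_data()` hands back a list of three `(features, labels)` pairs, one each for training, validation, and testing. Here's a stdlib-only sketch of building that same structure from arbitrary data; the dataset below is made up for illustration, and in the real code you would additionally wrap each split with `theano.shared()` so it can live on the GPU.

```python
import random

random.seed(42)

# hypothetical stand-in for your own dataset:
# 100 samples, 8 features each, binary labels
features = [[random.random() for _ in range(8)] for _ in range(100)]
labels = [random.randint(0, 1) for _ in range(100)]

def make_splits(x, y, train_frac=0.6, valid_frac=0.2):
    """Shuffle and split into the tutorial's train/valid/test layout."""
    idx = list(range(len(x)))
    random.shuffle(idx)  # so splits aren't ordered by class
    x = [x[i] for i in idx]
    y = [y[i] for i in idx]
    n_train = int(len(x) * train_frac)
    n_valid = int(len(x) * valid_frac)
    train = (x[:n_train], y[:n_train])
    valid = (x[n_train:n_train + n_valid], y[n_train:n_train + n_valid])
    test = (x[n_train + n_valid:], y[n_train + n_valid:])
    # the DBN tutorial consumes exactly this three-pair structure
    return [train, valid, test]

datasets = make_splits(features, labels)
print([len(split[0]) for split in datasets])
```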
Because the networks take an ungodly amount of time to train unless they're running on a powerful GPU (and even then it takes too long), I will be using models that I've trained already for demonstration purposes. The code you write in the workshop will be very slow unless you take it home and run it on a GPU. Theano, the Python library we will use, does not require a GPU to be present, so you'll be OK on your laptops, at least for testing.
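For when you do get home to a GPU box: Theano chooses its device from the `THEANO_FLAGS` environment variable, so the same script runs on CPU or GPU without code changes. A typical invocation looks like the following (the script name is a placeholder):

```shell
# run on the GPU with 32-bit floats (Theano's GPU path wants float32);
# drop the flags entirely to fall back to the CPU
THEANO_FLAGS=device=gpu,floatX=float32 python your_dbn_script.py
```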
But anyway, if you're curious about getting started in deep learning, you should come to this. Bring a laptop with Linux. Desktops won't fit in the meeting space...
Oh, and group news: I got a new laptop-to-TV hookup, so I won't have to force everyone to watch a PowerPoint on my laptop screen while I loom behind them... for anyone that was at the last meetup, you know what I'm talking about... same room, so it might be cramped... free beer though, so it evens out, haha