Intro to Gaussian Processes

Bath Machine Learning Meetup

The Guild Coworking Hub

High Street, Bath · BA1 5AW

How to find us

The furthest entrance from the abbey, opposite TK Maxx



Maybe you have heard the phrase "Gaussian process" before. Maybe you're looking to brush up on old knowledge. Or maybe, like me, you remember hearing the name "Gauss" once, but that's more or less the extent of your current understanding.

We are very pleased to welcome Jordan Taylor, PhD student in Statistical Applied Mathematics at the University of Bath, for an evening dedicated to the fascinating topic of Gaussian processes.


Recent state-of-the-art machine learning is heavily dominated by neural networks. In Bayesian terms, these networks can be tuned by first setting a prior over the function space (through the number of layers and the choice of non-linear activation functions) and then maximising the posterior probability. But how do we quantify the uncertainty of the model's predictions?

The Gaussian process framework yields a full predictive distribution, assuming only that the underlying function is sufficiently smooth. R. M. Neal's "Bayesian Learning for Neural Networks" showed that a single-hidden-layer neural network with infinitely many hidden units converges to a Gaussian process.
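To make the idea of a predictive distribution concrete, here is a minimal sketch of Gaussian process regression with a squared-exponential (RBF) kernel in NumPy. The function names, kernel parameters, and toy data are illustrative assumptions, not material from the talk:

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    """Squared-exponential kernel: encodes the smoothness assumption."""
    sq_dists = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / length_scale**2)

def gp_predict(X_train, y_train, X_test, noise=1e-6):
    """Posterior predictive mean and variance of a zero-mean GP."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    # Solve rather than invert for numerical stability
    alpha = np.linalg.solve(K, y_train)
    mean = K_s.T @ alpha
    v = np.linalg.solve(K, K_s)
    cov = K_ss - K_s.T @ v
    return mean, np.diag(cov)

# Toy data: noise-free samples of sin(x)
X_train = np.array([0.0, 1.0, 2.0, 3.0])
y_train = np.sin(X_train)
X_test = np.array([1.5])
mean, var = gp_predict(X_train, y_train, X_test)
```

Unlike a point estimate from a standard regression, `gp_predict` returns both a mean and a variance at each test point, so the variance quantifies the model's uncertainty away from the training data.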

This talk will provide an introduction to Gaussian processes, building on the standard linear regression framework.