Deep Gaussian Processes
To remind ourselves that deep learning is not synonymous with neural networks, we have a talk on Deep Gaussian Processes from the University of Sheffield.
To guarantee a place ensure you register with Skills Matter (https://skillsmatter.com/meetups/6429-deep-gaussian-processes) as well.
See you there!
--Dirk & Ali
Deep Gaussian processes (Andreas Damianou)
This talk will discuss a newly introduced family of Bayesian approaches that aim to combine the structural advantages of deep models with the expressive power of Gaussian processes. The resulting deep belief networks are constructed from continuous variables connected by nonparametric mappings; consequently, the methodology used for training and inference deviates from traditional deep learning paradigms.
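To give a flavour of the construction, here is a minimal sketch (my own illustration, not code from the talk) of sampling from a two-layer deep GP prior: one GP maps the inputs to a hidden representation, and a second GP takes that hidden representation as its input, so the overall function is a composition of GP-distributed mappings.

```python
import numpy as np

def rbf_kernel(x, lengthscale=1.0, variance=1.0, jitter=1e-6):
    """Squared-exponential kernel matrix for 1-D inputs x (shape (n,)).

    A small jitter is added to the diagonal for numerical stability.
    """
    sq_dists = (x[:, None] - x[None, :]) ** 2
    K = variance * np.exp(-0.5 * sq_dists / lengthscale**2)
    return K + jitter * np.eye(len(x))

def sample_gp_layer(x, rng):
    """Draw one function sample from a zero-mean GP evaluated at x."""
    K = rbf_kernel(x)
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal(len(x))

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 100)

# Layer 1: a GP mapping the observed inputs to a hidden representation.
h = sample_gp_layer(x, rng)

# Layer 2: a GP whose inputs are the *outputs* of layer 1 -- the
# composition f2(f1(x)) is what makes the model "deep".
y = sample_gp_layer(h, rng)
```

Sampling from the prior is the easy part; the hard part, and the subject of the talk's first half, is inference, since the hidden layer values are unobserved and must be integrated out rather than optimised.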
The first part of the talk will thus outline the computational tools associated with deep Gaussian processes. In the second part, we will discuss specific variants of the model family, such as dynamical / multi-view / dimensionality reduction models and nonparametric autoencoders.
The above concepts and algorithms will be demonstrated with examples from computer vision (e.g. high-dimensional video, images) and robotics (silhouette/motion capture data).
Bio: Andreas Damianou is finishing his PhD studies in Prof. Neil Lawrence's machine learning group at the University of Sheffield. His research focuses on developing novel, nonparametric deep learning models based on Gaussian processes. The resulting methods have been applied to dynamical systems modelling, computer vision, robotics and bioinformatics.