[Advanced] Michael Clark presents Neural Processes for Timeseries
Neural Processes: what are they and why would you use them?
Mike from Three Springs Technology has been playing with Neural Process models for time-series prediction (and gridding). Let's hurt our brains by trying to understand them and work out why, or where, we might want to use them.
These novel networks were first introduced in 2018. Like Gaussian Processes, they output a distribution over functions; they also have a meta-learning component and a latent space similar to that of variational autoencoders.
Expect a group of frowns staring at a diagram. Expect lots of questions.
From a practical standpoint, you may be interested in this talk if you work with time series and want to estimate uncertainties with your predictions. Especially if they are partly periodic in nature, like currents, tides, or power consumption.
In particular, we will look at how well they forecast power-usage data, how well they handle gridding, and how the uncertainty behaves, especially when combined with Monte Carlo Dropout. That way we can decide whether they are worth trying on our own problems.
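As background for the talk, here is a minimal sketch of the Monte Carlo Dropout idea mentioned above: keep dropout active at prediction time and run many stochastic forward passes, then use the spread of the samples as an uncertainty estimate. The tiny network, weights, and data below are hypothetical stand-ins, not the models from the talk's repositories.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer network with fixed random weights; the point is
# the dropout sampling, not the fit.
W1 = rng.normal(size=(1, 32))
b1 = np.zeros(32)
W2 = rng.normal(size=(32, 1))
b2 = np.zeros(1)
p_drop = 0.1  # probability of dropping each hidden unit

def forward(x, rng):
    h = np.maximum(0.0, x @ W1 + b1)       # ReLU hidden layer
    mask = rng.random(h.shape) >= p_drop   # fresh Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)          # inverted-dropout scaling
    return h @ W2 + b2

x = np.linspace(-1.0, 1.0, 50)[:, None]

# Many stochastic forward passes with dropout left "on"
samples = np.stack([forward(x, rng) for _ in range(200)])  # (200, 50, 1)

mean = samples.mean(axis=0)  # point prediction
std = samples.std(axis=0)    # predictive uncertainty per input point
```

The key trick is simply not disabling dropout at inference: each pass samples a slightly different subnetwork, and the resulting standard deviation gives a cheap, approximate uncertainty band around the mean prediction.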
The code and experiments that this talk is based on are here: https://github.com/3springs/attentive-neural-processes and https://github.com/3springs/np_vs_kriging
If you want to do some pre-reading on this topic, here's a gentle introduction: https://kasparmartens.rbind.io/post/np/
See more about our work at https://threespringstechnology.com
The slides will be available afterward as a link in the comments below.
-------
Secured building doors close at 6 pm sharp. Please arrive early to ensure entrance.
