Learning with Explanations & Bayesian Nonparametric Machine Learning


Details
Please note that photo ID will be required. Attendees should make sure their Meetup profile name includes their full name to guarantee entry.
Agenda:
- 18:30: Doors open, pizza, beer, networking
- 19:00: First talk
- 19:45: Break & networking
- 20:00: Second talk
- 20:45: Close
Sponsors
Man AHL: At Man AHL, we mix machine learning, computer science and engineering with terabytes of data to invest billions of dollars every day.
Evolution AI: Build a state-of-the-art NLP pipeline in seconds.
*Learning with Explanations (Tim Rocktäschel)
Abstract: Despite the success of deep learning models in a wide range of applications, these methods suffer from low sample efficiency and opaqueness. Low sample efficiency limits the application of deep learning to domains for which abundant training data exists, while opaqueness prevents us from understanding how a model derived a particular output, let alone how to correct systematic errors, remove bias, or incorporate common sense and domain knowledge. To address these issues for knowledge base completion, we developed end-to-end differentiable provers. I will present our recent efforts in applying differentiable provers to statements in natural language texts and large-scale knowledge bases. Furthermore, I will introduce two datasets for advancing the development of models capable of incorporating natural language explanations: eSNLI, crowdsourced explanations for over half a million sentence pairs in the Stanford Natural Language Inference corpus, and ShARC, a conversational question answering dataset with natural language rules.
Bio: Tim Rocktäschel is a Research Scientist at Facebook AI Research London and a Lecturer in the Department of Computer Science at UCL. At the University of Oxford, he was a postdoctoral researcher in the Whiteson Research Lab, a Junior Research Fellow in Computer Science at Jesus College, and a Stipendiary Lecturer in Computer Science at Hertford College. He obtained his PhD in the Machine Reading group at UCL. Tim is a recipient of a Google PhD Fellowship in Natural Language Processing and a Microsoft Research PhD Scholarship. His work is at the intersection of deep learning, natural language processing, reinforcement learning, program induction, and formal logic.
*Bayesian nonparametric machine learning through randomized loss functions & posterior bootstraps (Chris Holmes)
Abstract: We introduce Bayesian nonparametric learning of parametric models through the use of suitably randomized objective (loss) functions. Bayesian nonparametric posteriors for model parameters and predictive distributions exhibit provably better properties than their conventional Bayesian counterparts when the models are misspecified. For additive loss functions, inference is achieved through posterior sampling obtained by independent optimizations of randomly re-weighted loss functions, as opposed to Monte Carlo sampling. This avoids issues with MCMC such as burn-in and chain dependence, and is highly scalable on modern computer architectures, allowing samples to be drawn in parallel for the price of a single optimization. We demonstrate the approach on a number of examples, including nonparametric learning for Bayesian logistic regression, variational Bayes (VB), mixture models, and a Bayesian NPL version of random forests.
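To make the randomized-loss idea concrete, here is a minimal Python sketch (assuming NumPy and SciPy) of posterior sampling by independently optimizing randomly re-weighted loss functions, in the spirit of the weighted likelihood bootstrap applied to logistic regression. The toy data, the Exp(1) weighting scheme, and all function names are illustrative assumptions, not code from the talk.

# Minimal sketch: posterior draws via independently re-weighted optimizations.
# Illustrative assumptions: toy 2-feature logistic regression, Exp(1) weights.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy data (assumed for illustration only).
n, d = 200, 2
X = rng.normal(size=(n, d))
true_theta = np.array([1.5, -2.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_theta)))

def weighted_nll(theta, w):
    # Randomly re-weighted negative log-likelihood (the randomized loss).
    z = X @ theta
    per_point = np.logaddexp(0.0, z) - y * z  # log(1 + e^z) - y*z, computed stably
    return np.sum(w * per_point)

def posterior_bootstrap(n_samples=100):
    # Each posterior draw: sample Exp(1) weights, then solve one optimization.
    # The optimizations are independent of one another.
    draws = []
    for _ in range(n_samples):
        w = rng.exponential(1.0, size=n)  # random re-weighting of the loss
        res = minimize(weighted_nll, x0=np.zeros(d), args=(w,), method="L-BFGS-B")
        draws.append(res.x)
    return np.array(draws)

samples = posterior_bootstrap()
print("posterior mean:", samples.mean(axis=0))
print("posterior sd:  ", samples.std(axis=0))

Because each draw is an independent optimization, the loop parallelizes trivially, which is the scalability point made in the abstract.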
Bio: Chris Holmes is Professor of Biostatistics at the University of Oxford and Scientific Director for Health at the Alan Turing Institute, London. Chris oversees a small research group working on probabilistic models and Bayesian decision theory in complex data environments, covering theoretical foundations, methodology, and "hands-on", study-driven data science. He holds a joint Statutory Professorship (Oxford speak for Chair) in Biostatistics. Within the Nuffield medical school, he is an Affiliate Member of the Li Ka Shing Centre for Health Information and Discovery. His research is partly funded through a Programme Leaders award from the UK Medical Research Council (MRC).
