
An introduction to INLA with a comparison to JAGS

Hosted By
Martin G. and Francesco B.

Details

Hi everyone!

Here we announce the June 2015 Meetup, which will host Gianluca Baio (https://www.ucl.ac.uk/statistics/people/gianlucabaio), Lecturer in Statistics and Health Economics at UCL.

This meetup will be more intense and more technical than usual, and it will run longer. The talk will start at 7:15 and last about 1 hour 45 minutes, with a break in between.

Please give your name when you RSVP; it's needed to enter Google Campus.

Food and beers are kindly provided with the support of G-Research.


Schedule:

6:45 networking, beers and food

7:15 talk - first part

8:00 break

8:15 talk - second part

9:00 end and wrap up

Gianluca Baio's Bio:

Gianluca graduated in Statistics and Economics from the University of Florence (Italy). He then completed a PhD programme in Applied Statistics, also at the University of Florence, after a period at the Program on the Pharmaceutical Industry at the MIT Sloan School of Management, Cambridge (USA). He subsequently worked as a Research Fellow and later as a Temporary Lecturer in the Department of Statistical Sciences at University College London (UK).

Synopsis:

An introduction to INLA with a comparison to JAGS

During the last three decades, Bayesian methods have developed greatly and are now widely established in many research areas, from clinical trials, to health economic assessment, to the social sciences, to epidemiology. The main challenge in Bayesian statistics lies in the computational aspects. Markov Chain Monte Carlo (MCMC) methods are normally used for Bayesian computation, arguably thanks to the wide popularity of the BUGS software. While extremely flexible and able to deal with virtually any type of data and model, in all but trivial cases MCMC methods involve computationally- and time-intensive simulations to obtain the posterior distribution for the parameters. Consequently, the complexity of the model and the size of the data often remain fundamental issues. The Integrated Nested Laplace Approximation (INLA) approach has recently been developed as a computationally efficient alternative to MCMC. INLA is designed for latent Gaussian models, a very wide and flexible class of models ranging from (generalized) linear mixed models to spatial and spatio-temporal models. For this reason, INLA can be successfully used in a great variety of applications, also thanks to the availability of an R package named R-INLA. In this talk, we first briefly review the basics of Bayesian computation; then we move on to discuss latent Gaussian models and their computational advantages; finally, we present the fundamental characteristics of the INLA approach. We present a set of worked examples and discuss the modelling assumptions needed, with particular reference to the MCMC counterpart, which we describe using JAGS.
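For those who would like a concrete feel for the comparison before the talk, below is a minimal sketch (not taken from Gianluca's material) of the same Poisson model with a group-level random intercept fitted first with R-INLA and then with JAGS via the rjags package. The simulated data, priors and variable names are illustrative assumptions only.

# Minimal sketch: one latent Gaussian model fitted two ways.
# Data, priors and variable names are illustrative assumptions, not the talk's examples.
library(INLA)   # R-INLA package, see https://www.r-inla.org
library(rjags)  # requires a local JAGS installation

# Simulate counts with a random intercept per group
set.seed(1)
n <- 200; J <- 20
group <- rep(1:J, each = n / J)
x <- rnorm(n)
u <- rnorm(J, 0, 0.5)
y <- rpois(n, exp(0.5 + 0.3 * x + u[group]))
d <- data.frame(y = y, x = x, group = group)

# INLA: approximate posterior marginals, typically in seconds
fit_inla <- inla(y ~ x + f(group, model = "iid"), family = "poisson", data = d)
summary(fit_inla)

# JAGS: the same model via MCMC simulation
model_string <- "
model {
  for (i in 1:n) {
    y[i] ~ dpois(lambda[i])
    log(lambda[i]) <- b0 + b1 * x[i] + u[group[i]]
  }
  for (j in 1:J) { u[j] ~ dnorm(0, tau.u) }
  b0 ~ dnorm(0, 0.001)
  b1 ~ dnorm(0, 0.001)
  tau.u ~ dgamma(0.001, 0.001)
}"
jm <- jags.model(textConnection(model_string),
                 data = list(y = d$y, x = d$x, group = d$group, n = n, J = J),
                 n.chains = 2)
update(jm, 1000)                                        # burn-in
samp <- coda.samples(jm, c("b0", "b1", "tau.u"), n.iter = 5000)
summary(samp)

The inla() call returns approximate posterior marginals directly, while the JAGS run needs a burn-in and enough iterations for the chains to mix; that difference in computational effort is exactly the trade-off discussed in the talk.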
