
Bayesian Statistical Uncertainty Quantification: Inference and Prediction with Computational Models and Big Data

Statistical uncertainty quantification is a modern area of statistics concerned with modeling a wide variety of complex real-world processes that exhibit challenging features, such as high-dimensionality, non-linear response behavior, non-stationarity and "big data".

These problem features have led to many innovations in flexible statistical models, methodology for statistical computation, and efficient use of parallel computing for statistical inference and prediction. The problems themselves can be broadly categorized into two types: (i) those where a well-founded theoretical model is postulated but data collection is expensive or sparse; (ii) those where a theoretical model is unavailable but huge quantities of observational data can be collected.

In this talk I will introduce some popular statistical models used in both areas, including Gaussian process models and Bayesian Additive Regression Trees (BART), and motivate their application with real-world examples.
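As an illustration of the first model class mentioned above, the sketch below fits a Gaussian process regression to a handful of noisy observations and reports predictive uncertainty alongside predictions. This is a generic textbook GP posterior computed with plain NumPy, not code from the talk; the toy sine-wave data and all variable names are invented for the example.

```python
import numpy as np

def rbf(XA, XB, length=1.0, var=1.0):
    # Squared-exponential (RBF) covariance between two sets of inputs.
    d2 = (XA[:, None, :] - XB[None, :, :]) ** 2
    return var * np.exp(-0.5 * d2.sum(-1) / length**2)

rng = np.random.default_rng(0)

# A few "expensive" runs: sparse inputs with a noisy non-linear response.
X = rng.uniform(0, 10, size=(8, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, size=8)
noise = 0.1**2

# Dense grid of prediction points.
Xs = np.linspace(0, 10, 50)[:, None]

# Standard GP posterior via a Cholesky factorization of the
# (training covariance + noise) matrix.
K = rbf(X, X) + noise * np.eye(len(X))
Ks = rbf(Xs, X)
Kss = rbf(Xs, Xs)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mean = Ks @ alpha                      # posterior predictive mean
v = np.linalg.solve(L, Ks.T)
cov = Kss - v.T @ v                    # posterior predictive covariance
std = np.sqrt(np.clip(np.diag(cov), 0, None))

print(mean.shape, std.shape)  # (50,) (50,)
```

The pointwise standard deviation `std` is the uncertainty-quantification part: it shrinks near observed inputs and grows in regions where no data were collected, which is exactly what makes GPs attractive when data are expensive or sparse.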

Matthew Pratola, Assistant Professor of Statistics, Ohio State University

His research focuses on the development of modern statistical methodology for inferential and predictive applications. This includes methodological research in combining computationally expensive simulation models with observational data, and research in non-parametric Bayesian regression tree models for "big data".

Matthew's work is generally developed within the Bayesian paradigm and is often motivated by environmental, environmental health, climate or engineering applications.
