MCMC sampling has made it possible to simulate from complex, massively multivariate distributions even when the distribution is specified only up to an unknown normalizing constant. This talk presents the theory behind maximum likelihood parameter estimation using simulation (MCMC-MLE), along with little-known but vitally important implementation issues and heuristics, focusing on exponential families of distributions. In particular, a useful approximation to the normalizing constant in the exponential family likelihood is presented which increases the stability and accuracy of the MCMC-MLE algorithm. We also show how MCMC standard errors can be used as a measure of when to trust this approximation. Finally, simple examples are used to illustrate how this algorithm can be used to perform inference on a new class of models for social networks dubbed Exponential-Family Random Network Models (ERNM).
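As background for the talk, the core idea can be sketched with the standard importance-sampling identity for exponential families: the ratio of normalizing constants, log Z(θ)/Z(θ₀), equals log E_{θ₀}[exp((θ − θ₀)ᵀ s(X))], which can be estimated from Monte Carlo draws taken at θ₀. The sketch below is illustrative only (it is not the speaker's implementation, and the specific approximation and stability heuristics discussed in the talk are not reproduced here); it uses a toy Bernoulli exponential family where the normalizer is known exactly, so the estimate can be checked.

```python
import math
import random

def log_normalizer_ratio(samples, theta, theta0):
    """Monte Carlo estimate of log(Z(theta)/Z(theta0)) from draws at theta0.

    Uses log Z(theta)/Z(theta0) = log E_{theta0}[exp((theta - theta0) * s(X))],
    where `samples` are sufficient statistics s(x_i) of draws from the model
    at natural parameter theta0. Computed via log-sum-exp for stability.
    """
    d = theta - theta0
    vals = [d * s for s in samples]
    m = max(vals)
    return m + math.log(sum(math.exp(v - m) for v in vals) / len(vals))

# Toy example: Bernoulli in natural-parameter form, s(x) = x, Z(theta) = 1 + e^theta.
# At theta0 = 0 the model is a fair coin, so we can draw exact samples directly
# (in a real application these would come from an MCMC run at theta0).
random.seed(42)
theta0 = 0.0
draws = [1 if random.random() < 0.5 else 0 for _ in range(100_000)]

theta = 1.0
est = log_normalizer_ratio(draws, theta, theta0)
true = math.log((1 + math.exp(theta)) / 2)  # exact log(Z(1)/Z(0))
```

The accuracy of this estimator degrades rapidly as θ moves away from θ₀, which is one reason diagnostics such as MCMC standard errors matter in practice.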
Dr. Ian Fellows is the founder and president of Fellows Statistics Inc., a statistical consulting firm providing advanced analytic support to the corporate and government worlds. His research interests range over many sub-disciplines of statistics, including statistical visualization, where his work won the John Chambers Award in 2011; artificial intelligence, where his Texas Hold'em AI programs were ranked second in the world ([masked]); and Markov chain Monte Carlo methods, which he has used extensively in his research on social network theory. For more information, see http://www.fellstat.com.
There is no validated parking. Free parking is available at 1036 Broxton Ave, Los Angeles, CA 90024. Please check in at the front desk before going upstairs.