ISyE Statistics Seminar - Andrew Brown
Low Rank Independence Samplers in Hierarchical Bayesian Inverse Problems
School of Mathematical and Statistical Sciences
In Bayesian inverse problems, the posterior distribution is used to quantify uncertainty about the reconstructed solution. In fully Bayesian approaches, in which prior parameters are assigned hyperpriors, Markov chain Monte Carlo (MCMC) algorithms are often used to draw approximate samples from the posterior. However, implementations of such algorithms can be computationally expensive. In this talk, I will present a computationally efficient scheme for sampling high-dimensional Gaussian distributions in ill-posed Bayesian linear inverse problems. The approach uses Metropolis-Hastings independence sampling with a proposal distribution based on a low-rank approximation of the prior-preconditioned Hessian. I will present results obtained when using the proposed approach with
Metropolis-Hastings-within-Gibbs sampling in numerical experiments on image deblurring and computed tomography. Time permitting, I will also briefly discuss applying the low-rank approximation idea in marginalization-based MCMC algorithms to improve mixing compared to standard block Gibbs sampling.
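To illustrate the core mechanism the talk builds on, the following is a minimal sketch of a Metropolis-Hastings independence sampler: proposals are drawn from a fixed Gaussian approximation of the target, independently of the current state. The toy target and the diagonal proposal covariance here are illustrative stand-ins, not the low-rank prior-preconditioned Hessian construction from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def independence_sampler(log_target, prop_mean, prop_cov, n_iter=5000):
    """Metropolis-Hastings independence sampler: the proposal is a fixed
    Gaussian N(prop_mean, prop_cov), independent of the current state."""
    d = len(prop_mean)
    L = np.linalg.cholesky(prop_cov)
    cov_inv = np.linalg.inv(prop_cov)

    def log_prop(x):
        # Unnormalized Gaussian log-density of the proposal
        r = x - prop_mean
        return -0.5 * r @ cov_inv @ r

    x = prop_mean.copy()
    samples = np.empty((n_iter, d))
    n_accept = 0
    for t in range(n_iter):
        y = prop_mean + L @ rng.standard_normal(d)
        # Independence-proposal acceptance ratio: pi(y) q(x) / (pi(x) q(y))
        log_alpha = (log_target(y) - log_target(x)
                     + log_prop(x) - log_prop(y))
        if np.log(rng.uniform()) < log_alpha:
            x, n_accept = y, n_accept + 1
        samples[t] = x
    return samples, n_accept / n_iter

# Toy target: a correlated 2-D Gaussian (a stand-in for the conditional
# Gaussian that arises in a hierarchical linear inverse problem).
target_prec = np.linalg.inv(np.array([[2.0, 0.8], [0.8, 1.0]]))
log_target = lambda x: -0.5 * x @ target_prec @ x

# Proposal: a cheap approximation to the target (here, its diagonal).
samples, acc_rate = independence_sampler(
    log_target, np.zeros(2), np.diag([2.0, 1.0]), n_iter=20000)
```

The key property, and the reason a good low-rank approximation matters, is that the acceptance rate of an independence sampler depends on how closely the proposal matches the target: the better the approximation, the closer the acceptance rate is to one.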
Andrew Brown earned his B.S. in Applied Mathematics from Georgia Tech in 2006. After briefly working for Porsche Cars North America in Atlanta, he went to the University of Georgia to earn his M.S. and Ph.D. in Statistics under the direction of Nicole Lazar and Gauri Datta. He then joined the School of Mathematical and Statistical Sciences at Clemson University, where he is currently an Assistant Professor. He spent Spring 2016 as a Visiting Research Fellow at the Statistical and Applied Mathematical Sciences Institute. His research interests include high-dimensional Bayesian modeling and computation, neuroimaging data analysis (particularly functional and structural MRI), computer
experiments, and uncertainty quantification.