ISyE Statistics Seminar - Andrew Brown

Event Details
  • Date/Time:
    • Monday April 8, 2019
      2:00 pm - 3:00 pm
  • Location: Groseclose 402
  • URL: ISyE Building
  • Fee(s): N/A
Summaries

Summary Sentence: Low Rank Independence Samplers in Hierarchical Bayesian Inverse Problems

Full Summary:

Abstract:

In Bayesian inverse problems, the posterior distribution is used to quantify uncertainty about the reconstructed solution. In fully Bayesian approaches, in which prior parameters are assigned hyperpriors, Markov chain Monte Carlo (MCMC) algorithms are often used to approximate samples from the posterior. However, implementations of such algorithms can be computationally expensive. In this talk, I will present a computationally efficient scheme for sampling high-dimensional Gaussian distributions in ill-posed Bayesian linear inverse problems. The approach uses Metropolis-Hastings independence sampling with a proposal distribution based on a low-rank approximation of the prior-preconditioned Hessian. I will present results obtained when using the proposed approach with Metropolis-Hastings-within-Gibbs sampling in numerical experiments in image deblurring and computerized tomography. Time permitting, I will also briefly discuss applying the low-rank approximation idea in marginalization-based MCMC algorithms to improve the mixing behavior when compared to standard block Gibbs sampling.
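The core mechanism the abstract builds on, Metropolis-Hastings independence sampling, can be sketched as follows. This is a generic toy illustration with a one-dimensional Gaussian target and proposal, not the speaker's low-rank, prior-preconditioned construction; the function and distribution choices here are assumptions for demonstration only.

```python
import numpy as np

def mh_independence_sampler(log_target, propose, log_proposal, n_samples, seed=0):
    """Metropolis-Hastings independence sampler.

    Proposals are drawn independently of the current state; a proposed
    point y replaces the current point x with probability
    min(1, [pi(y)/q(y)] / [pi(x)/q(x)]), where pi is the (unnormalized)
    target density and q is the proposal density.
    """
    rng = np.random.default_rng(seed)
    x = propose(rng)  # initialize from the proposal
    samples = np.empty(n_samples)
    for i in range(n_samples):
        y = propose(rng)
        # Log acceptance ratio; normalizing constants cancel.
        log_alpha = (log_target(y) - log_proposal(y)) \
                  - (log_target(x) - log_proposal(x))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        samples[i] = x
    return samples

# Toy example: target N(1, 0.5^2), proposal N(0, 1).
log_target = lambda x: -0.5 * ((x - 1.0) / 0.5) ** 2
log_proposal = lambda x: -0.5 * x ** 2
draws = mh_independence_sampler(
    log_target, lambda rng: rng.normal(), log_proposal, n_samples=5000
)
print(draws.mean(), draws.std())  # should be close to 1.0 and 0.5
```

In the talk's setting the proposal is not a fixed textbook Gaussian but a high-dimensional Gaussian built from a low-rank approximation of the prior-preconditioned Hessian, which makes the proposal cheap to sample while keeping it close to the target conditional.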

Title:

Low Rank Independence Samplers in Hierarchical Bayesian Inverse Problems
Andrew Brown
School of Mathematical and Statistical Sciences
Clemson University


Bio:

Andrew Brown earned his B.S. in Applied Mathematics from Georgia Tech in 2006. After briefly working for Porsche Cars North America in Atlanta, he went to the University of Georgia to earn his M.S. and Ph.D. in Statistics under the direction of Nicole Lazar and Gauri Datta. He then joined the School of Mathematical and Statistical Sciences at Clemson University, where he is currently an Assistant Professor. He spent Spring 2016 as a Visiting Research Fellow at the Statistical and Applied Mathematical Sciences Institute. His research interests include high-dimensional Bayesian modeling and computation, neuroimaging data analysis (particularly functional and structural MRI), computer experiments, and uncertainty quantification.

Additional Information

In Campus Calendar
Yes
Groups

H. Milton Stewart School of Industrial and Systems Engineering (ISYE)

Invited Audience
Faculty/Staff, Postdoc, Public, Graduate students, Undergraduate students
Categories
Seminar/Lecture/Colloquium
Status
  • Created By: Julie Smith
  • Workflow Status: Published
  • Created On: Apr 1, 2019 - 2:11pm
  • Last Updated: Apr 1, 2019 - 2:11pm