PhD Defense by Philipp Witte

Title: Software and Algorithms for Large-Scale Seismic Inverse Problems

 

Philipp A. Witte

Ph.D. Candidate in Computational Science and Engineering

School of Computational Science and Engineering

College of Computing

Georgia Institute of Technology

 

Date: Wednesday, February 19, 2020

Time: 10:00 AM - 12:00 PM (EST)

Location: Coda C1015 ("Vinings")

Committee:

Dr. Felix J. Herrmann (Advisor, School of Computational Science and Engineering, Georgia Institute of Technology)

Dr. Richard Vuduc (School of Computational Science and Engineering, Georgia Institute of Technology)

Dr. Edmond Chow (School of Computational Science and Engineering, Georgia Institute of Technology)

Dr. Justin Romberg (School of Electrical and Computer Engineering, Georgia Institute of Technology)

Dr. Zhigang Peng (School of Earth and Atmospheric Sciences, Georgia Institute of Technology)

 

Abstract:

Seismic imaging and parameter estimation are an important class of inverse problems with practical relevance in resource exploration, carbon control, and monitoring systems for geohazards. The goal of seismic inverse problems is to image subsurface geological structures and to estimate physical rock properties such as wave speed or density. Mathematically, this can be achieved by solving an optimization problem in which we minimize the mismatch between numerically modeled data and observed data from a seismic survey. As wave propagation through a medium is described by wave equations, seismic inverse problems involve solving a large number of partial differential equations (PDEs) with finite-difference modeling during numerical optimization, making them computationally expensive. Additionally, seismic inverse problems are typically ill-posed, non-convex, or ill-conditioned, making them challenging from a mathematical standpoint as well. As in the field of deep learning, this calls for software that is not only optimized for performance, but that also enables geophysical domain specialists to experiment with algorithms in high-level programming languages and in different computing environments, such as high-performance computing (HPC) clusters or the cloud. Furthermore, these problems call for the adaptation of dimensionality reduction techniques and stochastic algorithms to address computational cost from the algorithmic side.
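
Schematically, the optimization problem described above takes the following form. The notation is illustrative and is not quoted from the thesis:

```latex
% Schematic full-waveform inversion (FWI) objective -- illustrative
% notation, not quoted from the thesis:
%   m       -- unknown model parameters, e.g., squared slowness
%   q_i     -- source term of the i-th seismic experiment (shot)
%   d_i     -- observed data of the i-th shot
%   F(m; q) -- forward modeling: solve the wave equation for model m
%              and source q, then sample the wavefield at the receivers
\min_{m} \; \Phi(m) \;=\; \sum_{i=1}^{n_s} \frac{1}{2}
  \big\| F(m;\, q_i) - d_i \big\|_2^2
```

Evaluating the gradient of this objective with the adjoint-state method costs two PDE solves per shot (one forward, one adjoint), which is why surveys with thousands of shots make these problems expensive at scale.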

 

This thesis makes three distinct contributions that address computational challenges encountered in seismic inverse problems and facilitate algorithmic development in this field. Part one introduces a large-scale framework for seismic modeling and inversion based on the paradigm of separation of concerns, which combines a user interface built on domain-specific abstractions with a Python package for automatic code generation to solve the underlying PDEs. The modular code structure makes it possible to manage the complexity of a seismic inversion code, while matrix-free linear operators and data containers enable the implementation of algorithms in a fashion that closely resembles the underlying mathematical notation. The second contribution is an algorithm for seismic imaging that addresses its high computational cost and large memory footprint through a combination of on-the-fly Fourier transforms, stochastic sampling techniques, and sparsity-promoting optimization. The algorithm combines the best of both time- and frequency-domain inversion: the memory footprint is independent of the number of modeled time steps, while on-the-fly time-to-frequency conversions avoid the need to solve Helmholtz equations, which involve inverting ill-conditioned matrices. Part three introduces a novel approach for adapting the cloud to high-performance computing applications such as seismic imaging that does not rely on a fixed cluster of permanently running virtual machines. Instead, computational resources are automatically started and terminated by the cloud environment during runtime, and the workflow takes advantage of cloud-native technologies such as event-driven computations and containerized batch processing. A performance and cost analysis shows that this approach addresses current shortcomings of the cloud, such as limited resilience, while reducing operating cost by up to an order of magnitude. As such, the workflow provides a strategy for cost-effectively running large-scale seismic imaging problems in the cloud and is a viable alternative to conventional HPC clusters.
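
To give a flavor of the matrix-free linear operators described in part one, the sketch below shows the basic idea in Python. The class and its interface are hypothetical and are not the framework's actual API:

```python
import numpy as np

class MatrixFreeOperator:
    """Hypothetical sketch of a matrix-free linear operator: it stores
    the forward and adjoint actions as callables, never the matrix."""

    def __init__(self, shape, forward, adjoint):
        self.shape = shape        # (n_rows, n_cols)
        self._forward = forward   # computes A @ x
        self._adjoint = adjoint   # computes A^T @ y

    def __mul__(self, x):
        return self._forward(x)

    @property
    def T(self):
        # The adjoint operator simply swaps the two actions.
        return MatrixFreeOperator(self.shape[::-1], self._adjoint, self._forward)

# Example: first-order finite-difference operator D on n points.
n = 100
D = MatrixFreeOperator(
    (n - 1, n),
    forward=lambda x: np.diff(x),
    adjoint=lambda y: np.concatenate(([-y[0]], -np.diff(y), [y[-1]])),
)

x = np.random.randn(n)
r = D * x      # forward action; the (n-1) x n matrix is never built
g = D.T * r    # adjoint action, as used in gradient computations
```

Because only the two actions are stored, memory use stays small even when the operator represents a wave-equation solve, and inversion code such as g = A.T * (A * x - d) reads almost like the underlying mathematics.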

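The containerized batch processing mentioned in part three can be sketched in the same spirit. The snippet below submits one gradient computation per seismic shot as an AWS Batch array job via the boto3 client; all resource names are hypothetical placeholders, and this is not the workflow code from the thesis:

```python
import boto3

batch = boto3.client("batch")

# Submit an array job: one containerized task per seismic shot.
# Queue and job definition names are hypothetical placeholders.
response = batch.submit_job(
    jobName="seismic-gradient-shots",
    jobQueue="seismic-imaging-queue",
    jobDefinition="gradient-worker",
    arrayProperties={"size": 960},  # e.g., a survey with 960 shots
)
print("Submitted array job:", response["jobId"])
```

In this model, the cloud scheduler starts compute instances only while tasks are running and terminates them afterwards, which is the pay-per-use alternative to a permanently running cluster described in the abstract.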
 
