
ISyE Seminar - Cong Ma


Title: Nonconvex Optimization Meets Statistics: Towards Rigorous Computational and Inferential Guarantees

Abstract: In recent years, there has been an explosion of interest in designing fast nonconvex optimization algorithms to solve statistical estimation and learning problems. However, in contrast to convex optimization that has become a real pillar of modern engineering, the theoretical foundations of nonconvex optimization are far from satisfactory, especially in terms of its computational and inferential properties. This talk will present two recent stories that advance our understanding of nonconvex statistical estimation. The first story focuses on computational efficiency in solving random quadratic systems of equations. Despite the nonconvexity of the natural least-squares formulation, gradient descent with random initialization finds its global solution within a logarithmic number of iterations. The second story is concerned with uncertainty quantification for nonconvex low-rank matrix completion. We develop a de-biased estimator — on the basis of a nonconvex estimator — that enables optimal construction of confidence intervals for the missing entries of the unknown matrix. All of this is achieved via an integrated view of statistics and optimization.
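The first story refers to a concrete procedure: running plain gradient descent, from a random starting point, on the natural least-squares loss for a random quadratic system y_i = (a_i^T x*)^2. The sketch below is only a toy illustration of that setup in Python; the problem sizes, step-size heuristic, and iteration budget are assumptions chosen for this demo, not the parameters or analysis from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 100, 1000                       # signal dimension, number of equations (demo sizes)
x_star = rng.standard_normal(n)        # unknown signal to recover
A = rng.standard_normal((m, n))        # random sensing vectors a_i as rows of A
y = (A @ x_star) ** 2                  # quadratic measurements y_i = (a_i^T x*)^2

def grad(x):
    """Gradient of the least-squares loss f(x) = (1/4m) * sum_i ((a_i^T x)^2 - y_i)^2."""
    Ax = A @ x
    return A.T @ ((Ax ** 2 - y) * Ax) / m

x = rng.standard_normal(n)             # random initialization, no spectral warm start
eta = 0.1 / np.mean(y)                 # heuristic step size; mean(y) estimates ||x*||^2
for _ in range(2000):
    x -= eta * grad(x)

# The signal is only identifiable up to a global sign, so compare against +/- x_star.
err = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
print(f"relative error: {err / np.linalg.norm(x_star):.2e}")
```

On Gaussian designs of roughly this size the iterates typically converge to one of the two global solutions, which mirrors the abstract's claim that no careful initialization is needed.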

Bio: Cong Ma is currently a Ph.D. student in the Department of Operations Research and Financial Engineering at Princeton University, advised by Yuxin Chen and Jianqing Fan. His research interests include nonconvex optimization, high-dimensional statistics, and machine learning, as well as their applications to computational neuroscience.
