ISyE Seminar - Cong Ma

Event Details
  • Date/Time:
    • Tuesday, February 18, 2020
      11:00 am - 12:00 pm
  • Location: ISyE Main Room 228
  • Fee(s):
    N/A
Contact
No contact information submitted.
Summaries

Summary Sentence: Nonconvex Optimization Meets Statistics: Towards Rigorous Computational and Inferential Guarantees

Full Summary: Title: Nonconvex Optimization Meets Statistics: Towards Rigorous Computational and Inferential Guarantees
Abstract: In recent years, there has been an explosion of interest in designing fast nonconvex optimization algorithms to solve statistical estimation and learning problems. However, in contrast to convex optimization, which has become a pillar of modern engineering, the theoretical foundations of nonconvex optimization are far from satisfactory, especially in terms of its computational and inferential properties. This talk will present two recent stories that advance our understanding of nonconvex statistical estimation. The first story focuses on computational efficiency in solving random systems of quadratic equations: despite the nonconvexity of the natural least-squares formulation, gradient descent with random initialization finds its global solution within a logarithmic number of iterations. The second story concerns uncertainty quantification for nonconvex low-rank matrix completion: we develop a de-biased estimator, built on the basis of a nonconvex estimator, that enables optimal construction of confidence intervals for the missing entries of the unknown matrix. All of this is achieved via an integrated view of statistics and optimization.

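To make the first story concrete, the setting is recovering a signal x_star from quadratic measurements y_i = (a_i^T x_star)^2 by running gradient descent on the least-squares loss. The sketch below is a minimal, hypothetical toy instance, not the speaker's exact algorithm; the dimensions, step size, and iteration count are illustrative choices, and the step size in particular is a heuristic rather than a tuned or theoretical value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: recover x_star from quadratic measurements y_i = (a_i^T x_star)^2
n, m = 20, 200                        # signal dimension, number of measurements
x_star = rng.standard_normal(n)
A = rng.standard_normal((m, n))       # rows are the random sensing vectors a_i
y = (A @ x_star) ** 2

def grad(x):
    """Gradient of the least-squares loss f(x) = (1/4m) * sum_i ((a_i^T x)^2 - y_i)^2."""
    Ax = A @ x
    return A.T @ ((Ax ** 2 - y) * Ax) / m

# Plain gradient descent from a random initialization (no spectral warm start).
# The step size below is a heuristic choice for this toy instance.
x = rng.standard_normal(n)
eta = 0.05 / np.linalg.norm(x_star) ** 2
for _ in range(2000):
    x -= eta * grad(x)

# The signal is identifiable only up to a global sign flip, so compare against both.
err = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
print(f"relative error: {err / np.linalg.norm(x_star):.2e}")
```

Note that the loss is nonconvex (it has saddle points and a sign-flipped global minimizer), yet on random Gaussian instances like this one, plain gradient descent from a random start typically converges to a global solution, which is the phenomenon the first story makes rigorous.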

Bio: Cong Ma is currently a Ph.D. student in the Department of Operations Research and Financial Engineering at Princeton University, advised by Yuxin Chen and Jianqing Fan. His research interests include nonconvex optimization, high-dimensional statistics, and machine learning, as well as their applications to computational neuroscience.

Additional Information

In Campus Calendar
Yes
Groups

School of Industrial and Systems Engineering (ISYE)

Invited Audience
Faculty/Staff, Postdoc, Public, Graduate students, Undergraduate students
Categories
Seminar/Lecture/Colloquium
Keywords
No keywords were submitted.
Status
  • Created By: sbryantturner3
  • Workflow Status: Published
  • Created On: Feb 7, 2020 - 9:22am
  • Last Updated: Feb 11, 2020 - 12:20pm