
Seminar - Tuo Zhao


TITLE:  Compute Faster and Learn Better: Machine Learning via Nonconvex Model-based Optimization

ABSTRACT:

Nonconvex optimization naturally arises in many machine learning problems (e.g., sparse learning, matrix factorization, and tensor decomposition). Machine learning researchers exploit various nonconvex formulations to gain modeling flexibility, estimation robustness, adaptivity, and computational scalability. Although classical computational complexity theory has shown that solving nonconvex optimization problems is generally NP-hard in the worst case, practitioners have proposed numerous heuristic optimization algorithms that achieve outstanding empirical performance in real-world applications.

To bridge this gap between practice and theory, we propose a new generation of model-based optimization algorithms and theory, which incorporate statistical thinking into modern optimization. In particular, when designing practical computational algorithms, we take the underlying statistical models into consideration (e.g., sparsity, low-rankness). Our novel algorithms exploit hidden geometric structures behind many nonconvex optimization problems, and can obtain global optima with the desired statistical properties in polynomial time with high probability.
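To give a concrete flavor of the model-based idea, here is a minimal sketch (not the speaker's algorithm) of one well-known nonconvex method for sparse learning: iterative hard thresholding, which bakes the sparsity assumption directly into a gradient-descent loop by keeping only the k largest coefficients at every step. The function name and the synthetic example are illustrative, not from the talk.

```python
import numpy as np

def iterative_hard_thresholding(X, y, k, n_iter=500):
    """Sparse least squares via iterative hard thresholding (IHT).

    Each iteration takes a gradient step on 0.5 * ||X beta - y||^2,
    then projects onto the (nonconvex) set of k-sparse vectors by
    zeroing all but the k largest-magnitude entries.
    """
    n, d = X.shape
    # Step size 1/L, where L = sigma_max(X)^2 is the Lipschitz
    # constant of the least-squares gradient.
    step = 1.0 / np.linalg.norm(X, 2) ** 2
    beta = np.zeros(d)
    for _ in range(n_iter):
        beta = beta - step * (X.T @ (X @ beta - y))
        # Hard-thresholding projection: keep the k largest entries.
        keep = np.argpartition(np.abs(beta), -k)[-k:]
        mask = np.zeros(d, dtype=bool)
        mask[keep] = True
        beta[~mask] = 0.0
    return beta

# Illustrative usage: recover a 3-sparse signal from noiseless data.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
beta_true = np.zeros(20)
beta_true[[0, 5, 10]] = [2.0, -3.0, 1.5]
y = X @ beta_true
beta_hat = iterative_hard_thresholding(X, y, k=3)
```

Despite the nonconvex sparsity constraint, under standard conditions (enough well-conditioned samples relative to the sparsity level) this kind of projected gradient scheme provably converges to the true sparse solution, which is exactly the flavor of guarantee the abstract describes.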

Bio: Tuo Zhao is a PhD student in the Department of Computer Science at Johns Hopkins University (http://www.cs.jhu.edu/~tour). His research focuses on high-dimensional parametric and semiparametric learning, large-scale optimization, and applications to computational genomics and neuroimaging. He was a core member of the JHU team that won the INDI ADHD-200 global competition on fMRI imaging-based diagnosis classification in 2011. He received a Siebel Scholarship in 2014 and a Baidu research fellowship in 2015.

Status

  • Workflow Status: Published
  • Created By: Anita Race
  • Created: 02/24/2016
  • Modified By: Fletcher Moore
  • Modified: 04/13/2017
