ISyE Seminar - Elad Hazan

TITLE:   Efficient Second-order Optimization for Machine Learning

ABSTRACT: 

Stochastic gradient-based methods are the state of the art in large-scale machine learning optimization due to their extremely efficient per-iteration computational cost. Second-order methods, which use the second derivative of the optimization objective, are known to enable faster convergence. However, they have been much less explored due to the high cost of computing second-order information. We will present second-order stochastic methods for (convex and non-convex) optimization problems arising in machine learning that match the per-iteration cost of gradient descent, yet enjoy the faster convergence properties of second-order optimization.

Joint work with Naman Agarwal and Brian Bullins (ICML '16), and with Agarwal, Bullins, Allen-Zhu, and Ma (STOC '17).
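
To give a concrete sense of the idea, here is a minimal Python sketch (not the exact algorithm from the papers above) of a stochastic Newton step for l2-regularized logistic regression. It estimates the inverse-Hessian-vector product via the truncated Neumann series H^{-1} g ≈ Σ_i (I - H)^i g, using one per-sample Hessian-vector product per inner step, so each iteration costs O(d), on the order of a stochastic gradient step. The dataset, step size, recursion depth, and function names (grad, hess_vec, newton_step_estimate) are illustrative assumptions, not taken from the talk.

import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 1000, 20, 0.1
X = rng.normal(size=(n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # unit-norm rows keep the per-sample Hessian spectrum within (0, 1]
y = rng.integers(0, 2, size=n) * 2 - 1          # labels in {-1, +1}

def grad(w, idx):
    # stochastic gradient of the regularized logistic loss on minibatch idx
    z = y[idx] * (X[idx] @ w)
    s = -y[idx] / (1.0 + np.exp(z))
    return X[idx].T @ s / len(idx) + lam * w

def hess_vec(w, v, i):
    # Hessian-vector product for a single sample i, computed in O(d) time
    z = y[i] * (X[i] @ w)
    p = 1.0 / (1.0 + np.exp(-z))
    return p * (1.0 - p) * (X[i] @ v) * X[i] + lam * v

def newton_step_estimate(w, g, depth=50):
    # truncated Neumann recursion: v <- g + (I - H_i) v with a freshly sampled H_i
    v = g.copy()
    for _ in range(depth):
        i = rng.integers(n)
        v = g + v - hess_vec(w, v, i)
    return v

w = np.zeros(d)
for t in range(200):
    idx = rng.integers(0, n, size=32)
    w -= 0.5 * newton_step_estimate(w, grad(w, idx))   # step size is illustrative

The key design point this sketch illustrates is that the Hessian is never formed or inverted explicitly; only Hessian-vector products on single samples are used, which is what keeps the per-iteration cost comparable to stochastic gradient descent.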

BIO: 

Elad Hazan is a professor of computer science at Princeton University. He joined Princeton in 2015 from the Technion, where he had been an associate professor of operations research. His research focuses on the design and analysis of algorithms for fundamental problems in machine learning and optimization. Among his contributions are the co-development of the AdaGrad algorithm for training machine learning models and the first sublinear-time algorithms for convex optimization. He is a two-time recipient of the IBM Goldberg Best Paper Award, in 2012 for contributions to sublinear-time algorithms for machine learning and in 2008 for decision making under uncertainty, and has received a European Research Council grant, a Marie Curie fellowship, two Google Research Awards, and the Bell Labs Prize. He serves on the steering committee of the Association for Computational Learning and was program chair of COLT 2015.
