DOS Seminar


TITLE:  Lower Complexity Bounds for Large-Scale Smooth Convex Optimization

SPEAKER:  Cristobal Guzman


We prove lower bounds on the black-box oracle complexity of large-scale
smooth convex minimization problems. These lower bounds apply to unit balls
of normed spaces satisfying a technical smoothing condition, and to an
arbitrary smoothness parameter of the objective with respect to this norm.
As a consequence, we obtain a unified framework for the complexity of
convex optimization on \ell^p-balls, for 2<=p<=\infty. In particular, we
prove that the T-step Conditional Gradient algorithm, as applied to
minimizing smooth convex functions over the n-dimensional box with T<=n,
is nearly optimal.
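For readers unfamiliar with the method discussed above, here is a minimal numerical sketch of the T-step Conditional Gradient (Frank-Wolfe) algorithm over the n-dimensional box [-1,1]^n. The least-squares objective, the step-size rule, and all variable names are illustrative assumptions, not part of the talk; the only talk-specific ingredient is that the linear minimization oracle over the box has the closed form s = -sign(grad).

```python
import numpy as np

def frank_wolfe_box(grad, x0, T):
    """T steps of Conditional Gradient over the box [-1, 1]^n."""
    x = x0.copy()
    for t in range(T):
        g = grad(x)
        # Linear minimization oracle over the box: argmin_{s in [-1,1]^n} <g, s>
        s = -np.sign(g)
        # Standard open-loop step size for Frank-Wolfe
        gamma = 2.0 / (t + 2)
        x = (1 - gamma) * x + gamma * s
    return x

# Illustrative smooth convex objective: f(x) = 0.5 * ||Ax - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
grad = lambda x: A.T @ (A @ x - b)

x = frank_wolfe_box(grad, np.zeros(10), T=200)
```

The talk's result concerns the regime T <= n, where this algorithm's convergence rate on smooth objectives is shown to essentially match the lower bound for the problem class.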

On the other hand, we prove lower bounds for the complexity of convex
optimization over \ell^p-balls, for 1<=p<2, by combining a random subspace
method with the p=\infty lower bounds. In particular, we establish the
complexity of problem classes that contain the sparse recovery problem.
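As a rough illustration of the random subspace idea mentioned above (not the talk's actual hard-instance construction), one can draw a random k-dimensional subspace of R^n via QR factorization of a Gaussian matrix and restrict an objective defined on R^n to that subspace; all names and dimensions below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 50, 5                        # ambient dimension, subspace dimension

# Orthonormal basis of a uniformly random k-dimensional subspace of R^n
G = rng.standard_normal((n, k))
Q, _ = np.linalg.qr(G)

# An illustrative smooth objective on R^n, and its restriction to the
# subspace: y in R^k parametrizes the point Q @ y in range(Q)
f = lambda x: 0.5 * np.sum(x ** 2)
restricted = lambda y: f(Q @ y)
```

The lower-bound argument reasons about objectives of this restricted form: hardness established for the subspace problem transfers to the ambient \ell^p-ball problem.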

This is joint work with Arkadi Nemirovski.



