PhD Thesis Defense: Cristobal Guzman

Event Details
  • Date/Time:
    • Monday March 30, 2015
      9:00 am - 11:00 am
  • Location: Academic Conference Room - Groseclose Building - Room 204
Summaries

Summary Sentence: Information, Complexity and Structure in Convex Optimization


Final doctoral examination and defense of the dissertation of Cristobal Guzman


Title: Information, Complexity and Structure in Convex Optimization

Time: Monday, March 30, 9:00am
Location: Academic Conference Room - Groseclose Building - Room 204
Advisors: Prof. Arkadi Nemirovski and Prof. Sebastian Pokutta

Committee:
Prof. Arkadi Nemirovski, School of Industrial and Systems Engineering
Prof. Sebastian Pokutta, School of Industrial and Systems Engineering
Prof. Shabbir Ahmed, School of Industrial and Systems Engineering
Prof. Santosh Vempala, School of Computer Science
Prof. Alexandre d'Aspremont, Département d'Informatique, École Normale
Supérieure

Reader: Prof. Alexandre d'Aspremont, Département d'Informatique, École
Normale Supérieure

SUMMARY: This thesis focuses on the limits of performance of large-scale
convex optimization algorithms. The classical theory of oracle complexity,
first proposed by Nemirovski and Yudin in 1983, established the worst-case
behavior of methods based on local oracles (a generalization of the
first-order oracle for smooth functions) for nonsmooth convex minimization,
in both the large-scale and low-scale regimes, as well as the complexity of
approximately solving linear systems of equations (equivalent to convex
quadratic minimization) over Euclidean balls under a matrix-vector
multiplication oracle. Our work extends the applicability of lower bounds
in two directions.

Worst-Case Complexity of Large-Scale Smooth Convex Optimization: We
generalize lower bounds on the complexity of first-order methods for convex
optimization to classes of convex functions with Hölder continuous
gradients. Our technique relies on the existence of a smoothing kernel,
which defines a smooth approximation of any convex function via infimal
convolution. As a consequence, we derive lower bounds for ℓ_p/ℓ_q-setups,
where 1 ≤ p, q ≤ ∞, and extend them to the matrix analogue: smooth (w.r.t.
the Schatten q-norm) convex minimization over matrices with bounded
Schatten p-norm. The major consequences of this result are the
near-optimality of the conditional gradient method over box-type domains
(p = q = ∞) and the near-optimality of Nesterov's accelerated method over
the cross-polytope (p = q = 1).
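For readers unfamiliar with the terminology, the two notions central to the result above can be stated as follows. These are the standard textbook definitions, not formulas taken from the thesis itself:

```latex
% Hölder continuous gradient with exponent \nu \in (0,1] and constant L > 0
% (\nu = 1 recovers the usual Lipschitz-gradient, i.e. smooth, class):
\| \nabla f(x) - \nabla f(y) \|_{*} \;\le\; L \, \| x - y \|^{\nu}
  \qquad \forall\, x, y.

% Infimal convolution of a convex function f with a smoothing kernel g,
% which yields a smooth approximation of f when g is a suitable kernel:
(f \,\square\, g)(x) \;=\; \inf_{y} \,\bigl\{\, f(y) + g(x - y) \,\bigr\}.
```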
The thesis is available for public inspection in the School of
Mathematics lounge (Skiles 236), the ARC lounge (Klaus 2222), the ISyE
PhD student lounge, and at the URL

         http://www.aco.gatech.edu/dissert/guzman.html

Additional Information

In Campus Calendar
No
Groups

Graduate Studies

Invited Audience
Public
Categories
Other/Miscellaneous
Keywords
graduate students, PhD Defense
Status
  • Created By: Danielle Ramirez
  • Workflow Status: Published
  • Created On: Mar 23, 2015 - 12:36pm
  • Last Updated: Oct 7, 2016 - 9:46pm