
PhD Defense by Daniel Zink

Title: A Reduction Framework for Approximate Extended Formulations
          and a Faster Algorithm for Convex Optimization

Time: Wednesday, March 29, 2017, 10:00am

Location: ISyE Groseclose 402

Advisor: Prof. Sebastian Pokutta

Committee:
Prof. Sebastian Pokutta, School of Industrial and Systems Engineering
Prof. Greg Blekherman, School of Mathematics
Prof. Santanu Dey, School of Industrial and Systems Engineering
Prof. George Lan, School of Industrial and Systems Engineering
Prof. Santosh Vempala, School of Computer Science

Reader: Prof. George Lan, School of Industrial and Systems Engineering

The thesis is available for public inspection in the School of
Mathematics lounge (Skiles 236), the ARC lounge (Klaus 2222), the ISyE
PhD student lounge, and at the following URL:

http://aco.gatech.edu/events/final-doctoral-examination-and-defense-dissertation-daniel-zink


SUMMARY: Linear programming (LP) and semidefinite programming (SDP) are
among the most important tools in Operations Research and Computer
Science. In this work we study the limitations of LPs and SDPs by
providing lower bounds on the size of (approximate) linear and
semidefinite programming formulations of combinatorial optimization
problems. The hardness of (approximate) linear optimization implied by
these lower bounds motivates the lazification technique for conditional
gradient type algorithms. This technique allows us to replace
(approximate) linear optimization with a much weaker subroutine,
achieving significant performance improvements in practice. The main
contributions can be summarized as follows:
(i) Reduction framework for LPs and SDPs: We present a new view on
extended formulations that does not require an initial encoding of a
combinatorial problem as a linear or semidefinite program. This new view
allows us to define a purely combinatorial reduction framework
transferring lower bounds on the size of exact and approximate LP and
SDP formulations between different problems. Using our framework, we
show new LP and SDP lower bounds for a large variety of problems,
including Vertex Cover, various (binary and non-binary) constraint
satisfaction problems, as well as multiple optimization versions of
Graph Isomorphism.
(ii) Lazification technique for Conditional Gradient algorithms: In
convex programming, conditional gradient type algorithms (also known as
Frank-Wolfe type methods) are important in both theory and practice due
to their simplicity and fast convergence. We show how to eliminate the
linear optimization step performed in every iteration of Frank-Wolfe
type methods and use a weak separation oracle instead. This oracle is
significantly faster in practice and enables caching, which yields
additional improvements in both speed and the sparsity of the obtained
solutions (see the sketch following this summary).
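For readers unfamiliar with the lazification idea, the following Python
sketch illustrates one way a weak separation oracle can be emulated by a
cache of previously seen vertices on top of an exact LP oracle. The
function names, the parameters phi and K, and the toy simplex example
are illustrative assumptions for this page, not code from the thesis.

import numpy as np

def weak_separation(grad, x, phi, K, cache, lp_oracle):
    """Return a vertex y with <grad, x - y> >= phi / K if one is found
    (positive answer), otherwise None (negative answer)."""
    # First try cached vertices: no LP call is needed if one of them works.
    for v in cache:
        if grad @ (x - v) >= phi / K:
            return v
    # Fall back to the exact LP oracle over the feasible region.
    v = lp_oracle(grad)
    cache.append(v)
    if grad @ (x - v) >= phi / K:
        return v
    return None  # certifies that no vertex improves by more than phi

def lazy_frank_wolfe(f_grad, lp_oracle, x0, iterations=100, K=2.0, phi0=1.0):
    """Lazified conditional gradient step: the progress estimate phi is
    halved whenever the oracle answers negatively."""
    x, phi, cache = np.array(x0, dtype=float), phi0, []
    for t in range(iterations):
        grad = f_grad(x)
        v = weak_separation(grad, x, phi, K, cache, lp_oracle)
        if v is None:
            phi /= 2.0                # negative answer: tighten the estimate
        else:
            gamma = 2.0 / (t + 2.0)   # standard Frank-Wolfe step size
            x = (1.0 - gamma) * x + gamma * v
    return x

# Toy usage: minimize ||x - b||^2 over the probability simplex, whose exact
# LP oracle simply picks the coordinate with the smallest gradient entry.
if __name__ == "__main__":
    b = np.array([0.2, 0.5, 0.3])
    f_grad = lambda x: 2.0 * (x - b)
    lp_oracle = lambda c: np.eye(len(c))[np.argmin(c)]
    print(lazy_frank_wolfe(f_grad, lp_oracle, x0=[1.0, 0.0, 0.0]))

With an exact LP oracle plugged in as above, the scheme reduces to a
cached variant of standard Frank-Wolfe; the point of the weak separation
oracle in the thesis is that it can be answered far more cheaply than a
full linear optimization call.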
 
