<node id="588915">
  <nid>588915</nid>
  <type>event</type>
  <uid>
    <user id="27707"><![CDATA[27707]]></user>
  </uid>
  <created>1489695534</created>
  <changed>1489695534</changed>
  <title><![CDATA[PhD Defense by Daniel Zink]]></title>
  <body><![CDATA[<p><strong>Title:</strong> A Reduction Framework for Approximate Extended Formulations<br />
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; and a Faster Algorithm for Convex Optimization<br />
<br />
<strong>Time</strong>: Wednesday, March 29, 10:00am<br />
<br />
<strong>Location:</strong> ISyE Groseclose 402<br />
<br />
<strong>Advisor</strong>: Prof. Sebastian Pokutta<br />
<br />
<strong>Committee:</strong><br />
Prof. Sebastian Pokutta, School of Industrial and Systems Engineering<br />
Prof. Greg Blekherman, School of Mathematics<br />
Prof. Santanu Dey, School of Industrial and Systems Engineering<br />
Prof. George Lan, School of Industrial and Systems Engineering<br />
Prof. Santosh Vempala, School of Computer Science<br />
<br />
<strong>Reader:</strong> Prof. George Lan, School of Industrial and Systems Engineering<br />
<br />
The thesis is available for public inspection in the School of<br />
Mathematics lounge (Skiles 236), the ARC lounge (Klaus 2222), the ISyE<br />
PhD student lounge and at the URL<br />
<br />
<a href="http://aco.gatech.edu/events/final-doctoral-examination-and-defense-dissertation-daniel-zink" target="_blank">http://aco.gatech.edu/events/final-doctoral-examination-and-defense-dissertation-daniel-zink</a><br />
<br />
<br />
<strong>SUMMARY:</strong> Linear programming (LP) and semidefinite programming (SDP) are<br />
among the most important tools in Operations Research and Computer<br />
Science. In this work we study the limitations of LPs and SDPs by<br />
providing lower bounds on the size of (approximate) linear and<br />
semidefinite programming formulations of combinatorial optimization<br />
problems. The hardness of (approximate) linear optimization implied by<br />
these lower bounds motivates the lazification technique for conditional<br />
gradient type algorithms. This technique allows us to replace<br />
(approximate) linear optimization with a much weaker subroutine,<br />
achieving significant performance improvements in practice. We can<br />
summarize the main contributions as follows:<br />
(i) Reduction framework for LPs and SDPs: We present a new view on<br />
extended formulations that does not require an initial encoding of a<br />
combinatorial problem as a linear or semidefinite program. This new view<br />
allows us to define a purely combinatorial reduction framework<br />
transferring lower bounds on the size of exact and approximate LP and<br />
SDP formulations between different problems. Using our framework we show<br />
new LP and SDP lower bounds for a large variety of problems including<br />
Vertex Cover, various (binary and non-binary) constraint satisfaction<br />
problems as well as multiple optimization versions of Graph-Isomorphism.<br />
(ii) Lazification technique for Conditional Gradient algorithms: In<br />
Convex Programming, conditional gradient type algorithms (also known as<br />
Frank-Wolfe type methods) are important in both theory and practice due<br />
to their simplicity and fast convergence. We show how to eliminate the<br />
linear optimization step performed in every iteration of Frank-Wolfe<br />
type methods and instead use a weak separation oracle. This oracle is<br />
significantly faster in practice and enables caching, yielding further<br />
improvements in speed and in the sparsity of the obtained solutions.<br />
</p>
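<p>To give a rough sense of the lazification idea in (ii), the following is a minimal illustrative sketch, not the algorithm from the thesis: a Frank-Wolfe loop over the probability simplex in which a cache of previously seen feasible points is consulted before calling the exact linear minimization oracle (LMO). All function names, the improvement target <code>phi</code>, and its halving rule are assumptions made for this toy example.</p>

```python
import numpy as np

def lmo_simplex(grad):
    # Exact linear minimization oracle over the probability simplex:
    # the minimizer of <grad, v> is the vertex at the smallest entry.
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def weak_separation(grad, x, cache, phi):
    # Weak separation oracle (sketch): first scan cached points for one
    # with improvement <grad, x - v> >= phi; only when none qualifies,
    # fall back to the exact LMO.  If even the LMO misses the target,
    # shrink phi (a simplified stand-in for the oracle's negative
    # answer in the lazified algorithms).
    for v in cache:
        if grad @ (x - v) >= phi:
            return v, phi
    v = lmo_simplex(grad)
    if grad @ (x - v) >= phi:
        cache.append(v)
        return v, phi
    return v, max(grad @ (x - v), 0.0) / 2.0

def lazy_frank_wolfe(grad_f, x0, steps=2000):
    x = x0.astype(float).copy()
    cache = [x.copy()]
    phi = 1.0  # initial improvement target (assumed)
    for t in range(steps):
        g = grad_f(x)
        v, phi = weak_separation(g, x, cache, phi)
        gamma = 2.0 / (t + 2)          # standard Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * v
    return x
```

<p>For instance, minimizing f(x) = ||x - b||&sup2; over the simplex with b = (0.2, 0.3, 0.5) drives the iterate toward b, while most iterations are answered from the cache rather than by the exact LMO.</p>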
]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[A Reduction Framework for Approximate Extended Formulations and a Faster Algorithm for Convex Optimization]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2017-03-29T11:00:00-04:00]]></value>
      <value2><![CDATA[2017-03-29T13:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>221981</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[Graduate Studies]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1788</tid>
        <value><![CDATA[Other/Miscellaneous]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>100811</tid>
        <value><![CDATA[Phd Defense]]></value>
      </item>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
