PhD Proposal by Daniel Zink

Event Details
  • Date/Time:
    • Monday, September 12, 2016
      1:00 pm - 3:00 pm
  • Location: ISyE 226A
Contact
No contact information submitted.
Summaries

Summary Sentence: A Reduction Framework for Extended Formulations and a Faster Algorithm for Online Convex Optimization

Full Summary: No summary paragraph submitted.

Ph.D. Thesis Proposal Announcement

Title: A Reduction Framework for Extended Formulations and a Faster Algorithm for Online Convex Optimization

Daniel Zink
Ph.D. Student
Algorithms, Combinatorics, and Optimization
H. Milton Stewart School of Industrial & Systems Engineering
Georgia Institute of Technology

Date: Monday, September 12th, 2016
Time: 1:00 pm
Location: ISyE 226A

Committee
----------------
Dr. Sebastian Pokutta (Advisor, School of Industrial & Systems Engineering, Georgia Institute of Technology)

Dr. Santosh S. Vempala (School of Computer Science, Georgia Institute of Technology)
Dr. Santanu S. Dey (School of Industrial & Systems Engineering, Georgia Institute of Technology)

Abstract
-----------------------------------------------------------

Extended formulations have recently gained a lot of attention due to a result showing that the TSP polytope does not admit an extended formulation of subexponential size. In the first part I will present a new, encoding-independent view of extended formulations that enables a reduction framework for transferring lower bounds on the size of approximate extended formulations between different problems. With this framework we recover inapproximability results previously shown on a case-by-case basis and establish new lower bounds for many problems, e.g., VertexCover, Max-3-SAT, and bounded-degree IndependentSet. This is joint work with Gábor Braun and Sebastian Pokutta.
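
For readers unfamiliar with the terminology, the following is the standard background definition from the extended formulations literature (generic notation, not taken from the proposal itself): an extension of a polytope P in R^n is a polyhedron Q in some R^d together with an affine map pi satisfying pi(Q) = P, and the size of the extension is the number of facets of Q. The extension complexity of P is the smallest such size,

    \[
      \operatorname{xc}(P) \;=\; \min \bigl\{\, \#\mathrm{facets}(Q) \;:\; Q \text{ a polyhedron},\ \pi \text{ affine},\ \pi(Q) = P \,\bigr\},
    \]

and the TSP result mentioned above states that this quantity grows exponentially for the TSP polytope.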
 
With applications such as prediction from expert advice, online shortest path, and portfolio selection, the online optimization model is popular from both a theoretical and an applied perspective. In online convex optimization, online versions of conditional gradient descent (a.k.a. Frank-Wolfe) are widely used because they rely on a linear optimization oracle rather than the computationally expensive projection step required by Online Gradient Descent. In the second part I will present an algorithm that uses a linear augmentation oracle instead of a linear optimization oracle while maintaining the same asymptotic regret bounds. Augmentation oracles are usually much faster than their optimization counterparts, leading to a significant speedup for this type of online algorithm. This is joint work with Gábor Braun and Sebastian Pokutta.
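
To make the oracle contrast concrete, below is a minimal Python sketch of an online conditional-gradient-style update that only ever calls a linear optimization oracle and never computes a projection. It is not the proposal's augmentation-oracle algorithm; the feasible region (the probability simplex), the step-size rule, and all function names are illustrative assumptions.

    import numpy as np

    def linear_optimization_oracle(c, n):
        """Return argmin of <c, x> over the probability simplex: a vertex."""
        v = np.zeros(n)
        v[np.argmin(c)] = 1.0
        return v

    def online_frank_wolfe(gradients, n, step=lambda t: 2.0 / (t + 2)):
        """Online conditional-gradient sketch over the simplex.

        `gradients` is the sequence of per-round loss gradients, observed
        after playing the current point. Each round makes exactly one call
        to the linear optimization oracle; no projection step is needed.
        """
        x = np.full(n, 1.0 / n)        # start at the barycenter of the simplex
        iterates = [x.copy()]
        aggregate = np.zeros(n)        # running sum of observed gradients
        for t, g in enumerate(gradients):
            aggregate += g
            v = linear_optimization_oracle(aggregate, n)   # oracle call (an LP over the feasible region in general)
            x = (1.0 - step(t)) * x + step(t) * v          # move toward the oracle's vertex
            iterates.append(x.copy())
        return iterates

For example, online_frank_wolfe([np.random.randn(5) for _ in range(100)], 5) returns a sequence of feasible points computed without any projection; per the abstract, the proposed algorithm replaces this kind of linear optimization oracle call with a (typically much cheaper) linear augmentation oracle while keeping the same asymptotic regret.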

-----------------------------------------------------------

 

Additional Information

In Campus Calendar
No
Groups

Graduate Studies

Invited Audience
Faculty/Staff
Categories
Other/Miscellaneous
Keywords
PhD proposal
Status
  • Created By: Tatianna Richardson
  • Workflow Status: Draft
  • Created On: Aug 31, 2016 - 9:46am
  • Last Updated: Oct 7, 2016 - 10:19pm