ISyE Seminar

TITLE: Convergent subgradient methods for nonsmooth convex minimization

SPEAKER:  Yuri Nesterov


In this talk, we present new subgradient methods for solving nonsmooth
convex optimization problems. These are the first methods for which the
entire sequence of test points is endowed with worst-case performance
guarantees. The methods are derived from a relaxed estimating-sequences
condition and ensure reconstruction of an approximate primal-dual optimal
solution.
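As background, a minimal sketch of the classical subgradient method with diminishing step sizes follows. This is not the speaker's new method; it is the standard baseline scheme, for which only the best iterate (not the whole sequence of test points) carries a convergence guarantee, which is exactly the gap the talk addresses. The test problem, step-size rule, and function names here are illustrative assumptions.

```python
import math

def subgradient_method(f, subgrad, x0, iters=1000):
    """Classical subgradient scheme x_{k+1} = x_k - t_k g_k with
    diminishing steps t_k = 1/sqrt(k+1); only the best iterate found
    so far is guaranteed to approach the optimal value."""
    x = x_best = x0
    f_best = f(x0)
    for k in range(iters):
        g = subgrad(x)            # any subgradient of f at x
        t = 1.0 / math.sqrt(k + 1)  # diminishing, non-summable steps
        x = x - t * g
        fx = f(x)
        if fx < f_best:           # track the best test point seen
            f_best, x_best = fx, x
    return x_best, f_best

# Illustrative nonsmooth problem: f(x) = |x - 2|, minimized at x = 2.
f = lambda x: abs(x - 2.0)
g = lambda x: 1.0 if x > 2.0 else (-1.0 if x < 2.0 else 0.0)
x_star, f_best = subgradient_method(f, g, x0=10.0)
```

Note that the guarantee above attaches only to `x_best`; individual iterates `x` keep oscillating around the minimizer, which motivates methods whose whole test-point sequence has performance bounds.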

Our methods are applicable as efficient real-time stabilization tools for
potential systems with infinite horizon. As an example, we consider a model
of privacy-respecting taxation, in which the center has no information about
the agents' utility functions. Nevertheless, by a proper taxation policy,
the agents can be forced to apply, on average, the socially optimal
strategies. Preliminary numerical experiments confirm the high efficiency
of the new methods.

This is joint work with V. Shikhman (CORE).


