ISyE Seminar

TITLE: Convergent subgradient methods for nonsmooth convex minimization

SPEAKER:  Yuri Nesterov

ABSTRACT:

In this talk, we present new subgradient methods for solving nonsmooth
convex optimization problems. These methods are the first for which the
whole sequence of test points is endowed with worst-case performance
guarantees. The methods are derived from a relaxed estimating-sequences
condition and ensure reconstruction of approximate primal-dual optimal
solutions.

Our methods are applicable as efficient real-time stabilization tools for
potential systems with infinite horizon. As an example, we consider a model
of privacy-respecting taxation, where the center has no information on the
utility functions of the agents. Nevertheless, by a proper taxation policy,
the agents can be forced to apply, on average, the socially optimal
strategies. Preliminary numerical experiments confirm the high efficiency
of the new methods.

This is joint work with V. Shikhman (CORE).
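
For readers unfamiliar with subgradient schemes, the following is a minimal
sketch, in Python, of a classical projected subgradient method with iterate
averaging. It is a textbook baseline, not the new methods of the talk; the
l1 objective, the data A and b, and the feasible-ball radius R are
illustrative assumptions.

    import numpy as np

    # Minimal illustrative sketch: a classical projected subgradient method
    # with iterate averaging for a nonsmooth convex problem.  This is a
    # textbook baseline, NOT the methods presented in the talk; the objective
    # f(x) = ||A x - b||_1, the data (A, b), and the ball radius R are
    # assumptions chosen purely for illustration.

    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    b = rng.standard_normal(30)
    R = 10.0  # radius of the Euclidean ball used as the feasible set

    def f(x):
        return np.abs(A @ x - b).sum()

    def subgradient(x):
        # A^T sign(Ax - b) is a valid subgradient of ||Ax - b||_1 at x
        return A.T @ np.sign(A @ x - b)

    def project(x):
        # Euclidean projection onto {x : ||x|| <= R}
        n = np.linalg.norm(x)
        return x if n <= R else (R / n) * x

    x = np.zeros(10)      # current test point
    x_avg = np.zeros(10)  # running average of all test points
    for k in range(1, 2001):
        g = subgradient(x)
        step = R / (np.linalg.norm(g) * np.sqrt(k))  # classical O(1/sqrt(k)) step
        x = project(x - step * g)
        x_avg += (x - x_avg) / k                     # incremental mean of x_1, ..., x_k

    print(f"f at last iterate:     {f(x):.4f}")
    print(f"f at averaged iterate: {f(x_avg):.4f}")

In this classical scheme, the worst-case guarantee applies only to the
averaged (or best-so-far) iterate; the methods of the talk, by contrast,
endow the whole sequence of test points with such guarantees.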
