
ARC Colloquium: Jun Xu, Georgia Tech, School of Computer Science


Numerous concentration inequalities and large-deviation techniques have been developed to bound the probability that a sum of (often independent) random variables exceeds a given threshold. They do not, however, solve the worst-case large-deviation problems that arise in the design and evaluation of some network and database algorithms. In this family of problems, the random variables are functions of many parameters, and we need to find the worst probability tail bound (or the worst large-deviation rate function) over all possible parameter settings. Such worst-case large-deviation bounds are very hard to obtain because the parameter space underlying the tail-bound problem is gigantic: although establishing the tail bound for any particular parameter setting is straightforward via the Chernoff technique, enumerating this procedure over the entire parameter space is computationally infeasible.

Our methodology for overcoming this difficulty is a novel combination of convex ordering and (traditional) large-deviation theory. To our surprise, we found that in such systems problems we are often able to find a parameter configuration that dominates all other configurations in the convex order. Since the exponential function is convex, the moment generating function (MGF) of the sum of these random variables under any other parameter setting is dominated by the MGF under the worst-case parameter setting. We can then apply the Chernoff technique to this worst-case MGF to obtain very sharp tail bounds.
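The Chernoff step the abstract refers to can be sketched as follows. This is a minimal illustration, not the speaker's implementation: it assumes the random variables are independent Bernoulli(p_i) coins (a hypothetical parameter setting), computes the product of their MGFs, and minimizes the Chernoff expression e^{-θt} · Π_i MGF_i(θ) over a grid of θ values to bound P(S ≥ t).

```python
import math

def bernoulli_mgf(p, theta):
    # MGF of a Bernoulli(p) variable: E[e^{theta X}] = (1 - p) + p * e^{theta}.
    return 1.0 - p + p * math.exp(theta)

def chernoff_bound(ps, t, thetas):
    # Chernoff technique: for every theta > 0, Markov's inequality applied to
    # e^{theta S} gives P(S >= t) <= e^{-theta t} * prod_i MGF_i(theta).
    # We take the best (smallest) bound over a grid of candidate thetas.
    best = 1.0  # theta = 0 gives the trivial bound of 1
    for theta in thetas:
        mgf_product = 1.0
        for p in ps:
            mgf_product *= bernoulli_mgf(p, theta)
        best = min(best, math.exp(-theta * t) * mgf_product)
    return best

# Hypothetical parameter setting: 100 coins, each with success probability 0.1,
# so the mean of the sum S is 10. Bound the tail probability P(S >= 15).
ps = [0.1] * 100
thetas = [0.01 * k for k in range(1, 300)]
bound = chernoff_bound(ps, t=15, thetas=thetas)
```

The connection to convex ordering is that, because x ↦ e^{θx} is convex for every θ, a configuration that dominates all others in the convex order also dominates their MGFs term by term, so running this computation once on the dominating configuration bounds the tail under every parameter setting.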

Status

  • Workflow Status: Published
  • Created By: Elizabeth Ndongi
  • Created: 11/16/2011
  • Modified By: Fletcher Moore
  • Modified: 10/07/2016
