Visiting Lecture Series: Jacob Abernethy, “Learning, Optimization, and the Benefits of Noise”


In essentially every real-world learning and estimation problem, whether we are dealing with Big Data or Little Data, we must confront the same challenge: overfitting. One approach to avoiding overfitting is to ensure that your learning algorithm is sufficiently noisy, for example by injecting additional randomness into the procedure. The idea that you should perturb your data appears in various forms across a number of different fields, including Optimization, Economics, and Statistics.
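As a minimal illustration of the noise-injection idea (a sketch for this announcement, not material from the talk itself): convolving a nonsmooth function with Gaussian noise yields a smooth surrogate. For f(x) = |x|, the smoothed value E[|x + σZ|] with Z ~ N(0, 1) has a known closed form, so a short Monte Carlo estimate can be checked against it.

```python
import math
import random

def smoothed_abs(x, sigma, n=100_000, seed=0):
    """Monte Carlo estimate of E[|x + sigma*Z|], Z ~ N(0, 1)."""
    rng = random.Random(seed)
    total = sum(abs(x + sigma * rng.gauss(0, 1)) for _ in range(n))
    return total / n

def smoothed_abs_exact(x, sigma):
    """Closed form of the Gaussian-smoothed absolute value:
    sigma*sqrt(2/pi)*exp(-x^2/(2 sigma^2)) + x*erf(x/(sigma*sqrt(2)))."""
    return (sigma * math.sqrt(2.0 / math.pi)
            * math.exp(-x * x / (2.0 * sigma * sigma))
            + x * math.erf(x / (sigma * math.sqrt(2.0))))
```

Unlike |x|, the smoothed function is differentiable everywhere, which is one reason perturbation is useful in optimization.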

In this talk, we will expand upon this idea, bringing out surprising connections between the pricing of options, stochastic smoothing techniques, and choice models. We will finish by discussing a new result that connects a popular randomized optimization procedure, known as simulated annealing, with the deterministic technique known as “Path-following Interior Point Methods”. This establishes a strong and surprising relationship between the seminal works of Lovász & Vempala and Nesterov & Nemirovski.
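For readers unfamiliar with simulated annealing, here is a minimal textbook-style sketch (illustrative only; the talk's result concerns its connection to interior-point methods, not this toy implementation). The method proposes random moves and accepts uphill steps with a probability that shrinks as a "temperature" parameter cools:

```python
import math
import random

def simulated_annealing(f, x0, steps=20_000, step_size=0.5, seed=0):
    """Minimize a 1-D function f by Metropolis-style simulated annealing
    with a slowly decreasing 1/log(t) cooling schedule."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    for t in range(1, steps + 1):
        temp = 1.0 / math.log(t + 1)             # temperature decays slowly
        cand = x + step_size * rng.gauss(0, 1)   # random Gaussian proposal
        fc = f(cand)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-(fc - fx) / temp).
        if fc <= fx or rng.random() < math.exp(-(fc - fx) / temp):
            x, fx = cand, fc
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```

For example, `simulated_annealing(lambda x: (x - 3.0) ** 2, x0=-10.0)` wanders from the starting point toward the minimizer at x = 3. The injected randomness lets the procedure escape poor regions that a purely greedy descent would get stuck near.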



Jacob Abernethy is an assistant professor in the Electrical Engineering and Computer Science Department at the University of Michigan, Ann Arbor. He received his Ph.D. in Computer Science from the University of California, Berkeley, and was a Simons postdoctoral fellow at the University of Pennsylvania. Abernethy's primary interest is in Machine Learning, and he likes discovering connections between Optimization, Statistics, and Economics.


