ARC and Indo-US Virtual Center Seminar: Prasad Raghavendra (UC Berkeley)

Event Details
  • Date/Time: Monday, April 27, 2020, 11:30 am - 12:30 pm
  • Location: Virtual via BlueJeans

Algorithms & Randomness Center (ARC) and Indo-US Virtual Center Seminar

Prasad Raghavendra (UC Berkeley)

Monday, April 27, 2020

Virtual via BlueJeans - 11:30 am


Title:  List-Decodable Learning via Sum of Squares

Abstract:  In the list-decodable learning setup, an overwhelming majority (a $(1-\beta)$-fraction, say) of the input data consists of outliers, and the goal of an algorithm is to output a small list $\mathcal{L}$ of hypotheses such that at least one of them agrees with the inliers.  We devise list-decodable learning algorithms for linear regression and subspace recovery using the sum-of-squares SDP hierarchy.

 1)  In the list-decodable linear regression problem, we are given labelled examples $\{(X_i,y_i)\}_{i \in [N]}$ containing a subset $S$ of $\beta N$ {\it inliers} $\{X_i \}_{i \in S}$ that are drawn i.i.d. from the standard Gaussian distribution $N(0,I)$ in $\mathbb{R}^d$, and whose labels $y_i$ are well-approximated by a linear function $\hat{\ell}$.
 We devise an algorithm that outputs a list $\mathcal{L}$ of linear functions such that some $\ell \in \mathcal{L}$ is close to $\hat{\ell}$.  This yields the first algorithm for linear regression in the list-decodable setting.  Our results hold for any distribution of examples whose concentration and anti-concentration properties can be certified by low-degree sum-of-squares proofs.

 2)  In the subspace recovery problem, we are given a dataset in which an $\alpha$-fraction (less than half) of the data is distributed uniformly in an unknown $k$-dimensional subspace of $\mathbb{R}^d$, with no additional assumptions on the remaining data; the goal is to recover a short list of $O(\frac{1}{\alpha})$ subspaces, one of which is close to the original subspace.  We give the first polynomial-time algorithm for this list-decodable subspace recovery problem.
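To make the regression setup in (1) concrete, here is a small sketch of the data model and the list-decoding success criterion. All parameter values and the noise model are illustrative assumptions of ours; this only generates an instance, it is not the sum-of-squares algorithm from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
d, N, beta = 5, 1000, 0.2          # dimension, sample count, inlier fraction
n_in = int(beta * N)               # |S| = beta * N inliers

# The unknown linear function \hat{ell} the learner must approximate.
ell_hat = rng.normal(size=d)

# Inliers: X_i drawn i.i.d. from N(0, I_d); their labels are
# well-approximated by <ell_hat, X_i> (small noise is our assumption).
X_in = rng.normal(size=(n_in, d))
y_in = X_in @ ell_hat + 0.01 * rng.normal(size=n_in)

# Outliers: the remaining (1 - beta)-fraction is arbitrary; here they
# follow a different linear function, which already makes a unique
# answer impossible and motivates outputting a list.
X_out = rng.normal(size=(N - n_in, d))
y_out = X_out @ rng.normal(size=d)

X = np.vstack([X_in, X_out])
y = np.concatenate([y_in, y_out])

# Success criterion: some candidate in the output list L is close to
# the true regressor.
def list_succeeds(L, target, eps=0.1):
    return any(np.linalg.norm(ell - target) <= eps for ell in L)
```

A correct list-decodable learner run on `(X, y)` would return a short list containing some vector within `eps` of `ell_hat`, even though the inliers are only a fifth of the data.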
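The setup in (2) can be sketched the same way: an $\alpha$-fraction of points lies in an unknown $k$-dimensional subspace, the rest is unconstrained, and success means some subspace in a short list matches the planted one. The parameters and the Gaussian stand-in for "uniform in the subspace" are our assumptions, and subspace distance via projection matrices is one common choice, not necessarily the talk's.

```python
import numpy as np

rng = np.random.default_rng(1)
d, k, N, alpha = 6, 2, 500, 0.3    # ambient dim, subspace dim, samples, inlier fraction
n_in = int(alpha * N)

# Unknown k-dimensional subspace, represented by an orthonormal basis B.
B, _ = np.linalg.qr(rng.normal(size=(d, k)))

# Inliers: points lying in the subspace (Gaussian coefficients stand in
# for a distribution supported on the subspace).
inliers = rng.normal(size=(n_in, k)) @ B.T

# Outliers: the remaining data, with no assumptions at all.
outliers = rng.normal(size=(N - n_in, d))
data = np.vstack([inliers, outliers])

# Compare subspaces via their orthogonal projection matrices; success
# means some candidate in a list of O(1/alpha) subspaces is close to B.
def proj(U):
    return U @ U.T

def list_succeeds(candidates, target, eps=1e-6):
    return any(np.linalg.norm(proj(U) - proj(target)) <= eps for U in candidates)
```

Since fewer than half the points are inliers, several subspaces can each explain an $\alpha$-fraction of `data`, which is why the output must be a list rather than a single subspace.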

Joint work with Morris Yau.



Invited Audience
Faculty/Staff, Postdoc, Graduate students, Undergraduate students
  • Created By: Francella Tonge
  • Created On: Apr 22, 2020 - 3:47pm
  • Last Updated: Apr 22, 2020 - 4:10pm