{"634657":{"#nid":"634657","#data":{"type":"event","title":"ARC and Indo-US Virtual Center Seminar: Prasad Raghavendra (UC Berkeley)","body":[{"value":"\u003Cp align = \u0022center\u0022\u003E\u003Cstrong\u003EAlgorithms \u0026amp; Randomness Center (ARC) and Indo-US Virtual Center Seminar\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp align = \u0022center\u0022\u003E\u003Cstrong\u003EPrasad Raghavendra (UC Berkeley)\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp align = \u0022center\u0022\u003E\u003Cstrong\u003EMonday, April 27, 2020\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp align = \u0022center\u0022\u003E\u003Cstrong\u003EVirtual via Bluejeans - 11:30 am\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003ETitle:\u0026nbsp; \u003C\/strong\u003EList-Decodable Learning via Sum of Squares\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003EAbstract:\u0026nbsp; \u003C\/strong\u003EIn the list-decodable learning setup, an overwhelming majority (say a $1-\\beta$-fraction) of the input data consists of outliers and the goal of an algorithm is to output a small list $\\mathcal{L}$ of hypotheses such that one of them agrees with inliers. \u0026nbsp; We devise list decodable learning algorithms for the problem of linear regression and subspace recovery using the sum of squares SDP hierarchy.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;1)\u0026nbsp; In the list-decodable linear regression problem, we are given labelled examples $\\{(X_i,y_i)\\}_{i \\in [N]}$ containing a subset $S$ of $\\beta N$ {\\it inliers} $\\{X_i \\}_{i \\in S}$ that are drawn i.i.d. from standard Gaussian distribution $N(0,I)$ in $\\mathbb{R}^d$, where the corresponding labels $y_i$ are well-approximated by a linear function $\\hat{\\ell}$. \u0026nbsp;\u003Cbr \/\u003E\r\n\u0026nbsp;\u003Cbr \/\u003E\r\n\u0026nbsp;We devise an algorithm that outputs a list $\\mathcal{L}$ of linear functions such that there exists some $\\ell \\in \\mathcal{L}$ that is close to $\\hat{\\ell}$.\u0026nbsp;\u0026nbsp;\u0026nbsp; \u0026nbsp;This yields the first algorithm for linear regression in a list-decodable setting.\u0026nbsp; Our results hold for a general distribution of examples whose concentration and anti-concentration properties can be certified by low degree sum-of-squares proofs.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;2) In the subspace recovery problem,\u0026nbsp; given a dataset where an $\\alpha$ fraction (less than half) of the data is distributed uniformly in an unknown $k$ dimensional subspace in $d$ dimensions, and with no additional assumptions on the remaining data, the goal is to recover a succinct list of $O(\\frac{1}{\\alpha})$ subspaces one of which is close to the original subspace.\u0026nbsp; We provide the first polynomial time algorithm for the \u0026#39;list decodable subspace recovery\u0026#39; problem.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EJoint work with Morris Yau.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E----------------------------------\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Ca href=\u0022https:\/\/people.eecs.berkeley.edu\/~prasad\/\u0022\u003ESpeaker\u0026#39;s Webpage\u003C\/a\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cem\u003EVideos of recent talks are available at: \u003C\/em\u003E\u003Ca href=\u0022https:\/\/smartech.gatech.edu\/handle\/1853\/46836\u0022\u003E\u003Cem\u003Ehttps:\/\/smartech.gatech.edu\/handle\/1853\/46836\u003C\/em\u003E\u003C\/a\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Ca href=\u0022https:\/\/mailman.cc.gatech.edu\/mailman\/listinfo\/arc-colloq\u0022\u003E\u003Cem\u003EClick here to subscribe to the seminar email list: arc-colloq@Klauscc.gatech.edu \u003C\/em\u003E\u003C\/a\u003E\u003C\/p\u003E\r\n","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":"","field_summary_sentence":[{"value":"List-Decodable Learning via Sum of Squares - Virtual via Bluejeans at 11:30am"}],"uid":"27544","created_gmt":"2020-04-22 19:47:53","changed_gmt":"2020-04-22 20:10:40","author":"Francella Tonge","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2020-04-27T12:30:00-04:00","event_time_end":"2020-04-27T13:30:00-04:00","event_time_end_last":"2020-04-27T13:30:00-04:00","gmt_time_start":"2020-04-27 16:30:00","gmt_time_end":"2020-04-27 17:30:00","gmt_time_end_last":"2020-04-27 17:30:00","rrule":null,"timezone":"America\/New_York"},"extras":[],"groups":[{"id":"70263","name":"ARC"}],"categories":[],"keywords":[],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1795","name":"Seminar\/Lecture\/Colloquium"}],"invited_audience":[{"id":"78761","name":"Faculty\/Staff"},{"id":"177814","name":"Postdoc"},{"id":"174045","name":"Graduate students"},{"id":"78751","name":"Undergraduate students"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}