{"625118":{"#nid":"625118","#data":{"type":"event","title":"TRIAD Lecture Series by Yuxin Chen from Princeton (2\/5)","body":[{"value":"\u003Cp\u003EThis is one of a series of talks that are given by Professor Chen. The full list of his talks is as follows:\u003Cbr \/\u003E\r\nWednesday, August 28, 2019; 11:00 am - 12:00 pm; Groseclose 402\u003Cbr \/\u003E\r\nThursday, August 29, 2019; 11:00 am - 12:00 pm; Groseclose 402\u003Cbr \/\u003E\r\nTuesday, September 3, 2019; 11:00 am - 12:00 pm; Main - Executive Education Room 228\u003Cbr \/\u003E\r\nWednesday, September 4, 2019; 11:00 am - 12:00 pm; Main - Executive Education Room 228\u003Cbr \/\u003E\r\nThursday, September 5, 2019; 11:00 am - 12:00 pm; Groseclose 402\u003C\/p\u003E\r\n\r\n\u003Cp\u003ECheck https:\/\/triad.gatech.edu\/events for more information.\u0026nbsp;\u003Cbr \/\u003E\r\nFor location information, please check https:\/\/isye.gatech.edu\/about\/maps-directions\/isye-building-complex\u003C\/p\u003E\r\n\r\n\u003Cp\u003ETitle of this talk: Random initialization and implicit regularization in nonconvex statistical estimation\u003C\/p\u003E\r\n\r\n\u003Cp\u003EAbstract: Recent years have seen a flurry of activities in designing provably efficient nonconvex procedures for solving statistical estimation\/learning problems. Due to the highly nonconvex nature of the empirical loss, state-of-the-art procedures often require suitable initialization and proper regularization (e.g.,\u0026nbsp;trimming, regularized cost, projection) in order to guarantee fast convergence. For vanilla procedures such as\u0026nbsp;gradient descent, however, the prior theory is often either far from optimal or completely lacks theoretical\u003Cbr \/\u003E\r\nguarantees.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThis talk is concerned with a striking phenomenon arising in two nonconvex problems (i.e. 
phase retrieval and matrix completion): even in the absence of careful initialization, proper saddle escaping, and\/or explicit regularization, gradient descent converges to the optimal solution within a logarithmic number of iterations, thus achieving near-optimal statistical and computational guarantees at once. All of this is achieved by exploiting the statistical models in analyzing optimization algorithms, via a leave-one-out approach that enables the decoupling of certain statistical dependencies between the gradient descent iterates and the data. As a byproduct, for noisy matrix completion, we demonstrate that gradient descent achieves near-optimal entrywise error control.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThis is joint work with Cong Ma, Kaizheng Wang, Yuejie Chi, and Jianqing Fan.\u003C\/p\u003E\r\n","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EThis is one of a series of talks given by Professor Chen.
The full list of his talks is as follows:\u003Cbr \/\u003E\r\nWednesday, August 28, 2019; 11:00 am - 12:00 pm; Groseclose 402\u003Cbr \/\u003E\r\nThursday, August 29, 2019; 11:00 am - 12:00 pm; Groseclose 402\u003Cbr \/\u003E\r\nTuesday, September 3, 2019; 11:00 am - 12:00 pm; Main - Executive Education Room 228\u003Cbr \/\u003E\r\nWednesday, September 4, 2019; 11:00 am - 12:00 pm; Main - Executive Education Room 228\u003Cbr \/\u003E\r\nThursday, September 5, 2019; 11:00 am - 12:00 pm; Groseclose 402\u003C\/p\u003E\r\n\r\n\u003Cp\u003ECheck https:\/\/triad.gatech.edu\/events for more information.\u003C\/p\u003E\r\n","format":"limited_html"}],"field_summary_sentence":[{"value":"This is one of a series of talks given by Professor Chen."}],"uid":"34963","created_gmt":"2019-08-25 17:24:25","changed_gmt":"2019-08-29 12:50:46","author":"Xiaoming Huo","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2019-08-29T12:00:00-04:00","event_time_end":"2019-08-29T13:00:00-04:00","event_time_end_last":"2019-08-29T13:00:00-04:00","gmt_time_start":"2019-08-29 16:00:00","gmt_time_end":"2019-08-29 17:00:00","gmt_time_end_last":"2019-08-29 17:00:00","rrule":null,"timezone":"America\/New_York"},"extras":[],"related_links":[{"url":"http:\/\/www.princeton.edu\/~yc5\/slides\/random_init_slides.pdf","title":"Talk Slides at Speaker\u0027s web site"}],"groups":[{"id":"602673","name":"TRIAD "}],"categories":[],"keywords":[{"id":"92811","name":"data science"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1795","name":"Seminar\/Lecture\/Colloquium"}],"invited_audience":[{"id":"78761","name":"Faculty\/Staff"},{"id":"177814","name":"Postdoc"},{"id":"78771","name":"Public"},{"id":"174045","name":"Graduate 
students"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}