{"584419":{"#nid":"584419","#data":{"type":"event","title":"PhD Proposal by Bo Xie","body":[{"value":"\u003Cp\u003E\u003Cstrong\u003ETitle : Algorithms and Analysis for Non-convex Problems in Machine Learning\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003EBo Xie\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003ESchool of Computational Science and Engineering\u003C\/p\u003E\r\n\r\n\u003Cp\u003ECollege of Computing\u003C\/p\u003E\r\n\r\n\u003Cp\u003EGeorgia Institute of Technology\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003EDate : Friday, December 02, 2016\u003C\/p\u003E\r\n\r\n\u003Cp\u003ETime : 10:30 AM to 12:30 PM EST\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003ELocation : KACB 1315\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003ECommittee\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E-------------\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003EDr. Le Song (Advisor), School of Computational Science and Engineering\u003C\/p\u003E\r\n\r\n\u003Cp\u003EDr. Santosh Vempala, School of Computer Science\u003C\/p\u003E\r\n\r\n\u003Cp\u003EDr. Hongyuan Zha, School of Computational Science and Engineering\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003EAbstract\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E-------------\u003C\/p\u003E\r\n\r\n\u003Cp\u003ENon-convex problems are ubiquitous in machine learning, ranging from hidden variable models in graphical model estimation, to matrix factorization in recommender systems, to the recently highly successful deep learning models.
Compared with convex problems, which have provably efficient algorithms, non-convex problems are in general much harder to solve, and most existing algorithms do not have theoretical guarantees on their performance.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn this thesis, we propose efficient algorithms and provide novel analysis for many of the important non-convex problems in machine learning. For hidden variable models, we leverage the special tensor structure and design a nonlinear spectral algorithm that avoids the intractability of non-convex optimization. To scale up such nonlinear spectral decomposition, we also propose a doubly stochastic gradient algorithm as well as a distributed algorithm that can efficiently run on millions of data points. Moreover, we provide formal analysis of the optimization landscape of one-hidden-layer neural networks, where any critical point is a global minimum when the neural weights are diverse enough.\u003C\/p\u003E\r\n","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":"","field_summary_sentence":[{"value":"Algorithms and Analysis for Non-convex Problems in Machine Learning"}],"uid":"27707","created_gmt":"2016-11-30 16:29:35","changed_gmt":"2016-11-30 16:30:19","author":"Tatianna Richardson","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2016-12-02T10:30:00-05:00","event_time_end":"2016-12-02T12:30:00-05:00","event_time_end_last":"2016-12-02T12:30:00-05:00","gmt_time_start":"2016-12-02 15:30:00","gmt_time_end":"2016-12-02 17:30:00","gmt_time_end_last":"2016-12-02 17:30:00","rrule":null,"timezone":"America\/New_York"},"extras":[],"groups":[{"id":"221981","name":"Graduate Studies"}],"categories":[],"keywords":[{"id":"102851","name":"Phd proposal"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1788","name":"Other\/Miscellaneous"}],"invited_audience":[{"id":"78771","name":"Public"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}