{"60554":{"#nid":"60554","#data":{"type":"event","title":"CSE Seminar: Yoram Singer","body":[{"value":"\u003Cp\u003ECSE Seminar\u003C\/p\u003E\n\n\u003Cp\u003ESpeaker: Yoram Singer\u003C\/p\u003E\n\n\u003Cp\u003EDate: Friday, Sept. 3, 2010\u003C\/p\u003E\n\n\u003Cp\u003ETime: 2-3 p.m.\u003C\/p\u003E\u003Cp\u003ELocation: KACB 1447\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\n\n\u003Ch5\u003ETitle\u003C\/h5\u003E\u003Cp\u003EComposite Objective Optimization and Learning for Massive Datasets\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\n\n\u003Ch5\u003EAbstract\u003C\/h5\u003E\u003Cp\u003EComposite objective optimization is concerned with the problem of minimizing a two-term objective function consisting of an empirical loss function and a regularization function. Applications with massive datasets often employ a regularization term that is non-differentiable or structured, such as L1 or mixed-norm regularization. Such regularizers promote sparse solutions and special structure in the parameters of the problem, a desirable goal for extremely high-dimensional datasets. In this talk, we discuss several recently developed methods for performing composite objective minimization in the online learning and stochastic optimization settings. We start with a description of extensions of the well-known forward-backward splitting method to stochastic objectives. We then generalize this paradigm to the family of mirror-descent algorithms. Our work builds on recent work that connects proximal minimization to online and stochastic optimization. In the algorithmic part, we focus on a new approach, called AdaGrad, in which the proximal function is adapted throughout the course of the algorithm in a data-dependent manner. This temporal adaptation metaphorically allows us to find needles in haystacks, as the algorithm is able to single out very predictive yet rarely observed features. We conclude with several experiments on large-scale datasets that demonstrate the merits of composite objective optimization and underscore the superior performance of various instantiations of AdaGrad.\u003C\/p\u003E\n\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\n\n\u003Ch5\u003EBio\u003C\/h5\u003E\u003Cp\u003EYoram Singer is a senior research scientist at Google. From 1999 through 2007 he was an associate professor at the Hebrew University of Jerusalem, Israel. He was a member of the technical staff at AT\u0026amp;T Research from 1995 through 1999. He served as an associate editor of the Machine Learning Journal and is now on the editorial boards of the Journal of Machine Learning Research and IEEE Signal Processing Magazine. He was the co-chair of COLT\u002704 and NIPS\u002707. He is an AAAI Fellow and has won several awards for his research papers, most recently the 10-year retrospective award for the most influential paper of ICML 2000.\u003C\/p\u003E","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":"","field_summary_sentence":[{"value":"\u0022Composite Objective Optimization and Learning for Massive Datasets\u0022"}],"uid":"27174","created_gmt":"2010-08-24 13:10:46","changed_gmt":"2016-10-08 01:52:11","author":"Mike Terrazas","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2010-09-03T15:00:00-04:00","event_time_end":"2010-09-03T16:00:00-04:00","event_time_end_last":"2010-09-03T16:00:00-04:00","gmt_time_start":"2010-09-03 19:00:00","gmt_time_end":"2010-09-03 20:00:00","gmt_time_end_last":"2010-09-03 20:00:00","rrule":null,"timezone":"America\/New_York"},"extras":[],"groups":[{"id":"47223","name":"College of Computing"},{"id":"50877","name":"School of Computational Science and Engineering"}],"categories":[],"keywords":[],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1795","name":"Seminar\/Lecture\/Colloquium"}],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EFor more information, contact \u003Ca href=\u0022mailto:lebanon@cc.gatech.edu\u0022\u003EGuy Lebanon\u003C\/a\u003E.\u003C\/p\u003E","format":"limited_html"}],"email":[],"slides":[],"orientation":[],"userdata":""}}}
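The abstract's core idea — adapting the step size per coordinate in a data-dependent way so that rarely observed but predictive features are not drowned out — can be illustrated with a minimal sketch of the diagonal AdaGrad update. This is an editorial illustration, not material from the event record: function and parameter names (`adagrad_step`, `lr`, `eps`) are illustrative, and it shows only the unregularized gradient step; the composite-objective variants discussed in the talk would follow it with a proximal step (e.g. soft-thresholding for L1).

```python
import numpy as np

def adagrad_step(w, grad, G, lr=0.1, eps=1e-8):
    """One diagonal-AdaGrad step (illustrative sketch).

    G accumulates squared gradients per coordinate; dividing the step
    by sqrt(G) gives rarely updated coordinates (rare features) a
    larger effective learning rate than frequently updated ones.
    """
    G = G + grad ** 2                       # data-dependent accumulator
    w = w - lr * grad / (np.sqrt(G) + eps)  # per-coordinate adaptive step
    return w, G
```

On a toy objective such as 0.5 * ||w||^2 (whose gradient is w itself), repeatedly calling `adagrad_step` drives the iterate toward the minimum while the effective step size shrinks as the accumulator grows.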