{"70316":{"#nid":"70316","#data":{"type":"event","title":"First-order Methods for Smooth Convex Optimization with Inexact Oracle: Classical, Fast and Intermediate Gradient Methods","body":[{"value":"\u003Cp\u003ETITLE: First-order Methods for Smooth Convex Optimization with Inexact Oracle: Classical, Fast and Intermediate Gradient Methods\u003C\/p\u003E\u003Cp\u003ESPEAKER:\u0026nbsp; Olivier Devolder,\u0026nbsp;CORE, Universit\u00e9 catholique de Louvain\u003C\/p\u003E\u003Cp\u003E\u003Cbr \/\u003E\u003C\/p\u003E\u003Cp\u003EABSTRACT:\u003C\/p\u003E\u003Cp\u003EIn this talk, we analyze the effect on first-order methods of smooth convex optimization (classical and fast gradient methods) if only inexact first-order\ninformation is available. We introduce a new notion of approximate first-order oracle.\u003C\/p\u003E\u003Cp\u003E\n\nFor each method, we develop complexity results and study the link between the desired accuracy on the objective function and the needed accuracy for the\noracle. We obtain that in this inexact case, the superiority of the fast gradient method over the classical one is no more so clear. The optimal scheme suffers from an\naccumulation of errors contrarily to the classical gradient method and the choice between these two kinds of methods depends on the complexity of the computation of\nthe inexact first-order informations.\n\n\u003C\/p\u003E\u003Cp\u003EWe prove that this phenomenon of error accumulation is an intrinsic property of any first-order method with optimal convergence rate. More precisely, we show that there is a clear link between the rate of convergence of a first-order method and the lowest possible rate of errors accumulation that we can expect.\n\n\u003C\/p\u003E\u003Cp\u003EMotivated by this result, we develop a whole family of first-order methods with intermediate rates of convergence and intermediate rates of error accumulation between the classical gradient and the fast gradient methods. 
We present applications of our results to the smoothing technique, the augmented Lagrangian method, the Moreau-Yosida regularization, and to non-smooth convex problems.

This is joint work with François Glineur and Yurii Nesterov.

DATE: Tuesday, November 1, 2011, 12:00-1:00 PM (Eastern)
HOST: School of Industrial and Systems Engineering (ISYE)
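To close with a toy numerical illustration of the error-accumulation phenomenon (entirely our own sketch, not the speaker's code): the script below minimizes a convex quadratic with gradients corrupted by additive noise standing in for oracle inexactness (the talk's oracle model is deterministic; the Gaussian noise and the noise level delta are our simplifications), comparing the classical gradient method with the standard Nesterov fast gradient method.

    import numpy as np

    # Toy sketch: minimize f(x) = 0.5 * x^T A x with noisy gradients and
    # compare the classical and fast gradient methods. Step sizes and the
    # k/(k+3) momentum rule are the usual textbook choices.
    rng = np.random.default_rng(0)
    n = 50
    A = np.diag(np.linspace(0.01, 1.0, n))  # eigenvalues in (0, 1]
    L = 1.0        # Lipschitz constant of the gradient
    delta = 1e-3   # assumed oracle error level

    def noisy_grad(x):
        """Inexact first-order oracle: true gradient plus small noise."""
        return A @ x + delta * rng.standard_normal(n)

    x0 = np.ones(n)

    # Classical gradient method: x_{k+1} = x_k - (1/L) g(x_k)
    x = x0.copy()
    for k in range(2000):
        x = x - noisy_grad(x) / L
    print("classical:", 0.5 * x @ A @ x)

    # Fast gradient method (Nesterov momentum)
    x, y = x0.copy(), x0.copy()
    for k in range(2000):
        x_new = y - noisy_grad(y) / L
        y = x_new + (k / (k + 3)) * (x_new - x)
        x = x_new
    print("fast:     ", 0.5 * x @ A @ x)

Depending on the noise level and the iteration horizon, the fast method typically stalls at a higher error floor than the classical method, consistent with the accumulation behavior described in the abstract.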