{"677852":{"#nid":"677852","#data":{"type":"event","title":"ISYE Statistics Seminar - Dr. Rina Foygel Barber","body":[{"value":"\u003Cdiv\u003E\u003Cstrong\u003EAbstract:\u0026nbsp;\u003C\/strong\u003E\u003C\/div\u003E\u003Cdiv\u003EAlgorithmic stability for regression and classification\u003Cbr\u003E\u003Cbr\u003EIn a supervised learning setting, a model fitting algorithm is unstable if small perturbations to the input (the training data) can often lead to large perturbations in the output (say, predictions returned by the fitted model). Algorithmic stability is a desirable property with many important implications such as generalization and robustness, but testing the stability property empirically is known to be impossible in the setting of complex black-box models. In this work, we establish that bagging any black-box regression algorithm automatically ensures that stability holds, with no assumptions on the algorithm or the data. Furthermore, we construct a new framework for defining stability in the context of classification, and show that using bagging to estimate our uncertainty about the output label will again allow stability guarantees for any black-box model. This work is joint with Jake Soloff and Rebecca Willett.\u003C\/div\u003E\u003Cdiv\u003E\u003Cstrong\u003EBio:\u0026nbsp;\u003C\/strong\u003E\u003C\/div\u003E\u003Cdiv\u003EI am a professor in the \u003Ca href=\u0022http:\/\/www.stat.uchicago.edu\/\u0022\u003EDepartment of Statistics at the University of Chicago\u003C\/a\u003E.
Before starting at U of C, I was an NSF postdoctoral fellow during 2012-13 in the \u003Ca href=\u0022http:\/\/www-stat.stanford.edu\/\u0022\u003EDepartment of Statistics at Stanford University\u003C\/a\u003E, supervised by \u003Ca href=\u0022http:\/\/www-stat.stanford.edu\/~candes\/\u0022\u003EEmmanuel Cand\u00e8s\u003C\/a\u003E. I received my PhD in Statistics at the University of Chicago in 2012, advised by \u003Ca href=\u0022https:\/\/www.math.cit.tum.de\/statistics\/personen\/mathias-drton\/\u0022\u003EMathias Drton\u003C\/a\u003E and \u003Ca href=\u0022http:\/\/ttic.uchicago.edu\/~nati\/\u0022\u003ENati Srebro\u003C\/a\u003E, and an MS in Mathematics at the University of Chicago in 2009. Prior to graduate school, I was a mathematics teacher at the \u003Ca href=\u0022http:\/\/www.parkschool.net\/academics\/upper-school\/program-of-studies\/mathematics\/\u0022\u003EPark School of Baltimore\u003C\/a\u003E from 2005 to 2007, and received an ScB in Mathematics from Brown University in 2005.\u003C\/div\u003E","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EIn a supervised learning setting, a model fitting algorithm is unstable if small perturbations to the input (the training data) can often lead to large perturbations in the output (say, predictions returned by the fitted model). Algorithmic stability is a desirable property with many important implications such as generalization and robustness, but testing the stability property empirically is known to be impossible in the setting of complex black-box models. In this work, we establish that bagging any black-box regression algorithm automatically ensures that stability holds, with no assumptions on the algorithm or the data.
Furthermore, we construct a new framework for defining stability in the context of classification, and show that using bagging to estimate our uncertainty about the output label will again allow stability guarantees for any black-box model. This work is joint with Jake Soloff and Rebecca Willett.\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"Algorithmic stability for regression and classification"}],"uid":"36433","created_gmt":"2024-10-23 13:59:36","changed_gmt":"2024-10-28 13:49:59","author":"mrussell89","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2024-10-29T14:00:00-04:00","event_time_end":"2024-10-29T15:00:00-04:00","event_time_end_last":"2024-10-29T15:00:00-04:00","gmt_time_start":"2024-10-29 18:00:00","gmt_time_end":"2024-10-29 19:00:00","gmt_time_end_last":"2024-10-29 19:00:00","rrule":null,"timezone":"America\/New_York"},"location":"Groseclose 402","extras":[],"groups":[{"id":"1242","name":"School of Industrial and Systems Engineering (ISYE)"}],"categories":[],"keywords":[],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1795","name":"Seminar\/Lecture\/Colloquium"}],"invited_audience":[{"id":"78761","name":"Faculty\/Staff"},{"id":"177814","name":"Postdoc"},{"id":"78771","name":"Public"},{"id":"174045","name":"Graduate students"},{"id":"78751","name":"Undergraduate students"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}