{"685789":{"#nid":"685789","#data":{"type":"event","title":"ISyE Seminar - Luis Nunes Vicente","body":[{"value":"\u003Cp\u003ETitle: Reducing Sample Complexity in Stochastic Derivative-Free Optimization via Tail Bounds and Hypothesis Testing\u003C\/p\u003E\u003Cp\u003EAbstract:\u003C\/p\u003E\u003Cp\u003EWe introduce and analyze new probabilistic strategies for enforcing sufficient decrease conditions in stochastic derivative-free optimization, with the goal of reducing sample complexity and simplifying convergence analysis. First, we develop a new tail bound condition imposed on the estimated reduction in function value, which permits flexible selection of the power used in the sufficient decrease test, q in (1,2]. This approach allows us to reduce the number of samples per iteration from the standard O(delta^{\u22124}) to O(delta^{-2q}), assuming\u0026nbsp;that the noise\u0026nbsp;moment of order q\/(q-1) is bounded. Second, we formulate the sufficient decrease condition as a sequential hypothesis testing problem, in which the algorithm adaptively collects samples until the evidence suffices to accept or reject a candidate step. This test provides statistical guarantees on decision errors and can further reduce the required sample size, particularly in the Gaussian noise setting, where it can approach\u0026nbsp;O(delta^{\u22122-r})\u0026nbsp;when the decrease is of the order of delta^r. We incorporate both techniques into stochastic direct-search and trust-region methods for potentially non-smooth, noisy objective functions, and establish their global convergence rates and properties.\u003C\/p\u003E\u003Cp\u003EBio:\u003C\/p\u003E\u003Cp\u003ELuis Nunes Vicente is the Timothy J. Wilmott \u201980 Endowed Faculty Professor and Chair of Lehigh University\u2019s Department of Industrial and Systems Engineering (ISE). His research interests include Continuous Optimization, Computational Science and Engineering, and Machine Learning and Data Science. 
He obtained his PhD from Rice University in 1996, under a Fulbright scholarship, receiving the Ralph Budd Thesis Award from Rice. He was one of the three finalists of the 94-96 A. W. Tucker Prize of the Mathematical Optimization Society (MOS). In 2015, he was awarded the Lagrange Prize of SIAM (Society for Industrial and Applied Mathematics) and MOS for co-authoring the book \u201cIntroduction to Derivative-Free Optimization, MPS-SIAM Series on Optimization, SIAM, Philadelphia, 2009\u201d. He is a SIAM Fellow (Class of 2024). He was elected chair of the SIAM Activity Group on Optimization for 2023-2025 and President of the Association of Chairs of Operations Research Departments (ACORD) at INFORMS for 2024-2025. He has been chairing Lehigh ISE since August 2018.\u003C\/p\u003E","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EWe introduce and analyze new probabilistic strategies for enforcing sufficient decrease conditions in stochastic derivative-free optimization, with the goal of reducing sample complexity and simplifying convergence analysis. First, we develop a new tail bound condition imposed on the estimated reduction in function value, which permits flexible selection of the power used in the sufficient decrease test, q in (1,2]. This approach allows us to reduce the number of samples per iteration from the standard O(delta^{-4}) to O(delta^{-2q}), assuming that the noise moment of order q\/(q-1) is bounded. Second, we formulate the sufficient decrease condition as a sequential hypothesis testing problem, in which the algorithm adaptively collects samples until the evidence suffices to accept or reject a candidate step.
This test provides statistical guarantees on decision errors and can further reduce the required sample size, particularly in the Gaussian noise setting, where it can approach O(delta^{-2-r}) when the decrease is of the order of delta^r. We incorporate both techniques into stochastic direct-search and trust-region methods for potentially non-smooth, noisy objective functions, and establish their global convergence rates and properties.\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"Reducing Sample Complexity in Stochastic Derivative-Free Optimization via Tail Bounds and Hypothesis Testing"}],"uid":"36527","created_gmt":"2025-10-17 18:16:36","changed_gmt":"2025-10-17 18:21:02","author":"hulrich6","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2025-11-07T11:00:00-05:00","event_time_end":"2025-11-07T12:00:00-05:00","event_time_end_last":"2025-11-07T12:00:00-05:00","gmt_time_start":"2025-11-07 16:00:00","gmt_time_end":"2025-11-07 17:00:00","gmt_time_end_last":"2025-11-07 17:00:00","rrule":null,"timezone":"America\/New_York"},"location":"Groseclose 402","extras":[],"groups":[{"id":"1242","name":"School of Industrial and Systems Engineering (ISYE)"}],"categories":[],"keywords":[],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1795","name":"Seminar\/Lecture\/Colloquium"}],"invited_audience":[{"id":"78761","name":"Faculty\/Staff"},{"id":"177814","name":"Postdoc"},{"id":"78771","name":"Public"},{"id":"174045","name":"Graduate students"},{"id":"78751","name":"Undergraduate students"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}