{"681732":{"#nid":"681732","#data":{"type":"event","title":"PhD Defene by Yinghao Li","body":[{"value":"\u003Cp\u003E\u003Cstrong\u003ETitle: Uncertainty-Aware and Data-Efficient Fine-Tuning and Application of Foundation Models\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EDate: April 18th, 2025\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003ETime: 1.00 PM \u2014 2.30 PM, EDT\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003ELocation: CODA C1103 Lindberg; also on Zoom: \u003Ca href=\u0022https:\/\/gatech.zoom.us\/j\/94101453773\u0022\u003Ehttps:\/\/gatech.zoom.us\/j\/94101453773\u003C\/a\u003E\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EYinghao Li\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EMachine Learning PhD Student\u003C\/p\u003E\u003Cp\u003ESchool of Electrical and Computer Engineering\u003Cbr\u003EGeorgia Institute of Technology\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003ECommittee\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003E1. Dr. Chao Zhang (CSE, Advisor)\u003C\/p\u003E\u003Cp\u003E2. Dr. Rampi Ramprasad (MSE, Co-Advisor)\u003C\/p\u003E\u003Cp\u003E3. Dr. Tuo Zhao (ISYE)\u003C\/p\u003E\u003Cp\u003E4. Dr. Srijan Kumar (CSE \u0026amp; Lighthouz AI)\u003C\/p\u003E\u003Cp\u003E5. Dr. Victor Fung (CSE)\u003C\/p\u003E\u003Cp\u003E6. Dr. Ali Torkamani (Amazon Web Services)\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EAbstract\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EPre\u2011trained foundation models power modern natural language processing and scientific workflows, yet their deployment is hindered by training-inference distribution shifts, scarce in\u2011domain labels, model hallucination, and poorly calibrated model confidence. 
This thesis tackles these obstacles along two fronts: 1)\u202freliable uncertainty quantification (UQ) and 2)\u202fdata\u2011efficient model learning. For reliability, we establish MUBen, a best\u2011practice UQ benchmark for molecular property prediction, and propose UQAC, which backtracks attention chains to approximate the intractable marginalization over the reasoning space, yielding better\u2011calibrated answer probabilities from large language models (LLMs). To boost data efficiency, we introduce CHMM and its sparse variant for weakly supervised named entity recognition, and devise G\u0026amp;O, a zero\u2011shot information extraction framework that harnesses LLM reasoning. We further present ELREA, a fine\u2011tuning strategy that clusters input instructions by gradient direction to train task\u2011specific LoRA experts and ensembles those experts at inference time, improving generalization without requiring additional training data. Together, these contributions enhance the trustworthiness, robustness, and adaptability of foundation models in high\u2011stakes real\u2011world settings.\u003C\/p\u003E","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003Esee below\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"Uncertainty-Aware and Data-Efficient Fine-Tuning and Application of Foundation Models"}],"uid":"27707","created_gmt":"2025-04-10 20:00:12","changed_gmt":"2025-04-10 20:00:58","author":"Tatianna Richardson","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2025-04-18T13:00:00-04:00","event_time_end":"2025-04-18T15:00:00-04:00","event_time_end_last":"2025-04-18T15:00:00-04:00","gmt_time_start":"2025-04-18 17:00:00","gmt_time_end":"2025-04-18 19:00:00","gmt_time_end_last":"2025-04-18 
19:00:00","rrule":null,"timezone":"America\/New_York"},"location":"CODA C1103 Lindberg; also on Zoom: ","extras":[],"groups":[{"id":"221981","name":"Graduate Studies"}],"categories":[],"keywords":[{"id":"100811","name":"Phd Defense"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1788","name":"Other\/Miscellaneous"}],"invited_audience":[{"id":"78771","name":"Public"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}