{"689724":{"#nid":"689724","#data":{"type":"event","title":"PhD Defense by Mingyu Guan","body":[{"value":"\u003Cp\u003E\u003Cstrong\u003ETitle:\u003C\/strong\u003E\u0026nbsp; Scalable and Verifiable Foundations for Graph Learning\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EDate:\u003C\/strong\u003E\u0026nbsp;Thursday, April 23, 2026\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003ETime:\u003C\/strong\u003E\u0026nbsp;10:00 AM \u2013 12:00 PM (Eastern Time)\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003ELocation (Virtual):\u003C\/strong\u003E\u0026nbsp;\u003Ca href=\u0022https:\/\/gatech.zoom.us\/j\/95898773125\u0022\u003Ehttps:\/\/gatech.zoom.us\/j\/95898773125\u003C\/a\u003E\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EMingyu Guan\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EPh.D. Student\u003C\/p\u003E\u003Cp\u003ESchool of Computer\u0026nbsp;Science\u003C\/p\u003E\u003Cp\u003EGeorgia Institute of Technology\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003ECommittee Members\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EDr. Taesoo Kim (Advisor), School of Cybersecurity and Privacy, Georgia Institute of Technology\u003C\/p\u003E\u003Cp\u003EDr. Anand Iyer (Co-advisor), School of Computer Science, Georgia Institute of Technology\u003C\/p\u003E\u003Cp\u003EDr. Ada Gavrilovska, School of Computer Science, Georgia Institute of Technology\u003C\/p\u003E\u003Cp\u003EDr. Kexin Rong, School of Computer Science, Georgia Institute of Technology\u003C\/p\u003E\u003Cp\u003EDr. 
Jay Stokes, Microsoft Research\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EAbstract\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EGraph neural networks (GNNs) are increasingly deployed in high-stakes domains such as fraud detection, traffic prediction, and social network analysis. However, real-world graphs are heterogeneous, dynamic, and privacy-critical, posing challenges in structural complexity, scalability, and verifiability across the model lifecycle. This thesis argues that graph learning problems contain inherent structural regularities that, when explicitly leveraged in model and system design, enable scalable and verifiable graph learning across critical stages of the model lifecycle. \u003Cem\u003EFirst\u003C\/em\u003E, we present \u003Cem\u003EHetTree\u003C\/em\u003E, a scalable heterogeneous GNN that exploits the natural tree hierarchy among metapaths by constructing a semantic tree representation and introducing a subtree attention mechanism to capture hierarchical relationships with low computation and memory overhead. \u003Cem\u003ESecond\u003C\/em\u003E, we present \u003Cem\u003EReD\u003C\/em\u003E, a system for scalable dynamic GNN training that leverages independence across snapshot sequences to enable sequence-parallel training without cross-machine communication, supported by sequence-first mini-batching and a two-level cache store. \u003Cem\u003EThird\u003C\/em\u003E, we present \u003Cem\u003ETAITEE\u003C\/em\u003E, a system that establishes verifiable training provenance by integrating training recording with Confidential Computing, recording comprehensive provenance via standard training APIs and generating cryptographically signed training certificates from Trusted Execution Environments with minimal overhead. 
Together, the three contributions span the critical graph learning lifecycle: what to compute, how to compute it efficiently, and whether the training was conducted as claimed \u2014\u0026nbsp;providing scalable and verifiable foundations for deploying graph learning models in practice.\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EScalable and Verifiable Foundations for Graph Learning\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"Scalable and Verifiable Foundations for Graph Learning"}],"uid":"27707","created_gmt":"2026-04-13 18:33:07","changed_gmt":"2026-04-13 18:33:37","author":"Tatianna Richardson","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2026-04-23T10:00:00-04:00","event_time_end":"2026-04-23T12:00:00-04:00","event_time_end_last":"2026-04-23T12:00:00-04:00","gmt_time_start":"2026-04-23 14:00:00","gmt_time_end":"2026-04-23 16:00:00","gmt_time_end_last":"2026-04-23 16:00:00","rrule":null,"timezone":"America\/New_York"},"location":"VIRTUAL","extras":[],"groups":[{"id":"221981","name":"Graduate Studies"}],"categories":[],"keywords":[{"id":"100811","name":"Phd Defense"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1788","name":"Other\/Miscellaneous"}],"invited_audience":[{"id":"78771","name":"Public"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}