Foley Award Winner Presentations
It's an annual tradition for Foley Scholar winners to present their research. Niharika Mathur, Mohsin Yousufi, Rachel Lowy, and Joon Kum will present their research projects:
Creating Space for Choice: LLM Supported Decision-Making in Inclusive Higher Education
Abstract: People with intellectual and developmental disabilities (IDD) are often excluded from meaningful decision-making, leading to negative consequences for self-determination and quality of life. Research shows that interventions supporting active, informed decision-making can build causal agency and improve quality of life. In partnership with students, peer tutors, and advising staff at Georgia Tech's Inclusive Post-Secondary Education (IPSE) program, we designed and deployed MyChoice, an LLM-integrated tool that facilitated decision-making and assignment creation for IPSE students. Our design process involved formative interviews with IPSE stakeholders and participatory co-design with IPSE students, culminating in a semester-long deployment in Georgia Tech’s IPSE program. This talk explores our design and deployment, surfacing opportunities for LLM-integrated systems to support personalized, strengths-based decision-making for students with IDD.
Bio: Rachel Lowy is a PhD candidate in Human-Centered Computing at Georgia Tech and is co-leading an OMCS seminar on LLMs. Her research specializes in the design of accessible and inclusive educational technologies for neurodivergent individuals. Rachel uses participatory and co-design methods, drawing on her clinical background as a speech therapist to create inclusive and accessible design experiences. Her current project explores how Large Language Models can support learners with intellectual and developmental disabilities in Inclusive Post-Secondary Higher Education programs. Her previous work on VR design for inclusive work and on inclusive design education has been featured at the CHI and CSCW conferences.
- - - - - - - - - -
Epistemic Breakdowns: A Theory of Friction in Civic Sociotechnical Systems
Abstract: Community-Based Participatory Research (CBPR) is a powerful framework for centering the lived experiences, priorities, and constraints of marginalized and stigmatized communities who are disproportionately affected by sociotechnical harms. CBPR demands a shift from traditional researcher‑driven approaches toward long‑term, equitable partnerships in which communities engage in all stages of the research process. Effectively "doing" CBPR involves a substantial time commitment, often requiring researchers to spend years building the rapport and trust necessary for a mutually beneficial relationship. This collaborative process frequently encounters tensions rooted in mismatches between academic expectations, research ideals, institutional requirements, and community needs. Drawing on three case studies with vulnerable groups, I illustrate the evolving interpersonal, ethical, and sociotechnical challenges researchers encounter and highlight opportunities to co‑create practices that empower underserved populations to design systems for the future.
Bio: Mohsin Yousufi is a civic technologist and currently a PhD candidate in Digital Media at Georgia Tech, working with Dr. Yanni Loukissas. He is also a research engineer at Public AI, where he researches the intersection of civic technology and collective intelligence systems. Mohsin's research focuses on developing innovative knowledge infrastructures that empower communities to pursue effective collective action and self-governance. Trained as an architect in Karachi, Pakistan, he was previously a researcher at Harvard’s Berkman Klein Center, metaLAB, and a Fulbright scholar at the University of Illinois. His work has been presented at both academic and public venues, including the Smithsonian, ACM CSCW, DIS, and NeurIPS.
- - - - - - - - - -
Human-Centered Explainable AI for Aging in Place
Abstract: As AI systems become embedded in everyday life, from smart homes to health support, they increasingly act with more autonomy and often without clearly communicating why. This lack of human-understandable explanations from AI systems can lead to confusion, breakdowns in interaction, and eventually, reduced trust and adoption, particularly from non-technical users. In this talk, I present a human-centered approach to designing AI explanations that support understanding, confidence and actionability in real-world settings. Drawing on years of research with older adults aging in place, I examine how people interpret, question and respond to AI-driven decisions in everyday contexts. I introduce a framework that characterizes explanations based on the data they draw from (such as conversational history, environmental context, task knowledge and system confidence) and show how these influence user perceptions of trust and usefulness. Through both controlled studies and in-situ evaluations of conversational AI systems, this work highlights the importance of designing AI explanations as interactive and context-aware mechanisms that shape how people live with and rely on AI over time.
Bio: Niharika Mathur is a PhD candidate in Human-Centered Computing at Georgia Tech, where she is advised by Sonia Chernova and Elizabeth Mynatt. Her research focuses on designing human-centered explainable AI systems that support understanding and long-term AI use in everyday contexts. She studies how conversational AI systems can communicate their reasoning in ways that are meaningful to non-expert users, with a particular focus on older adults aging in place. Her work has been recognized with a Best Paper Award at ACM ASSETS and has been published at leading venues in HCI including ACM CHI, CSCW, and ASSETS.
- - - - - - - - - -
Virtual Ecological Research Assistant (VERA): Integrating a Metacognitive LLM Agent into an Inquiry-Based Learning Tool for Ecology
Abstract: My research explores the design and development of a pedagogical AI agent within the web-based interactive learning environment Virtual Ecological Research Assistant (VERA), which supports inquiry-based learning through systematic model construction and simulation. In VERA, learners build and simulate conceptual models of ecological systems, investigating complex scientific phenomena. Prior work has focused on cognitive AI coaches that analyze user data to classify user types, predict learning stages, and provide personalized feedback. While these coaches offer insights into user performance, evaluating user models with machine learning approaches did not yield accurate results. Moreover, they often place less emphasis on the broader practices that make science learning meaningful. Inquiry-based learning requires learners to actively engage in scientific practices, integrate crosscutting concepts, and apply domain knowledge through guided exploration and scaffolding. Framing AI as a social agent, my research investigates how pedagogical AI agents can facilitate richer inquiry experiences by supporting students' sensemaking, iterative model refinement, and reflective reasoning. This work is novel in exploring how a metacognitive large language model-based agent can be embedded within a platform like VERA to support inquiry-based science learning. The presentation will share current progress and directions for future work.
Bio: Joon Kum is a second-year master's student in the Human-Computer Interaction program at Georgia Tech. He became interested in how AI can better support teachers and students during his student teaching at a Title I middle school, which led him to pursue graduate study at Georgia Tech. Advised by Dr. Ashok Goel, his research focuses on designing and developing a pedagogical AI agent to support inquiry-based learning in the ecology domain. After graduation, he will begin work as a public school computer science teacher in Gwinnett County, GA, where he wants to bring his knowledge and skills into his future classroom. He ultimately hopes to pursue a Ph.D. in Learning Sciences and build a career as a researcher supporting educators, learners, and public education.
---
IPaT: GVU Lunch Lecture Series
The IPaT: GVU Lunch Lecture Series is free and features guest speakers presenting on topics related to people-centered technologies and their impact on society. Lunch is provided at 12:00 p.m. (while supplies last) and the talks begin at 12:30 p.m. Join us weekly or watch video replays. Most lectures are held in the Centergy One building in Technology Square.
https://research.gatech.edu/ipat/lunch-lectures
Status
- Workflow status: Published
- Created by: Walter Rich
- Created: 03/27/2026
- Modified By: Walter Rich
- Modified: 03/27/2026