
PhD Proposal by Kayla Uleah Evans

Ignorance by Design: How Institutional Infrastructures Produce and Foreclose Decision Recognition in Interdisciplinary AI Research

 

Kayla Uleah Evans

Ph.D. Student in Human-Centered Computing

School of Interactive Computing

Georgia Institute of Technology

 

Date: Monday, April 6, 2026

Time: 9:00-10:00 AM (public portion) | 9:00 AM-12:00 PM (full)

Location: Zoom & TSRB 217B

Zoom instructions: https://gatech.zoom.us/j/93196219384?pwd=1CoV2ENEam6GE8z0H70bIPfkfvFzgi.1

Room directions: exit the elevator, turn left, and walk toward the glass-walled room on your right

Abstract:

 

Academic AI research has not undergone the infrastructural redesign that other fields developed in response to crises of accountability (e.g., IRBs in medicine, environmental impact assessments in policy, disclosure requirements in finance). For AI research, the primary institutional response has been funding interdisciplinarity: assembling computer scientists with ethicists, social scientists, and domain experts. Yet no empirical measure exists to evaluate whether these collaborations produce genuine knowledge integration, and existing evidence suggests that many instead exhibit subordination-service dynamics, wherein non-technical expertise is consulted only after core decisions have already been made.

 

This dissertation poses a fundamental question: do researchers even recognize when a decision is being made? Drawing on science and technology studies, infraethics, affordance theory, distributed cognition, and agnotology, I introduce decision recognition: an emergent property of collaboration wherein at least two courses of action become simultaneously visible, viable, and voiceable within a team. When institutional infrastructures — funding timelines, publication norms, disciplinary hierarchies — foreclose these conditions, the result is institutionalized ignorance: patterned ignorance produced through ordinary institutional operations rather than deliberate suppression.

 

The proposed work builds on five completed studies across multiple collaborations: theoretical work originating from a 2024 advisory collaboration with NIST's ARIA program; decision-tracing interviews with NLP researchers; an analysis of ambiguous costs across 15 LLM projects; situated ethnography of a complex data audit; and a 14-month research-through-design collaboration with Chayn, a global gender-based violence prevention tech nonprofit. Two additional studies extend this work: structured card-based co-design sessions with researcher dyads to surface perceptual asymmetries between collaborators, and the development of a standalone protocol enabling research teams to conduct infrastructural self-examination independently.

 

The dissertation offers three contributions: a conceptual vocabulary for analyzing pre-deliberative scoping in collaborative AI research; empirical evidence that infrastructure produces systematic perceptual asymmetries even among collaborators on identical projects; and a protocol operationalizing this framework for independent use within research teams. This work reframes responsible AI from evaluating decisions already rendered to designing conditions wherein decisions become recognizable.

Committee Members:

Dr. Betsy DiSalvo (Advisor) is a professor in Georgia Tech’s School of Interactive Computing and director of the Culture and Technology Lab. Her work examines informal learning, computing education, and the social dimensions of technology.

 

Dr. Christopher J. MacLellan is an associate professor in Georgia Tech’s School of Interactive Computing. His research spans human-computer interaction, AI, and machine learning.

 

Dr. André Brock is an associate professor in Georgia Tech’s School of Literature, Media, and Communication. His scholarship focuses on race, digital media, Black culture online, and critical studies of technology.

 

Dr. Ding Wang is a senior researcher at Google Research and affiliated with Georgia Tech’s School of Interactive Computing. Her work explores human-computer interaction, AI systems, and the labor and infrastructure behind data-driven technologies.

 

Dr. Cassidy Sugimoto is the Tom and Marie Patton Professor and School Chair in Georgia Tech’s Jimmy and Rosalynn Carter School of Public Policy. Her research examines how knowledge is produced, shared, and evaluated, with particular attention to equity and the scientific workforce.

 

Dr. Ben Waber is President & Co-Founder of Humanyze and a researcher affiliated with MIT Media Lab and Ritsumeikan University. His work centers on organizational behavior, people analytics, and the dynamics of collaboration in workplaces.

 

Link to Proposal Document
