{"684329":{"#nid":"684329","#data":{"type":"news","title":"The Algorithm Will See You Now \u2014 But Only If You\u2019re the Perfect Patient","body":[{"value":"\u003Cp\u003EIn the morning, before you even open your eyes, your wearable device has already checked your vitals. By the time you brush your teeth, it has scanned your sleep patterns, flagged a slight irregularity, and adjusted your health plan. As you take your first sip of coffee, it\u2019s already predicted your risks for the week ahead.\u003C\/p\u003E\u003Cp\u003EGeorgia Tech researchers warn that this version of AI healthcare imagines a patient who is \u0022affluent, able-bodied, tech-savvy, and always available.\u0022 Those who don\u2019t fit that mold, they argue, risk becoming invisible in the healthcare system.\u003C\/p\u003E\u003Ch3\u003E\u003Cstrong\u003EThe Ideal Future\u003C\/strong\u003E\u003C\/h3\u003E\u003Cp\u003EIn their study, published in the \u003Ca href=\u0022https:\/\/dl.acm.org\/doi\/10.1145\/3706598.3713118\u0022\u003E\u003Cem\u003EProceedings of the ACM Conference on Human Factors in Computing Systems\u003C\/em\u003E\u003C\/a\u003E, the researchers analyzed 21 AI-driven health tools,\u0026nbsp;ranging from fertility apps and wearable devices to diagnostic platforms and chatbots. They used sociological theory to understand the vision of the future these tools promote \u2014 and the patients they leave out.\u003C\/p\u003E\u003Cp\u003E\u201cThese systems envision care that is seamless, automatic, and always on,\u201d said \u003Ca href=\u0022https:\/\/www.linkedin.com\/in\/catherine-wieczorek-729a3890\/\u0022\u003ECatherine\u0026nbsp;Wieczorek\u003C\/a\u003E, a Ph.D. student in human-centered computing in the \u003Ca href=\u0022https:\/\/www.ic.gatech.edu\/\u0022\u003ESchool of Interactive Computing\u003C\/a\u003E and lead author of the study. 
\u201cBut they also flatten the messy realities of illness, disability, and socioeconomic complexity.\u201d\u003C\/p\u003E\u003Ch3\u003E\u003Cstrong\u003EFour Futures, One Narrow Lens\u003C\/strong\u003E\u003C\/h3\u003E\u003Cp\u003EDuring their analysis, the researchers identified four recurring narratives in AI-powered healthcare:\u003C\/p\u003E\u003Col\u003E\u003Cli\u003E\u003Cstrong\u003ECare that never sleeps.\u003C\/strong\u003E Devices track your heart rate, glucose levels, and fertility signals \u2014 all in real time. You are always being watched, because that\u2019s framed as \u201ccare.\u201d\u003C\/li\u003E\u003Cli\u003E\u003Cstrong\u003EEfficiency as empathy.\u003C\/strong\u003E AI is faster, more objective, and more accurate. Unlike humans, it doesn\u2019t get tired or show bias. This pitch downplays the value of human judgment and connection.\u003C\/li\u003E\u003Cli\u003E\u003Cstrong\u003EPrevention as perfection.\u003C\/strong\u003E A world where illness is avoided through early detection, provided you have the right sensors, the right app, and the right lifestyle.\u003C\/li\u003E\u003Cli\u003E\u003Cstrong\u003EThe optimized body.\u003C\/strong\u003E You\u2019re not just healthy; you\u2019re high-performing. The tech isn\u2019t just treating you; it\u2019s upgrading you.\u003C\/li\u003E\u003C\/ol\u003E\u003Cp\u003E\u201cIt\u2019s like healthcare is becoming a productivity tool,\u201d Wieczorek said. \u201cYou\u2019re not just a patient anymore. You\u2019re a project.\u201d\u003C\/p\u003E\u003Ch3\u003E\u003Cstrong\u003ENot Just a Tool, But a Teammate\u003C\/strong\u003E\u003C\/h3\u003E\u003Cp\u003EThis study also points to a critical transformation in which AI is no longer just a diagnostic tool; it\u2019s a decision-maker. 
Described by the researchers as \u201cboth an agent and a gatekeeper,\u201d AI now plays an active role in how care is delivered.\u003C\/p\u003E\u003Cp\u003EIn some cases, AI systems are even named and personified, like \u003Ca href=\u0022https:\/\/fairtility.com\/chloe\/\u0022\u003EChloe\u003C\/a\u003E, an IVF decision-support tool. \u201cChloe equips clinicians with the power of AI to work better and faster,\u201d its promotional materials state. By framing AI this way \u2014 as a collaborator rather than just software \u2014 these systems subtly redefine who, or what, gets to be treated.\u003C\/p\u003E\u003Cp\u003E\u201cWhen you give AI names, personalities, or decision-making roles, you\u2019re doing more than programming. You\u2019re shifting accountability and agency. That has consequences,\u201d said \u003Ca href=\u0022https:\/\/www.ic.gatech.edu\/people\/shaowen-bardzell\u0022\u003EShaowen Bardzell\u003C\/a\u003E, chair of Georgia Tech\u2019s School of Interactive Computing and co-author of the study.\u003C\/p\u003E\u003Cp\u003E\u201cIt blurs the boundaries,\u201d Wieczorek noted. \u201cWhen AI takes on these roles, it\u2019s reshaping how decisions are made and who holds authority in care.\u201d\u003C\/p\u003E\u003Ch3\u003E\u003Cstrong\u003ECalculated Care\u003C\/strong\u003E\u003C\/h3\u003E\u003Cp\u003EMany AI tools promise early detection, hyper-efficiency, and optimized outcomes. But the study found that these systems risk sidelining patients with chronic illness, disabilities, or complex medical needs \u2014 the very people who rely most on healthcare.\u003C\/p\u003E\u003Cp\u003E\u201cThese technologies are selling worldviews,\u201d Wieczorek explained. 
\u201cThey\u2019re quietly defining who healthcare is for, and who it isn\u2019t.\u201d\u003C\/p\u003E\u003Cp\u003EBy prioritizing predictive algorithms and automation, AI can strip away the context and humanity that real-world care requires.\u003C\/p\u003E\u003Cp\u003E\u201cAlgorithms don\u2019t see nuance. It\u2019s difficult for a model to understand how a patient might be juggling multiple diagnoses or understand what it means to manage illness, while also navigating other important concerns like financial insecurity or caregiving. They are predetermined inputs and outputs,\u201d Wieczorek said. \u201cWhile these systems claim to streamline care, they are also encoding assumptions about who matters and how care should work. And when those assumptions go unchallenged, the most vulnerable patients are often the ones left out.\u201d\u003C\/p\u003E\u003Ch3\u003E\u003Cstrong\u003EAI for ALL\u003C\/strong\u003E\u003C\/h3\u003E\u003Cp\u003EThe researchers argue that future AI systems must be developed in collaboration with those who don\u2019t fit the vision of a \u201cperfect patient.\u201d\u003C\/p\u003E\u003Cp\u003E\u201cInnovation without ethics risks reinforcing existing inequalities. It\u2019s about better tech \u003Cem\u003Eand\u003C\/em\u003E better outcomes for real people,\u201d Bardzell said. \u201cWe\u2019re not anti-innovation. But technological progress isn\u2019t just about what we can do. It\u2019s about what we \u003Cem\u003Eshould\u003C\/em\u003E do \u2014 and for whom.\u201d\u003C\/p\u003E\u003Cp\u003EWieczorek and Bardzell aren\u2019t trying to stop AI from entering healthcare. 
They\u2019re asking AI developers to understand who they\u2019re really serving.\u003C\/p\u003E\u003Cp\u003EFunding:\u003Cbr\u003E\u003Cem\u003EThis work was supported by the National Science Foundation (Grant #2418059).\u003C\/em\u003E\u003C\/p\u003E","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EGeorgia Tech researchers are warning that the future of AI-driven healthcare may look sleek and seamless \u2014 but only for those who fit the mold of an \u201cideal patient.\u201d They found that apps and algorithms consistently imagine users as affluent, able-bodied, and tech-savvy, while sidelining patients with chronic illness, disabilities, or complex lives. These systems promise nonstop monitoring, perfect prevention, and optimized bodies \u2014 turning healthcare into a productivity upgrade rather than a lifeline. By giving AI decision-making power, the industry risks shifting authority away from human care and toward algorithms. 
The researchers argue real innovation isn\u2019t just about efficiency or prediction; it\u2019s about building technologies that serve those most in need, ensuring that progress in healthcare doesn\u2019t leave the most vulnerable patients behind.\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"Georgia Tech researchers warn that AI reshapes healthcare around an \u201cideal user,\u201d overlooking people who need medical intervention the most."}],"uid":"36410","created_gmt":"2025-09-02 14:26:43","changed_gmt":"2025-09-11 11:38:08","author":"mazriel3","boilerplate_text":"","field_publication":"","field_article_url":"","location":"Atlanta, GA","dateline":{"date":"2025-09-02T00:00:00-04:00","iso_date":"2025-09-02T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"677874":{"id":"677874","type":"image","title":"AI_Healthcare_1.png","body":"\u003Cp\u003EAn illustration representing a doctor working with an AI-powered health device.\u003C\/p\u003E","created":"1756821332","gmt_created":"2025-09-02 13:55:32","changed":"1756822519","gmt_changed":"2025-09-02 14:15:19","alt":"A doctor on a computer working with an AI-powered health device","file":{"fid":"261825","name":"AI_Healthcare_1.png","image_path":"\/sites\/default\/files\/2025\/09\/02\/AI_Healthcare_1.png","image_full_path":"http:\/\/hg.gatech.edu\/\/sites\/default\/files\/2025\/09\/02\/AI_Healthcare_1.png","mime":"image\/png","size":2099622,"path_740":"http:\/\/hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/2025\/09\/02\/AI_Healthcare_1.png?itok=bkzB_jGF"}}},"media_ids":["677874"],"groups":[{"id":"1188","name":"Research Horizons"}],"categories":[{"id":"194606","name":"Artificial Intelligence"}],"keywords":[{"id":"187915","name":"go-researchnews"}],"core_research_areas":[{"id":"193655","name":"Artificial Intelligence at Georgia 
Tech"}],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EMichelle Azriel, Sr. Writer-Editor\u003C\/p\u003E","format":"limited_html"}],"email":["mazriel3@gatech.edu"],"slides":[],"orientation":[],"userdata":""}}}