{"689256":{"#nid":"689256","#data":{"type":"news","title":"New Study Shows Explainability is a Must for Older Adults to Trust AI","body":[{"value":"\u003Cp\u003EVoice-activated, conversational artificial intelligence (AI) agents must provide clear explanations for their suggestions, or older adults aren\u2019t likely to trust them.\u003C\/p\u003E\u003Cp\u003EThat\u2019s one of the main findings from a study by AI Caring on what older adults expect from explainable AI (XAI).\u003C\/p\u003E\u003Cp\u003E\u003Ca href=\u0022https:\/\/ai-caring.org\/\u0022\u003E\u003Cstrong\u003EAI Caring\u003C\/strong\u003E\u003C\/a\u003E is one of three AI institutes led by Georgia Tech and funded by the National Science Foundation (NSF). The institute supports AI research that benefits older adults and their caregivers.\u003C\/p\u003E\u003Cp\u003ENiharika Mathur, a Ph.D. candidate in the School of Interactive Computing, was the lead author of a paper based on the study. The paper will be presented in April at the \u003Ca href=\u0022https:\/\/chi2026.acm.org\/\u0022\u003E\u003Cstrong\u003E2026 ACM Conference on Human Factors in Computing Systems (CHI) in Barcelona\u003C\/strong\u003E\u003C\/a\u003E.\u003C\/p\u003E\u003Cp\u003EMathur worked with the \u003Ca href=\u0022https:\/\/empowerment.emory.edu\/\u0022\u003E\u003Cstrong\u003ECognitive Empowerment Program at Emory University\u003C\/strong\u003E\u003C\/a\u003E to interview 23 older adults who live alone and use voice-activated AI assistants like Amazon\u2019s Alexa and Google Home.\u003C\/p\u003E\u003Cp\u003EMany of them told her they feel excluded from the design of these products.\u003C\/p\u003E\u003Cp\u003E\u201cThe assumption is that all people want interactions the same way and across all kinds of situations, but that isn\u2019t true,\u201d Mathur said. 
\u201cHow older people use AI and what they want from it are different from what younger people prefer.\u201d\u003C\/p\u003E\u003Cp\u003EOne example she gave is that young people tend to be informal when talking with AI. Older people, on the other hand, talk to the agent like they would a person.\u003C\/p\u003E\u003Cp\u003E\u201cIf older adults are talking to their family members about Alexa, they usually refer to Alexa as \u2018she\u2019 instead of \u2018it,\u2019\u201d Mathur said. \u201cThey tend to humanize these systems a lot more than young people.\u201d\u003C\/p\u003E\u003Ch4\u003E\u003Cstrong\u003EGood Explanations\u003C\/strong\u003E\u003C\/h4\u003E\u003Cp\u003EThe study evaluated AI explanations that drew information from four sources of data:\u003C\/p\u003E\u003Cul\u003E\u003Cli\u003EUser history (past conversations with the agent)\u003C\/li\u003E\u003Cli\u003EEnvironmental data (indoor temperature or the weather forecast)\u003C\/li\u003E\u003Cli\u003EActivity data (how much time a user spends in different areas of the home)\u003C\/li\u003E\u003Cli\u003EInternal reasoning (mathematical probabilities and likely outcomes)\u003C\/li\u003E\u003C\/ul\u003E\u003Cp\u003EMathur said older users trust the agent more when it bases its explanations on data from the first three sources. However, internal reasoning creates skepticism.\u003C\/p\u003E\u003Cp\u003EInternal reasoning comes into play when the AI doesn\u2019t have enough data from the other sources to give an explanation. Instead, it provides a percentage reflecting its confidence based on what it knows.\u003C\/p\u003E\u003Cp\u003E\u201cThe overwhelming response was negative toward confidence scores,\u201d Mathur said. 
\u201cIf the AI says it\u2019s 92% confident, older adults want to know what that\u2019s based on.\u201d\u003C\/p\u003E\u003Cp\u003EMathur said this is another example of generational differences in preferences.\u003C\/p\u003E\u003Cp\u003E\u201cThere\u2019s a lot of explainable AI research that shows younger people like to see numbers in explanations, and they also tend to rely too much on explanations that contain numerical confidence. Older adults are the opposite. It makes them trust it less.\u201d\u003C\/p\u003E\u003Ch4\u003E\u003Cstrong\u003EKnowing the Context\u003C\/strong\u003E\u003C\/h4\u003E\u003Cp\u003EMathur discovered that in urgent situations, older users prefer the AI to be straightforward, while in casual settings, they desire more conversation.\u003C\/p\u003E\u003Cp\u003E\u201cHow people interact with technological systems is grounded in what the stakes of the situation are,\u201d she said. \u201cIf it had anything to do with their immediate sense of safety, they did not want conversational elaboration. They want the AI to be very direct and factual.\u201d\u003C\/p\u003E\u003Ch4\u003E\u003Cstrong\u003ENot Just Checking Boxes\u003C\/strong\u003E\u003C\/h4\u003E\u003Cp\u003EMathur said AI agents that interact with older adults are ideally constructed with a dual purpose. 
They should provide companionship and autonomy for the users while alleviating the burden of caretaking that is often placed on their family members.\u0026nbsp;\u003C\/p\u003E\u003Cp\u003ESome studies have shown that engineers have tended to favor caretakers in the design of these tools. They prioritize daily tasks and routines, leaving some older adults to feel like they are a box to be checked.\u003C\/p\u003E\u003Cp\u003E\u201cThey\u2019re not being thought of as consumers,\u201d Mathur said. \u201cA lot of products are being made for them but not with them.\u201d\u003C\/p\u003E\u003Cp\u003EShe also said psychological well-being is one of the most important outcomes these tools should produce.\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EShowing older adults that they are listened to can significantly help in gaining their trust. Some interviewees told Mathur they want agents that are deliberate about understanding their preferences and don\u2019t dismiss their questions.\u003C\/p\u003E\u003Cp\u003EMeeting these needs makes older adults less likely to resist the technology or come into conflict with family members.\u003C\/p\u003E\u003Cp\u003E\u201cIt highlights just how important well-designed explanations are,\u201d she said. \u201cWe must go beyond a transparency checklist.\u201d\u003C\/p\u003E","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EAn AI Caring study led by Georgia Tech researchers shows that older adults are more likely to trust conversational AI systems that clearly explain their decision-making. 
The study also shows that involving older adults more in the design process benefits their well-being and reduces the caretaking burden on family members.\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"A Georgia Tech study finds older adults are more likely to trust voice-activated AI systems when those systems clearly explain how and why they make decisions."}],"uid":"36530","created_gmt":"2026-03-31 14:01:07","changed_gmt":"2026-03-31 14:04:59","author":"Nathan Deen","boilerplate_text":"","field_publication":"","field_article_url":"","location":"Atlanta, GA","dateline":{"date":"2026-03-31T00:00:00-04:00","iso_date":"2026-03-31T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"679796":{"id":"679796","type":"image","title":"0A6A0355.jpg","body":null,"created":"1774965687","gmt_created":"2026-03-31 14:01:27","changed":"1774965687","gmt_changed":"2026-03-31 14:01:27","alt":"An older couple sitting on a couch as a man helps them use Amazon\u0027s Alexa","file":{"fid":"263999","name":"0A6A0355.jpg","image_path":"\/sites\/default\/files\/2026\/03\/31\/0A6A0355.jpg","image_full_path":"http:\/\/hg.gatech.edu\/\/sites\/default\/files\/2026\/03\/31\/0A6A0355.jpg","mime":"image\/jpeg","size":171883,"path_740":"http:\/\/hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/2026\/03\/31\/0A6A0355.jpg?itok=t62aVqXD"}}},"media_ids":["679796"],"groups":[{"id":"47223","name":"College of Computing"},{"id":"1188","name":"Research Horizons"},{"id":"50876","name":"School of Interactive Computing"}],"categories":[{"id":"194606","name":"Artificial Intelligence"},{"id":"153","name":"Computer Science\/Information Technology and Security"},{"id":"135","name":"Research"}],"keywords":[{"id":"192863","name":"go-ai"},{"id":"187915","name":"go-researchnews"},{"id":"9153","name":"Research Horizons"},{"id":"193860","name":"Artificial Intelligence"},{"id":"187812","name":"artificial intelligence (AI)"},{"id":"14342","name":"older 
adults"},{"id":"148721","name":"Amazon Alexa"}],"core_research_areas":[{"id":"193655","name":"Artificial Intelligence at Georgia Tech"},{"id":"39501","name":"People and Technology"}],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}