<nodes> <node id="689973">  <title><![CDATA[Cybersecurity School Takes Home Multiple Awards]]></title>  <uid>36253</uid>  <body><![CDATA[<p>Seven members of the <a href="https://scp.cc.gatech.edu/">School of Cybersecurity and Privacy</a> (SCP) community were recognized for their leadership and excellence on Monday afternoon at the 35th Annual College of Computing Awards Ceremony.</p><p>“I am pleased to be able to recognize all of this hard work,” said Dean <strong>Vivek Sarkar</strong> during the ceremony.</p><p>One student, two staff members, and four faculty members were nominated by their SCP peers and received awards for their achievements over the past year.&nbsp;</p><h2>Student Solves Real World Problems</h2><p><strong>Yibin Yang</strong> (Ph.D. CS 2025) was awarded a 2025 Dissertation Award for his thesis on applying zero-knowledge proofs to real-world problems. SCP Professor and Senior Associate Chair <strong>Vlad Kolesnikov</strong> advised Yang and noted that his work advances the field of cryptography.&nbsp;</p><p>Yang contributed to the advancement of zero-knowledge proofs and multi-party computation, while also building toolchains that are faster and more usable than existing systems. His work earned a <a href="https://www.cc.gatech.edu/news/cryptographic-research-receives-distinguished-paper-award-acm-ccs-23">distinguished paper award</a> at the 2023 ACM CCS, and he also served as an RSAC Security Scholar.</p><h2>Staff Lead the Way</h2><p>In the staff category, <strong>Mary Helen Hayes</strong> was awarded the Outstanding Staff Leadership Award, and <strong>Regina Anderson</strong> received the Ruthie Book Outstanding Staff Team Member Award.</p><p>The Outstanding Staff Leadership Award is given to a full-time administrative staff member in recognition of an outstanding record of leadership that has resulted in a significant positive impact on the College of Computing, the Institute, or the computing community. 
Hayes was nominated for this award by four faculty and staff members, who cited her steady presence in SCP since she began her role as director of research operations in 2024.&nbsp;</p><p>The Ruthie Book Outstanding Staff Team Member Award, named for Ruthie Book, who exemplified excellence in her work, is presented to a staff member in recognition of outstanding performance. Anderson was nominated by SCP faculty and staff for her outstanding leadership and mentorship as assistant director of business operations.</p><p>Both received praise for their hard work from the college as well as from their supervisor, Senior Academic Officer <strong>Jan Morian</strong>.&nbsp;</p><p>“I am so incredibly proud of our staff in the School of Cybersecurity and Privacy who won awards this year at the College of Computing Annual Awards ceremony,” she said.</p><p>“Mary Helen Hayes and Regina Anderson are truly outstanding staff members who exemplify Georgia Tech’s values. Their leadership has contributed substantially to the success of the school.”</p><h2>Cybersecurity Faculty Net Four Awards</h2><p>The College of Computing also recognized four SCP faculty members for excellence in teaching and research during the college’s annual award ceremony.&nbsp;</p><p>Assistant Professor <a href="https://scp.cc.gatech.edu/external-news/new-faculty-wants-secure-ai-wild"><strong>Teodora Baluta</strong></a> received the Junior Faculty Teaching Award for developing a new graduate-level course that brought together generative artificial intelligence (AI) security, adversarial machine learning, cryptography, and differential privacy. 
Her nominator, SCP Associate Professor <strong>Vassilis Zikas</strong>, said the course bridged a critical gap in a rapidly evolving area of computing.&nbsp;</p><p>For his role in leading <a href="https://team-atlanta.github.io/">Team Atlanta</a> to victory in the <a href="https://www.cc.gatech.edu/news/georgia-tech-makes-history-wins-darpa-challenge">DARPA AI Cyber Challenge</a>, Professor&nbsp;<strong>Taesoo Kim</strong> received the Outstanding Senior Faculty Research Award. His nominator, Regents Professor <strong>Wenke Lee</strong>, praised the team’s performance, which not only won the competition but also beat the combined score of all other competitors. The AI developed by Team Atlanta is now open sourced with the <a href="https://www.cc.gatech.edu/news/competition-community-how-team-atlantas-ai-cybersecurity-breakthrough-going-open-source">Open Source Security Foundation</a>.&nbsp;</p><p>Associate Professor <strong>Frank Li</strong> received the Junior Faculty Research Award for establishing the world-class <a href="https://faculty.cc.gatech.edu/~frankli/beeslab.html">BEES Lab</a> research group at Georgia Tech. One of his nominators, Associate Professor <strong>Saman Zonouz</strong>, put Li’s name forward for his work empirically evaluating and improving internet security and privacy from an operational standpoint.&nbsp;</p><p>Finally, Associate Professor&nbsp;<strong>Brendan Saltaformaggio</strong> received&nbsp;the Mid-Career Faculty Research Award. Zikas nominated him for establishing an internationally recognized research program in cybersecurity forensics, malware analysis, AI security, and software supply chain security. 
Saltaformaggio’s research highlights include the discovery of over <a href="https://www.cc.gatech.edu/news/follow-money-2-billion-crypto-scams-found-ethereum">$2 billion in stolen funds</a> on the Ethereum blockchain.&nbsp;</p><p>“We know SCP faculty conduct highly impactful research that is of the highest quality,” said SCP Interim Chair <strong>Mustaque Ahamad</strong>. “Our faculty receiving research awards at all levels recognizes this and shows how we are working to realize SCP’s vision of creating security for everyone and everything.”</p>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1776965425</created>  <gmt_created>2026-04-23 17:30:25</gmt_created>  <changed>1777038858</changed>  <gmt_changed>2026-04-24 13:54:18</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Seven members of the School of Cybersecurity and Privacy (SCP) community were recognized for their leadership and excellence on Monday afternoon at the 35th Annual College of Computing Awards Ceremony.]]></teaser>  <type>news</type>  <sentence><![CDATA[Seven members of the School of Cybersecurity and Privacy (SCP) community were recognized for their leadership and excellence on Monday afternoon at the 35th Annual College of Computing Awards Ceremony.]]></sentence>  <summary><![CDATA[<p>Seven members of the <a href="https://scp.cc.gatech.edu/">School of Cybersecurity and Privacy</a> (SCP) community were recognized for their leadership and excellence on Monday afternoon at the 35th Annual College of Computing Awards Ceremony.</p>]]></summary>  <dateline>2026-04-23T00:00:00-04:00</dateline>  <iso_dateline>2026-04-23T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-04-23 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jpopham3@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Popham</p><p>Communications Officer II at the School of Cybersecurity and 
Privacy</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>680047</item>          <item>680046</item>          <item>680057</item>          <item>680056</item>          <item>680053</item>          <item>680055</item>          <item>680054</item>      </media>  <hg_media>          <item>          <nid>680047</nid>          <type>image</type>          <title><![CDATA[CoC-Awards-Spring-webcopy.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[CoC-Awards-Spring-webcopy.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/23/CoC-Awards-Spring-webcopy.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/23/CoC-Awards-Spring-webcopy.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/23/CoC-Awards-Spring-webcopy.jpg?itok=kXGzObq4]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A photo of a group of people]]></image_alt>                    <created>1776965449</created>          <gmt_created>2026-04-23 17:30:49</gmt_created>          <changed>1776965449</changed>          <gmt_changed>2026-04-23 17:30:49</gmt_changed>      </item>          <item>          <nid>680046</nid>          <type>image</type>          <title><![CDATA[Teodora-CoC-Awards-Spring-2026_MG_0187.jpg]]></title>          <body><![CDATA[<p><em>Assistant Professor Teodora Baluta receiving the Junior Faculty Teaching Award. 
Photos by Terence Rushin/College of Computing</em></p>]]></body>                      <image_name><![CDATA[CoC-Awards-Spring-2026_MG_0187.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/23/CoC-Awards-Spring-2026_MG_0187.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/23/CoC-Awards-Spring-2026_MG_0187.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/23/CoC-Awards-Spring-2026_MG_0187.jpg?itok=HwHbZond]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A woman accepting a certificate.]]></image_alt>                    <created>1776965449</created>          <gmt_created>2026-04-23 17:30:49</gmt_created>          <changed>1777037484</changed>          <gmt_changed>2026-04-24 13:31:24</gmt_changed>      </item>          <item>          <nid>680057</nid>          <type>image</type>          <title><![CDATA[Gina-CoC-Awards-Spring-2026_86A0051-1-.jpg]]></title>          <body><![CDATA[<p><em>College of Computing Dean Vivek Sarkar (left) stands with Assistant Director of Business Operations Regina Anderson, recipient of the Ruthie Book Outstanding Staff Team Member Award. 
Photos by Terence Rushin/College of Computing</em></p>]]></body>                      <image_name><![CDATA[CoC-Awards-Spring-2026_86A0051-1-.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0051-1-.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0051-1-.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0051-1-.jpg?itok=CuHQt47L]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A man and a woman shake hands in front of a step and repeat banner. The woman is holding a certificate.]]></image_alt>                    <created>1777035510</created>          <gmt_created>2026-04-24 12:58:30</gmt_created>          <changed>1777035510</changed>          <gmt_changed>2026-04-24 12:58:30</gmt_changed>      </item>          <item>          <nid>680056</nid>          <type>image</type>          <title><![CDATA[Mary Helen-CoC-Awards-Spring-2026_86A0049.jpg]]></title>          <body><![CDATA[<p><em>College of Computing Dean Vivek Sarkar (left) stands with Director of Research Operations Mary Helen Hayes, recipient of the Outstanding Staff Leadership Award. 
Photos by Terence Rushin/College of Computing</em></p>]]></body>                      <image_name><![CDATA[CoC-Awards-Spring-2026_86A0049.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0049.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0049.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0049.jpg?itok=2sQ1PlFY]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A man and a woman shake hands in front of a step and repeat banner. The woman is holding a certificate.]]></image_alt>                    <created>1777035510</created>          <gmt_created>2026-04-24 12:58:30</gmt_created>          <changed>1777035510</changed>          <gmt_changed>2026-04-24 12:58:30</gmt_changed>      </item>          <item>          <nid>680053</nid>          <type>image</type>          <title><![CDATA[Taeosoo-CoC-Awards-Spring-2026_86A0026.jpg]]></title>          <body><![CDATA[<p><em>College of Computing Dean Vivek Sarkar (left) stands with Professor Taesoo Kim, recipient of the Outstanding Senior Faculty Research Award. 
Photos by Terence Rushin/College of Computing</em></p>]]></body>                      <image_name><![CDATA[CoC-Awards-Spring-2026_86A0026.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0026.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0026.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0026.jpg?itok=tvj-uV7I]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Two men shaking hands and standing in front of a step and repeat banner]]></image_alt>                    <created>1777035510</created>          <gmt_created>2026-04-24 12:58:30</gmt_created>          <changed>1777035510</changed>          <gmt_changed>2026-04-24 12:58:30</gmt_changed>      </item>          <item>          <nid>680055</nid>          <type>image</type>          <title><![CDATA[Frank-CoC-Awards-Spring-2026_86A0029.jpg]]></title>          <body><![CDATA[<p><em>College of Computing Dean Vivek Sarkar (left) stands with Associate Professor Frank Li, recipient of the Junior Faculty Research Award. 
Photos by Terence Rushin/College of Computing</em></p>]]></body>                      <image_name><![CDATA[CoC-Awards-Spring-2026_86A0029.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0029.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0029.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0029.jpg?itok=Ht7HqSM4]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Two men shaking hands. One is holding a certificate. They are standing in front of a step and repeat banner.]]></image_alt>                    <created>1777035510</created>          <gmt_created>2026-04-24 12:58:30</gmt_created>          <changed>1777035510</changed>          <gmt_changed>2026-04-24 12:58:30</gmt_changed>      </item>          <item>          <nid>680054</nid>          <type>image</type>          <title><![CDATA[Brendan-CoC-Awards-Spring-2026_86A0027.jpg]]></title>          <body><![CDATA[<p><em>College of Computing Dean Vivek Sarkar (left) stands with Associate Professor Brendan Saltaformaggio, recipient of the Mid-Career Faculty Research Award. 
Photos by Terence Rushin/College of Computing</em></p>]]></body>                      <image_name><![CDATA[CoC-Awards-Spring-2026_86A0027.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0027.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0027.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/24/CoC-Awards-Spring-2026_86A0027.jpg?itok=9bfLy9Kb]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Two men shaking hands and holding a certificate.]]></image_alt>                    <created>1777035510</created>          <gmt_created>2026-04-24 12:58:30</gmt_created>          <changed>1777035510</changed>          <gmt_changed>2026-04-24 12:58:30</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="145171"><![CDATA[Cybersecurity]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689931">  <title><![CDATA[From Competition to Community: How Team Atlanta’s AI Cybersecurity Breakthrough Is Going Open Source]]></title>  <uid>36253</uid>  <body><![CDATA[<p>When <a href="https://team-atlanta.github.io/">Team Atlanta</a> 
claimed first place in the <a href="https://www.cc.gatech.edu/news/georgia-tech-makes-history-wins-darpa-challenge">DARPA AI Cyber Challenge</a> last year, they weren’t just celebrating a win—they were demonstrating that artificial intelligence (AI) could autonomously detect and patch software vulnerabilities at a scale once considered impossible.</p><p>Now, the team is working with the Linux Foundation and the <a href="https://openssf.org/">Open Source Security Foundation</a> (OpenSSF) to ensure that its breakthrough doesn’t remain confined to a competition environment. The team’s new initiative, <a href="https://openssf.org/projects/oss-crs/">OSS-CRS</a>, aims to standardize and operationalize cyber reasoning systems (CRSs) for real-world use.</p><p>“The AI Cyber Challenge pushed the boundaries of autonomous software security, with seven teams developing systems capable of finding and remediating vulnerabilities at scale,” said <strong>Andrew Chin</strong>, a Georgia Tech Ph.D. student and lead on the OSS-CRS program.&nbsp;</p><p>“However, after the competition’s conclusion, it has been difficult to apply these advancements to the open-source community due to infrastructure incompatibilities and the lack of long-term maintenance for the open-sourced CRS implementations.”</p><p>To address this gap, Georgia Tech’s <a href="https://gts3.org/">Systems Software Lab</a> (SSLab), directed by Professor <strong>Taesoo Kim</strong>, is leading the development of OSS-CRS, which provides both a common framework for CRS development and the infrastructure needed to deploy these systems seamlessly across open-source projects.</p><p>As part of this effort, the team has ported its competition-winning system, Atlantis, into the OSS-CRS framework. The move makes it compatible with laptops and other everyday machines with flexible resource and budget configurations.</p><p>Interoperability is also central to the framework’s design. 
Atlantis can be combined with other CRSs to improve performance, including systems developed by fellow AI Cyber Challenge (AIxCC) finalists and newer agentic, command-line-based tools. This modular approach reflects a key lesson the team learned from the competition: collaboration between systems can outperform any single solution.</p><p>OSS-CRS has been accepted as a <a href="https://github.com/ossf/oss-crs">sandbox project</a> within OpenSSF’s AI/ML Security Working Group, a milestone that brings added technical guidance and community support to the project. This includes:</p><ul><li>Access to mentorship</li><li>Dedicated working group meetings</li><li>Broader visibility through industry events, publications, and outreach efforts</li></ul><p>The collaboration will also foster stronger connections with open-source maintainers, helping streamline vulnerability disclosure and remediation workflows.</p>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1776792511</created>  <gmt_created>2026-04-21 17:28:31</gmt_created>  <changed>1776880203</changed>  <gmt_changed>2026-04-22 17:50:03</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Team Atlanta, winners of the DARPA AI Cyber Challenge, are turning their competition-winning AI cybersecurity system into a real-world tool for the open-source community.]]></teaser>  <type>news</type>  <sentence><![CDATA[Team Atlanta, winners of the DARPA AI Cyber Challenge, are turning their competition-winning AI cybersecurity system into a real-world tool for the open-source community.]]></sentence>  <summary><![CDATA[<p>Team Atlanta, winners of the DARPA AI Cyber Challenge, are turning their competition-winning AI cybersecurity system into a real-world tool for the open-source community. In partnership with the Linux Foundation and the Open Source Security Foundation, the team has launched OSS-CRS, a framework designed to standardize and deploy autonomous cyber reasoning systems at scale. 
By open sourcing their technology and enabling collaboration between multiple AI systems, the initiative aims to make it easier to detect and fix software vulnerabilities—strengthening the security of critical open-source infrastructure worldwide.</p>]]></summary>  <dateline>2026-04-21T00:00:00-04:00</dateline>  <iso_dateline>2026-04-21T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-04-21 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jpopham3@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Popham</p><p>Communications Officer II at the School of Cybersecurity and Privacy</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>680033</item>      </media>  <hg_media>          <item>          <nid>680033</nid>          <type>image</type>          <title><![CDATA[AIxCC-2025-27-web-copy.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[AIxCC-2025-27-web-copy.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/22/AIxCC-2025-27-web-copy.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/22/AIxCC-2025-27-web-copy.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/22/AIxCC-2025-27-web-copy.jpg?itok=ZHAVVebl]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A group of people standing inside of a convention hall. 
]]></image_alt>                    <created>1776880174</created>          <gmt_created>2026-04-22 17:49:34</gmt_created>          <changed>1776880174</changed>          <gmt_changed>2026-04-22 17:49:34</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="145171"><![CDATA[Cybersecurity]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689945">  <title><![CDATA[Zoo Atlanta Elephants Embrace New GT-Designed Interactive Enrichment Wall]]></title>  <uid>36530</uid>  <body><![CDATA[<p>Titan, Msholo, Kelly, and Tara are just like any other African elephants — intelligent creatures that require mental stimulation in their everyday lives.</p><p>They would normally get this in their natural habitats while foraging for food and staying alert to predators that might target calves.</p><p>However,&nbsp;<a href="https://zooatlanta.org/animal/african-elephant/">the four elephants reside at Zoo Atlanta</a>, so they don’t have to worry about these things.</p><p>That’s why zoo caretakers are always on the lookout for better ways to help their elephants exercise their brains.</p><p>The caretakers at Zoo 
Atlanta found one when they met&nbsp;<a href="https://www.ariannamastali.org/"><strong>Arianna Mastali</strong></a>, a Ph.D. student in Georgia Tech’s School of Interactive Computing. Mastali designed an audio enrichment wall to help stimulate Zoo Atlanta’s elephants.</p><p>Many zoos build concrete enrichment walls to foster elephant problem-solving and critical thinking. The walls usually have holes for the elephants to reach through with their trunks as they search for food, treats, or playful objects on the other side.</p><p>Mastali enhanced Zoo Atlanta’s enrichment wall by adding an interactive audio component. A nearby speaker system emits distinctive low-frequency tones when an elephant sticks its trunk into a hole.</p><p>“They’re intelligent creatures that require a lot of complexity in their habitat,” Mastali said. “We wanted to add to that complexity while giving them more control.”</p><h4><strong>Experimenting in the Wild</strong></h4><p>Mastali’s system uses cameras and computer vision to detect when an elephant’s trunk is inside a hole and then sends a signal to the speakers to play a sound.</p><p>Mastali is a member of the&nbsp;<a href="https://animalab.cc.gatech.edu/">Georgia Tech Animal Lab</a>, directed by School of IC professor&nbsp;<a href="https://www.cc.gatech.edu/people/melody-jackson"><strong>Melody Jackson</strong></a>. The lab often uses sensing technology to enhance animal wellness.</p><p>Mastali said she tried incorporating sensing devices into her project several times. She constructed an insert made of PVC pipe and attached a sensor to its base that used infrared beams to detect the elephant’s trunk.</p><p>However, she said it was difficult to account for the elephants’ strength. 
Their trunks would break the insert after a day or two.&nbsp;</p><p>She pivoted toward computer vision to remove the risk of damage and keep the enrichment wall as close to natural as possible.&nbsp;</p><p>“A big lesson we learned was that using existing materials the elephants are already familiar with was the best way to do things, and it simplified our design process,” she said.</p><p><strong>Shane Rosse</strong>, a student in Georgia Tech’s&nbsp;<a href="https://omscs.gatech.edu/">Online Master of Science in Computer Science</a> (OMSCS) program, assisted Mastali with the computer vision component.</p><h4><strong>Enhancing Environmental Enrichment</strong></h4><p>Mastali observed the elephants’ behavior at the wall seven days before and seven days after the installation of the audio enrichment system.</p><p>The number of times the elephants approached the wall after installation increased by 176%, and time spent at the wall increased by 71%.</p><p>“We weren’t sure at first if they would care that much, so it was great to see how much time they spent at the wall, especially our less dominant females,” said Kirby Miller, senior elephant caretaker at Zoo Atlanta. “They seem to like it the most.”</p><p>Miller said the elephants used to only approach the wall when they knew there was food behind it. That started to change after the audio enrichment system was installed.</p><p>“We would be off somewhere else, and we’d hear the speaker playing the sounds, and we knew there wasn’t any food back there,” Miller said. “Tara had her trunk in one of the holes, just listening to the sound. That let us know they do like it, and they’re very curious about it.”</p><p>Miller said because elephants have sharp memories and acute senses of hearing and smell, their habitats must be designed with that in mind.</p><p>Zoo Atlanta’s African Savanna elephant habitat was redesigned in 2019. 
In addition to the enrichment wall, it includes a bathing pond, two waterfalls, and swing boom devices that hold hay for elephants to eat as they would in the wild.</p><p>Miller said elephants sheltered at any zoo or conservation center would benefit from enrichment devices enhanced by technology.</p><p>“I think anything they can participate in that gives them choice and control is great for all zoo elephants,” she said. “It depends on the elephants, but with our elephants, they can hear much higher frequencies than we can. That noise isn’t that loud for us, but for them, they’re feeling that noise, and they can hear much more, which makes it more stimulating for them.”</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1776867653</created>  <gmt_created>2026-04-22 14:20:53</gmt_created>  <changed>1776869055</changed>  <gmt_changed>2026-04-22 14:44:15</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech is working with Zoo Atlanta to design an audio enrichment wall for African elephants.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech is working with Zoo Atlanta to design an audio enrichment wall for African elephants.]]></sentence>  <summary><![CDATA[<p>Georgia Tech Ph.D. student Arianna Mastali designed an interactive audio enrichment wall for Zoo Atlanta's four African elephants. 
A speaker system plays low-frequency tones when an elephant inserts its trunk into one of the wall's holes, detected by computer vision.</p>]]></summary>  <dateline>2026-04-22T00:00:00-04:00</dateline>  <iso_dateline>2026-04-22T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-04-22 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>680026</item>          <item>680027</item>          <item>680028</item>          <item>680029</item>          <item>680030</item>      </media>  <hg_media>          <item>          <nid>680026</nid>          <type>image</type>          <title><![CDATA[DSC_2500.jpeg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[DSC_2500.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/22/DSC_2500.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/22/DSC_2500.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/22/DSC_2500.jpeg?itok=5-YVH9XZ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Arianna Mastali at Zoo Atlanta, with an African elephant in the background.]]></image_alt>                    <created>1776867679</created>          <gmt_created>2026-04-22 14:21:19</gmt_created>          <changed>1776867679</changed>          <gmt_changed>2026-04-22 14:21:19</gmt_changed>      </item>          <item>          <nid>680027</nid>          <type>image</type>          <title><![CDATA[DSC_0455.jpeg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[DSC_0455.jpeg]]></image_name>            
<image_path><![CDATA[/sites/default/files/2026/04/22/DSC_0455.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/22/DSC_0455.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/22/DSC_0455.jpeg?itok=x1g1Dtqb]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Elephant at Zoo Atlanta sticks its trunk into a hole in the enrichment wall]]></image_alt>                    <created>1776867787</created>          <gmt_created>2026-04-22 14:23:07</gmt_created>          <changed>1776867787</changed>          <gmt_changed>2026-04-22 14:23:07</gmt_changed>      </item>          <item>          <nid>680028</nid>          <type>image</type>          <title><![CDATA[DSC_0522.jpeg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[DSC_0522.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/22/DSC_0522.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/22/DSC_0522.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/22/DSC_0522.jpeg?itok=1e2bpRw9]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Elephant uses its trunk to grab hay that is suspended in the air]]></image_alt>                    <created>1776867847</created>          <gmt_created>2026-04-22 14:24:07</gmt_created>          <changed>1776867847</changed>          <gmt_changed>2026-04-22 14:24:07</gmt_changed>      </item>          <item>          <nid>680029</nid>          <type>image</type>          <title><![CDATA[DSC_0500.jpeg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[DSC_0500.jpeg]]></image_name>            
<image_path><![CDATA[/sites/default/files/2026/04/22/DSC_0500.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/22/DSC_0500.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/22/DSC_0500.jpeg?itok=Z70wlkuE]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Zoo Atlanta visitors walk past the elephant exhibit with an elephant in the background]]></image_alt>                    <created>1776867908</created>          <gmt_created>2026-04-22 14:25:08</gmt_created>          <changed>1776867908</changed>          <gmt_changed>2026-04-22 14:25:08</gmt_changed>      </item>          <item>          <nid>680030</nid>          <type>video</type>          <title><![CDATA[Play That Trunk Music: Elephant Enrichment x Computer Science]]></title>          <body><![CDATA[<p>Elephants require mental stimulation in their everyday lives, which is why Zoo Atlanta redesigned its African Savanna habitat, home to four African elephants, in 2019. The habitat includes an elephant enrichment wall that has numerous holes for elephants to stick their trunks into as they search for food on the other side.</p><p>The elephant enrichment wall at Zoo Atlanta recently received an upgrade thanks to a Georgia Tech Ph.D. student. Arianna Mastali designed an audio enrichment system that uses computer vision to detect when an elephant sticks its trunk into the enrichment wall as it searches for food. The system then sends a signal to play a unique tone from a nearby speaker that corresponds to each hole. 
So far, Mastali has found that elephant wall interactions have increased by 176%, and the elephants are visiting the wall even when there isn't food behind it.</p>]]></body>                      <youtube_id><![CDATA[ANlIAhp4YTs]]></youtube_id>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>            <vimeo_id><![CDATA[]]></vimeo_id>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>            <video_url><![CDATA[https://www.youtube.com/watch?v=ANlIAhp4YTs]]></video_url>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>                    <created>1776868980</created>          <gmt_created>2026-04-22 14:43:00</gmt_created>          <changed>1776868980</changed>          <gmt_changed>2026-04-22 14:43:00</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="42901"><![CDATA[Community]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="42901"><![CDATA[Community]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      </news_terms>  <keywords>          <keyword tid="188776"><![CDATA[go-research]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="6765"><![CDATA[zoo 
atlanta]]></keyword>          <keyword tid="174264"><![CDATA[elephants]]></keyword>          <keyword tid="3237"><![CDATA[enrichment]]></keyword>          <keyword tid="104701"><![CDATA[animal computer interaction lab]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689932">  <title><![CDATA[Vision AI Models Improve Decision Making in Manufacturing, Energy, and Finance]]></title>  <uid>36319</uid>  <body><![CDATA[<p>Generative artificial intelligence (AI) is best known for creating images and text. Now, it is helping industries make better planning decisions.</p><p>Georgia Tech researchers have created a new AI model for decision-focused learning (DFL), called Diffusion-DFL. Recent tests showed it makes more accurate decisions than current approaches.</p><p>Along with optimizing industrial output, Diffusion-DFL lowers costs and reduces risk. Experiments also showed it performs across different fields.&nbsp;</p><p><a href="https://arxiv.org/abs/2510.11590"><strong>Diffusion-DFL</strong></a> doesn’t just surpass current methods; it also predicts more accurately as problem sizes grow. The model requires less computing power despite these high-performance marks, making it more accessible to smaller enterprises.</p><p>Diffusion-DFL runs on diffusion models, the same technology that powers DALL-E and other AI image generators. It is the first DFL framework based on diffusion models.</p><p>“Anyone who makes high-stakes decisions under uncertainty, including supply chain managers, energy operators, and financial planners, benefits from Diffusion-DFL,” said&nbsp;<a href="https://www.zihaozhao.site/"><strong>Zihao Zhao</strong></a>, a Georgia Tech Ph.D. 
student who led the project.&nbsp;</p><p>“Instead of optimizing around a single forecast, the model evaluates many possible scenarios, so decisions account for real-world risk and become more robust.”</p><p>[<a href="https://sites.gatech.edu/research/iclr-2026/"><strong>Related: GT @ ICLR 2026</strong></a>]</p><p>To test Diffusion-DFL, the team ran experiments based on real-world settings, including:</p><ul><li>Factory manufacturing to meet product demand</li><li>Power grid scheduling to meet energy demand</li><li>Stock market portfolio optimization</li></ul><p>In each case, Diffusion-DFL made more accurate decisions than current methods. It also performed better as problems became larger and more complex. These results confirm the model’s ability to make important decisions in real-world scenarios with noisy data and uncertainty.</p><p>The experiments also show that Diffusion-DFL is practical, not just accurate. Training diffusion models is expensive, so the team developed a way to reduce memory use. This cut the memory cost of training by more than 99.7%. As a result, Diffusion-DFL can reach more researchers and practitioners.</p><p>“Our score-function estimator cuts GPU memory from over 60 gigabytes to 0.13 with almost no loss in decision quality, reducing the requirement for massive computing resources,” Zhao said. “I hope this expands Diffusion-DFL into other domains, like healthcare, where decisions must be made quickly under complex uncertainty.”</p><p>Beyond decision-making applications, Diffusion-DFL marks a shift in DFL techniques and in the broader use of generative AI models.&nbsp;</p><p>In supply chain management, planners estimate future demand before deciding how much product to stock. 
In this DFL problem, engineers align machine learning models with predetermined decision objectives, like minimizing risk or reducing costs.&nbsp;</p><p>One flaw of current DFL methods is that they optimize around a single, deterministic prediction of an uncertain future.</p><p>Diffusion-DFL takes a different approach. Instead of making a single guess, it generates a range of possible outcomes. This leads to decisions based on many likely scenarios, rather than on a single assumed future.</p><p>To do this, the framework uses diffusion models. These generative AI models produce high-quality data such as images, text, and audio.&nbsp;</p><p>The forward diffusion process gradually adds noise to data until it becomes pure noise. Models trained on this process learn to reverse it: starting from noise, they can produce meaningful data that resembles their training examples.&nbsp;</p><p>Real-world data is often noisy and uncertain. Traditional DFL methods struggle in these conditions, but diffusion models are designed to handle them.</p><p>Because of this, Diffusion-DFL can explore many possible outcomes and choose better actions. Like image-generation AI, the model works well with complex data from different sources. 
This enables its use across different industries.</p><p>“Diffusion models have achieved significant success in generative AI and image synthesis, but our work shows their potential extends far beyond that,” said&nbsp;<a href="https://guaguakai.com/"><strong>Kai Wang</strong></a>, an assistant professor in the&nbsp;<a href="https://cse.gatech.edu/"><strong>School of Computational Science and Engineering</strong></a> (CSE).</p><p>“What makes Diffusion-DFL unique is that the specific downstream application guides how the model learns to handle uncertainty.</p><p>“Whether we are scheduling energy for power grids, balancing risk in financial portfolios, or developing early warning systems in healthcare, we can explicitly train these highly expressive models to navigate the unique complexities of each domain.”</p><p>Zhao and Wang collaborated with Caltech Ph.D. candidate&nbsp;<a href="https://chrisyeh96.github.io/"><strong>Christopher Yeh</strong></a> and Harvard University postdoctoral fellow&nbsp;<a href="https://www.cc.gatech.edu/news/alumnus-uses-ai-counter-african-poaching-improve-maternal-healthcare-access"><strong>Lingkai Kong</strong></a> on Diffusion-DFL. Kong earned his Ph.D. in CSE from Georgia Tech in 2024.</p><p>Wang will present Diffusion-DFL on behalf of the group at the upcoming International Conference on Learning Representations (<a href="https://iclr.cc/"><strong>ICLR 2026</strong></a>). Occurring April 23-27 in Rio de Janeiro, ICLR is one of the world’s most prestigious conferences dedicated to artificial intelligence research.</p><p>“ICLR is the perfect stage for Diffusion-DFL because it brings together the exact community that needs to see the bridge between generative modeling and high-stakes decision-making for real-world applications,” Wang said.</p><p>“Presenting Diffusion-DFL allows us to challenge the traditional training framework of diffusion models. 
It’s about sparking a broader conversation on how we can align the training objectives of generative AI directly with actual, downstream decision-making needs.”</p>]]></body>  <author>Bryant Wine</author>  <status>1</status>  <created>1776792924</created>  <gmt_created>2026-04-21 17:35:24</gmt_created>  <changed>1776793239</changed>  <gmt_changed>2026-04-21 17:40:39</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers have developed Diffusion-DFL, the first decision-focused learning model built on diffusion AI technology. It uses the same engineering behind image generators to help industries make more accurate, lower-cost planning decisions.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers have developed Diffusion-DFL, the first decision-focused learning model built on diffusion AI technology. It uses the same engineering behind image generators to help industries make more accurate, lower-cost planning decisions.]]></sentence>  <summary><![CDATA[<p>Generative artificial intelligence (AI) is best known for creating images and text. Now, it is helping industries make better planning decisions.</p><p>Georgia Tech researchers have created a new AI model for decision-focused learning (DFL), called Diffusion-DFL. Recent tests showed it makes more accurate decisions than current approaches.</p><p>Along with optimizing industrial output, Diffusion-DFL lowers costs and reduces risk. Experiments also showed it performs across different fields.&nbsp;</p><p><a href="https://arxiv.org/abs/2510.11590"><strong>Diffusion-DFL</strong></a> doesn’t just surpass current methods; it also predicts more accurately as problem sizes grow. The model requires less computing power despite these high-performance marks, making it more accessible to smaller enterprises.</p><p>Diffusion-DFL runs on diffusion models, the same technology that powers DALL-E and other AI image generators. 
It is the first DFL framework based on diffusion models.</p>]]></summary>  <dateline>2026-04-15T00:00:00-04:00</dateline>  <iso_dateline>2026-04-15T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-04-15 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Bryant Wine, Communications Officer<br><a href="mailto:bryant.wine@cc.gatech.edu">bryant.wine@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>680015</item>      </media>  <hg_media>          <item>          <nid>680015</nid>          <type>image</type>          <title><![CDATA[Diffusion-DFL-Head-Image.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Diffusion-DFL-Head-Image.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/21/Diffusion-DFL-Head-Image.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/21/Diffusion-DFL-Head-Image.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/21/Diffusion-DFL-Head-Image.jpg?itok=VM66uXsh]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[ICLR 2026 Diffusion-DFL]]></image_alt>                    <created>1776792936</created>          <gmt_created>2026-04-21 17:35:36</gmt_created>          <changed>1776792936</changed>          <gmt_changed>2026-04-21 17:35:36</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.cc.gatech.edu/news/vision-ai-models-improve-decision-making-manufacturing-energy-and-finance]]></url>        <title><![CDATA[Vision AI Models Improve Decision Making in Manufacturing, Energy, and Finance]]></title>      </link>      </related>  
<files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="131"><![CDATA[Economic Development and Policy]]></category>          <category tid="144"><![CDATA[Energy]]></category>          <category tid="194609"><![CDATA[Industry]]></category>          <category tid="194685"><![CDATA[Manufacturing]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="131"><![CDATA[Economic Development and Policy]]></term>          <term tid="144"><![CDATA[Energy]]></term>          <term tid="194609"><![CDATA[Industry]]></term>          <term tid="194685"><![CDATA[Manufacturing]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      </news_terms>  <keywords>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="9167"><![CDATA[machine learning]]></keyword>          <keyword tid="181689"><![CDATA[Institute for Data Science and Engineering]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="194384"><![CDATA[Tech AI]]></keyword>          <keyword tid="7850"><![CDATA[EVPR]]></keyword>      </keywords>  <core_research_areas>          
<term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="39461"><![CDATA[Manufacturing, Trade, and Logistics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689263">  <title><![CDATA[Transformer Explainer Shows How AI is More Math than Human]]></title>  <uid>36319</uid>  <body><![CDATA[<p>While people use search engines, chatbots, and generative artificial intelligence tools every day, most don’t know how they work. This sets unrealistic expectations for AI and leads to misuse. It also slows progress toward building new AI applications.&nbsp;</p><p>Georgia Tech researchers are making AI easier to understand through their work on Transformer Explainer. The free, online tool shows non-experts how ChatGPT, Claude, and other large language models (LLMs) process language.&nbsp;</p><p><a href="https://poloclub.github.io/transformer-explainer/">Transformer Explainer</a> is easy to use and runs on any web browser. It quickly went viral after its debut, reaching 150,000 users in its first three months. More than 563,000 people worldwide have used the tool so far.</p><p>Global interest in Transformer Explainer continues when the team presents the tool at the 2026 Conference on Human Factors in Computing Systems (<a href="https://chi2026.acm.org/">CHI 2026</a>). CHI, the world’s most prestigious conference on human-computer interaction, will take place in Barcelona, April 13-17.</p><p>[<a href="https://sites.gatech.edu/research/chi-2026/">Related: GT @ CHI 2026</a>]</p><p>“There are moments when LLMs can seem almost like a person with their own will and personality, and that misperception has real consequences. For example, there have been cases where teenagers have made poor decisions based on conversations with LLMs,” said Ph.D. 
student&nbsp;<a href="https://aereeeee.github.io/">Aeree Cho</a>.</p><p>“Understanding that an LLM is fundamentally a model that predicts the probability distribution of the next token helps users avoid taking its outputs as absolute. What you put in shapes what comes out, and that understanding helps people engage with AI more carefully and critically.”</p><p>A transformer is a neural network architecture that transforms an input sequence of data into an output. Text, audio, and images can all be represented as sequences, which is why transformers are common in generative AI models. Transformers process these sequences by learning context and tracking mathematical relationships between their components.</p><p>Transformer Explainer demystifies how transformers work. The platform uses visualization and interaction to show, step by step, how text flows through a model and produces predictions.</p><p>Using this approach, Transformer Explainer impacts the AI landscape in four main ways:</p><ul><li>It counters hype and misconceptions surrounding AI by showing how transformers work.</li><li>It improves AI literacy among users by removing technical barriers to learning about AI.</li><li>It expands AI education by helping instructors teach AI mechanisms without extensive setup or computing resources.</li><li>It influences future development of AI tools and educational techniques by providing a blueprint for interpretable AI systems.</li></ul><p>“When I first learned about transformers, I felt overwhelmed. A transformer model has many parts, each with its own complex math. Existing resources typically present all this information at once, making it difficult to see how everything fits together,” said&nbsp;<a href="https://gracekimcy.github.io/">Grace Kim</a>, a dual B.S./M.S. computer science student.&nbsp;</p><p>“By leveraging interactive visualization, we use levels of abstraction to first show the big picture of the entire model. 
Then users click into individual parts to reveal the underlying details and math. This way, Transformer Explainer makes learning far less intimidating.”</p><p>Many users don’t know what transformers are or how they work. The Georgia Tech team found that people often misunderstand AI. Some label AI with human-like characteristics, such as creativity. Others even describe it as working like magic.</p><p>Furthermore, barriers make it hard for students interested in transformers to start learning. Tutorials tend to be too technical and overwhelm beginners with math and code. While visualization tools exist, these often target more advanced AI experts.</p><p>Transformer Explainer overcomes these obstacles through its interactive, user-focused platform. It runs a familiar GPT model directly in any web browser, requiring no installation or special hardware.&nbsp;</p><p>Users can enter their own text and watch the model predict the next word in real time. Sankey-style diagrams show how information moves through embeddings, attention heads, and transformer blocks.</p><p>The platform also lets users switch between high-level concepts and detailed math. By adjusting temperature settings, users can see how randomness affects predictions. This reveals how probabilities drive AI outputs, rather than creativity.</p><p>“Millions of people around the world interact with transformer-driven AI. We believe that it is crucial to bridge the gap between day-to-day user experience and the models' technical reality, ensuring these tools are not misinterpreted as human-like or seen as sentient,” said Ph.D. student&nbsp;<a href="https://www.alexkarpekov.com/">Alex Karpekov</a>.&nbsp;</p><p>“Explaining the architecture helps users recognize that language generated by models is a product of computation, leading to a more grounded engagement with the technology.”&nbsp;</p><p>Cho, Karpekov, and Kim led the development of Transformer Explainer. Ph.D. 
students&nbsp;<a href="https://alechelbling.com/">Alec Helbling</a>,&nbsp;<a href="https://seongmin.xyz/">Seongmin Lee</a>,&nbsp;<a href="https://bhoov.com/">Ben Hoover</a>, and alumni&nbsp;<a href="https://zijie.wang/">Zijie (Jay) Wang</a> (Ph.D. ML-CSE 2024) and <a href="https://minsuk.com/">Minsuk Kahng</a> (Ph.D. CS-CSE 2019) assisted on the project.&nbsp;</p><p>Professor&nbsp;<a href="https://poloclub.github.io/polochau/">Polo Chau</a> supervised the group and their work. His lab focuses on data science, human-centered AI, and visualization for social good.</p><p>Acceptance at CHI 2026 stems from the team winning the best poster award at the 2024 IEEE Visualization Conference. This recognition from one of the top venues in visualization research highlights Transformer Explainer’s effectiveness in teaching how transformers work.</p><p>“Transformer Explainer has reached over half a million learners worldwide,” said Chau, a faculty member in the School of Computational Science and Engineering.&nbsp;</p><p>“I'm thrilled to see it extend Georgia Tech's mission of expanding access to higher education, now to anyone with a web browser.”</p>]]></body>  <author>Bryant Wine</author>  <status>1</status>  <created>1774975377</created>  <gmt_created>2026-03-31 16:42:57</gmt_created>  <changed>1776452289</changed>  <gmt_changed>2026-04-17 18:58:09</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers are making AI easier to understand through their work on Transformer Explainer. The free, online tool shows non-experts how ChatGPT, Claude, and other large language models (LLMs) process language, improving AI literacy.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers are making AI easier to understand through their work on Transformer Explainer. 
The free, online tool shows non-experts how ChatGPT, Claude, and other large language models (LLMs) process language, improving AI literacy.]]></sentence>  <summary><![CDATA[<p>While people use search engines, chatbots, and generative artificial intelligence tools every day, most don’t know how they work. This sets unrealistic expectations for AI and leads to misuse. It also slows progress toward building new AI applications.&nbsp;</p><p>Georgia Tech researchers are making AI easier to understand through their work on Transformer Explainer. The free, online tool shows non-experts how ChatGPT, Claude, and other large language models (LLMs) process language.&nbsp;</p><p><a href="https://poloclub.github.io/transformer-explainer/">Transformer Explainer</a> is easy to use and runs on any web browser. It quickly went viral after its debut, reaching 150,000 users in its first three months. More than 563,000 people worldwide have used the tool so far.</p><p>Global interest in Transformer Explainer continues when the team presents the tool at the 2026 Conference on Human Factors in Computing Systems (<a href="https://chi2026.acm.org/">CHI 2026</a>). 
CHI, the world’s most prestigious conference on human-computer interaction, will take place in Barcelona, April 13-17.</p>]]></summary>  <dateline>2026-03-31T00:00:00-04:00</dateline>  <iso_dateline>2026-03-31T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-03-31 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Bryant Wine, Communications Officer<br><a href="mailto:bryant.wine@cc.gatech.edu">bryant.wine@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679798</item>          <item>679799</item>      </media>  <hg_media>          <item>          <nid>679798</nid>          <type>image</type>          <title><![CDATA[Transformer-Explainer-Head-Image.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Transformer-Explainer-Head-Image.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/31/Transformer-Explainer-Head-Image.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/31/Transformer-Explainer-Head-Image.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/31/Transformer-Explainer-Head-Image.jpg?itok=130OUqJ3]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[CHI 2026 Transformer Explainer]]></image_alt>                    <created>1774975392</created>          <gmt_created>2026-03-31 16:43:12</gmt_created>          <changed>1774975392</changed>          <gmt_changed>2026-03-31 16:43:12</gmt_changed>      </item>          <item>          <nid>679799</nid>          <type>image</type>          <title><![CDATA[Transformer-Explainer-Text-Image.jpg]]></title>          <body><![CDATA[]]></body>           
           <image_name><![CDATA[Transformer-Explainer-Text-Image.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/31/Transformer-Explainer-Text-Image.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/31/Transformer-Explainer-Text-Image.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/31/Transformer-Explainer-Text-Image.jpg?itok=aZBsyuGc]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[CHI 2026 Transformer Explainer]]></image_alt>                    <created>1774975428</created>          <gmt_created>2026-03-31 16:43:48</gmt_created>          <changed>1774975428</changed>          <gmt_changed>2026-03-31 16:43:48</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.cc.gatech.edu/news/transformer-explainer-shows-how-ai-more-math-human]]></url>        <title><![CDATA[Transformer Explainer Shows How AI is More Math than Human]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50877"><![CDATA[School of Computational Science and Engineering]]></group>      </groups>  <categories>          <category tid="130"><![CDATA[Alumni]]></category>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="130"><![CDATA[Alumni]]></term>          <term 
tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="134"><![CDATA[Student and Faculty]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="166983"><![CDATA[School of Computational Science and Engineering]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="170447"><![CDATA[Institute for Data Engineering and Science]]></keyword>          <keyword tid="176858"><![CDATA[machine learning center]]></keyword>          <keyword tid="9167"><![CDATA[machine learning]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="14646"><![CDATA[human-computer interaction]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="194384"><![CDATA[Tech AI]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689636">  <title><![CDATA[Bad Vibes: AI-Generated Code is Vulnerable, Researchers Warn]]></title>  <uid>36253</uid>  <body><![CDATA[<p>Vibe coding programmers are releasing batches of vulnerable code, according to researchers at 
the School of Cybersecurity and Privacy (SCP) at Georgia Tech, who have scanned over 43,000 security advisories across the web.</p><p>The programming style relies on generative artificial intelligence (AI) tools such as Claude, Gemini, and GitHub Copilot to create software code. According to graduate research assistant <strong>Hanqing Zhao</strong> of the <a href="https://gts3.org/">Systems Software &amp; Security Lab</a> (SSLab), no one had been tracking these common vulnerabilities and exposures before the launch of their <a href="https://vibe-radar-ten.vercel.app/">Vibe Security Radar</a>.</p><p>“The vulnerabilities we found lead to breaches,” he said. “Everyone is using these tools now. We need a feedback loop to identify which tools, which patterns, and which workflows create the most risk.”</p><p>The radar extensively scans public vulnerability databases, identifies the flawed code behind each vulnerability, and then examines the code’s history to determine who introduced the bug. If it discovers an AI tool's signature, the radar flags the case.&nbsp;</p><p>Of the 74 confirmed cases uncovered so far by the tool, 14 are critical risks, and 25 are high. These vulnerabilities include command injection, authentication bypass, and server-side request forgery. Zhao explained that since AI models tend to repeat the same mistakes, an attacker would need to find these bugs just once.&nbsp;</p><p>“Millions of developers using the same models means the same bugs showing up across different projects,” he said. “Find one pattern in one AI codebase, you can scan for it across thousands of repositories.”</p><p>Despite its success, the team has only scratched the surface of the problem. The radar can trace metadata like co-author tags, bot emails, and other known tool signatures, but it can't identify an issue if these markers have been removed.&nbsp;</p><p>The next step is behavioral detection. 
AI-written code has patterns in how it names variables, structures functions, and handles errors.&nbsp;</p><p>“We're building models that can identify AI code from the code itself, no metadata needed,” said Zhao. “That opens up a lot of cases we currently can't touch.”</p><p>The team is also improving its verification pipeline and expanding its sources to include more vulnerability databases. The goal is to get a more complete picture of AI-introduced vulnerabilities across open source, not just the ones that happen to leave signatures behind.&nbsp;</p><p>As more programmers rely on vibe coding, Zhao warns that the resulting code still needs to be reviewed as thoroughly as any other project.&nbsp;</p><p>“The whole point of vibe coding is not reading it afterward, I know,” he said. “But if you're shipping AI output to production, review it the way you'd review a junior developer's pull request. Especially anything around input handling and authentication.”</p><p>When prompting AI, SSLab also recommends providing more detailed instructions to get the output closer to production-ready. There are also tools to check the code for vulnerabilities after it has been generated. Not double-checking could lead to a catastrophe.&nbsp;</p><p>“The attack surface keeps growing,” said Zhao. “More people running AI agents locally means the attacker doesn't need to break into the company infrastructure. They just need one vulnerability in a model context protocol server that someone installed and never reviewed.”</p><p>One reason the attack surface is expanding rapidly is AI’s evolution. In 2025, the Vibe Security Radar found about 18 cases across seven months. Then, in the first three months of 2026, it identified 56. 
March 2026 alone had 35, more than all of 2025 combined.&nbsp;</p><p>Many tools, like Claude, are now more autonomous, able to write entire features, create files, and even make architecture decisions.&nbsp;</p><p>“When an agent builds something without authentication, that's not a typo,” said Zhao. “It's a design flaw baked in from the start. Claude Code and Copilot together account for most of what we detect, but that's partly because they leave the clearest signatures.”</p>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1776090722</created>  <gmt_created>2026-04-13 14:32:02</gmt_created>  <changed>1776091440</changed>  <gmt_changed>2026-04-13 14:44:00</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers at the Georgia Tech School of Cybersecurity and Privacy are uncovering a growing risk in modern software development: vulnerabilities introduced by AI-generated code.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers at the Georgia Tech School of Cybersecurity and Privacy are uncovering a growing risk in modern software development: vulnerabilities introduced by AI-generated code.]]></sentence>  <summary><![CDATA[<p>Researchers at the Georgia Tech School of Cybersecurity and Privacy are uncovering a growing risk in modern software development: vulnerabilities introduced by AI-generated code.</p><p>Using the Vibe Security Radar, the team analyzed more than 43,000 security advisories and identified dozens of confirmed vulnerabilities tied to tools like GitHub Copilot, Claude, and Gemini—including critical flaws such as authentication bypass and command injection.</p>]]></summary>  <dateline>2026-04-13T00:00:00-04:00</dateline>  <iso_dateline>2026-04-13T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-04-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jpopham3@gatech.edu]]></email>  <location></location>  
<contact><![CDATA[<p>John Popham</p><p>Communications Officer II at the School of Cybersecurity and Privacy</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679920</item>      </media>  <hg_media>          <item>          <nid>679920</nid>          <type>image</type>          <title><![CDATA[Vibe-Coding.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Vibe-Coding.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/13/Vibe-Coding.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/13/Vibe-Coding.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/13/Vibe-Coding.jpg?itok=NCPNum0u]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A man typing on a computer. 
There is a screen hovering over his hands that says "Vibe Coding"]]></image_alt>                    <created>1776090752</created>          <gmt_created>2026-04-13 14:32:32</gmt_created>          <changed>1776090752</changed>          <gmt_changed>2026-04-13 14:32:32</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>      </news_terms>  <keywords>          <keyword tid="2835"><![CDATA[ai]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="186861"><![CDATA[go-cyber]]></keyword>          <keyword tid="194393"><![CDATA[AI and Cybersecurity]]></keyword>          <keyword tid="1404"><![CDATA[Cybersecurity]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="145171"><![CDATA[Cybersecurity]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689446">  <title><![CDATA[GTRI Supports Initiative to Assess Quantum Computing Efforts]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Quantum computers may one day enable revolutionary advances in fluid dynamics, drug 
discovery, development of better agricultural fertilizers, improved materials design and other technical areas that are beyond the capabilities of today’s conventional computers. To reach those goals, companies from around the world are pursuing a variety of approaches aimed at developing large-scale, fault-tolerant quantum computers.<br>&nbsp;</p><p>The approaches of over a dozen quantum computing companies are now being evaluated through the Quantum Benchmarking Initiative (QBI), a project of the U.S. Defense Advanced Research Projects Agency (DARPA). According to the agency, QBI “aims to rigorously verify and validate whether any quantum computing approach can achieve utility-scale operation – meaning its computational value exceeds its cost – by the year 2033.”<br>&nbsp;</p><p>Supporting the effort, a 40-person interdisciplinary research team from the Georgia Tech Research Institute (GTRI) has joined the test and evaluation component of QBI, providing unbiased subject-matter experts to work with 13 other research organizations in evaluating the R&amp;D plans of participating quantum computer companies. 
Through this collaboration, the GTRI team is working with more than 400 other third-party experts on the project.<br>&nbsp;</p><p><a href="https://www.gtri.gatech.edu/newsroom/gtri-supports-initiative-assess-quantum-computing-efforts">Read the complete article on the GTRI news site</a></p><p>&nbsp;</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1775237356</created>  <gmt_created>2026-04-03 17:29:16</gmt_created>  <changed>1775237758</changed>  <gmt_changed>2026-04-03 17:35:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers are supporting a Defense Advanced Research Projects Agency (DARPA) initiative to evaluate different approaches to quantum computing.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers are supporting a Defense Advanced Research Projects Agency (DARPA) initiative to evaluate different approaches to quantum computing.]]></sentence>  <summary><![CDATA[<p>The approaches of over a dozen quantum computing companies are now being evaluated through the Quantum Benchmarking Initiative (QBI), a project of the U.S. Defense Advanced Research Projects Agency (DARPA). 
GTRI researchers are supporting the initiative.</p>]]></summary>  <dateline>2026-04-03T00:00:00-04:00</dateline>  <iso_dateline>2026-04-03T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-04-03 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[gtri.media@gtri.gatech.edu]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679845</item>      </media>  <hg_media>          <item>          <nid>679845</nid>          <type>image</type>          <title><![CDATA[Quantum computing could enable revolutionary advances in numerous technology areas]]></title>          <body><![CDATA[<p>Quantum computers may one day enable revolutionary advances in fluid dynamics, drug discovery, development of better agricultural fertilizers, improved materials design and other technical areas. (Credit: Tim Hynes)</p>]]></body>                      <image_name><![CDATA[Quantum_banner_03B_03-web.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/03/Quantum_banner_03B_03-web.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/03/Quantum_banner_03B_03-web.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/03/Quantum_banner_03B_03-web.jpg?itok=6BUQqpeg]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Quantum research and potential benefits]]></image_alt>                    <created>1775236418</created>          <gmt_created>2026-04-03 17:13:38</gmt_created>          <changed>1775236825</changed>          <gmt_changed>2026-04-03 17:20:25</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research 
Horizons]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="193653"><![CDATA[Georgia Tech Research Institute]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689428">  <title><![CDATA[Researchers Build AI Tutor Grounded in Course Materials]]></title>  <uid>36532</uid>  <body><![CDATA[<p>As students increasingly turn to artificial intelligence (AI) to help with coursework, some worry that their learning could be compromised. Georgia Tech researchers are working to counter this potential decline with an AI tool they hope will promote learning rather than hinder it.&nbsp;&nbsp;</p><p>TokenSmith is a citation-supported large language model (LLM) tutor that can be hosted locally on a user’s personal computer. The tutor only provides answers based on course materials, such as the textbook or lecture slides.&nbsp;&nbsp;</p><p>Associate Professor <a href="https://faculty.cc.gatech.edu/~jarulraj/"><strong>Joy Arulraj</strong></a> began the project with support from the <a href="https://research.gatech.edu/c21u-announces-inaugural-bill-kent-ai-higher-education-fellows"><strong>Bill Kent Family Foundation AI in Higher Education Faculty Fellowship</strong></a> last year. 
The fellowship, led by Georgia Tech’s Center for 21st Century Universities, supports faculty projects exploring innovative and ethical uses of AI in teaching.&nbsp;&nbsp;&nbsp;</p><p>Arulraj has enlisted assistant professors <a href="https://kexinrong.github.io/"><strong>Kexin Rong</strong></a> and <a href="https://steve.mussmann.us/"><strong>Steve Mussmann</strong></a> to help build TokenSmith.&nbsp;&nbsp;</p><p>Mussmann said TokenSmith is a synergistic blend of a database system and a machine learning system. The model stores textbooks, textbook annotations by course staff, common questions and answers, a learning state of the student, and student feedback in a structured database system. However, machine learning plays a key role in the answer generation as well as adapting the system to the student, course staff guidance, and user feedback.</p><p>"What excites me most is demonstrating how data-driven ML and principled database systems design can reinforce each other — one providing adaptability and flexibility, the other providing structure and traceability — in a way that benefits students," Mussmann said.</p><p>Keeping the model local has been an important focus of the project. The team wanted to create an AI tutor that helps students learn from their class resources rather than just giving answers. With each response, TokenSmith cites the origin of the answer in the provided documents.&nbsp;&nbsp;</p><p>“One problem with LLMs is that they can hallucinate and provide wrong answers, but in this controlled environment, we can add these guardrails to make sure it’s actually helpful in an educational setting,” Rong said.&nbsp;&nbsp;</p><p>Rong said she feels that students often undervalue textbooks, and she hopes TokenSmith can motivate students to make better use of them. 
&nbsp;</p><p>“Textbooks can sometimes be daunting, but maybe if we combine them with the model, students might be more willing to read a paragraph or page in the textbook, and that could help clarify something for them,” she said.&nbsp;&nbsp;</p><p>Running the model locally is more cost-effective and helps preserve the user’s privacy. But running the new tool locally comes with technical challenges.&nbsp;&nbsp;</p><p>One challenge is speed: because it runs locally, TokenSmith depends solely on the user’s computer memory. Tests have also shown that the tutor currently struggles to answer more complex questions.&nbsp;</p><p>“We are interested in pushing the boundaries of these local models so that they give students good answers and also run fast enough to keep students engaged,” Arulraj said.&nbsp;&nbsp;</p>]]></body>  <author>Morgan Usry</author>  <status>1</status>  <created>1775161502</created>  <gmt_created>2026-04-02 20:25:02</gmt_created>  <changed>1775161836</changed>  <gmt_changed>2026-04-02 20:30:36</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[TokenSmith is a citation-supported large language model (LLM) tutor that can be hosted locally on a user’s personal computer. The tutor only provides answers based on course materials, such as the textbook or lecture slides.  ]]></teaser>  <type>news</type>  <sentence><![CDATA[TokenSmith is a citation-supported large language model (LLM) tutor that can be hosted locally on a user’s personal computer. The tutor only provides answers based on course materials, such as the textbook or lecture slides.  ]]></sentence>  <summary><![CDATA[<p>TokenSmith is a citation-supported large language model (LLM) tutor that can be hosted locally on a user’s personal computer. 
The tutor only provides answers based on course materials, such as the textbook or lecture slides.&nbsp;&nbsp;</p><p>Associate Professor <a href="https://faculty.cc.gatech.edu/~jarulraj/"><strong>Joy Arulraj</strong></a> began the project with support from the <a href="https://research.gatech.edu/c21u-announces-inaugural-bill-kent-ai-higher-education-fellows"><strong>Bill Kent Family Foundation AI in Higher Education Faculty Fellowship</strong></a> last year. The fellowship, led by Georgia Tech’s Center for 21st Century Universities, supports faculty projects exploring innovative and ethical uses of AI in teaching.</p>]]></summary>  <dateline>2026-04-02T00:00:00-04:00</dateline>  <iso_dateline>2026-04-02T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-04-02 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[morgan.usry@cc.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Morgan Usry, Communications Officer</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679842</item>      </media>  <hg_media>          <item>          <nid>679842</nid>          <type>image</type>          <title><![CDATA[AI-Tutor-Image.jpg.jpeg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[AI-Tutor-Image.jpg.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/04/02/AI-Tutor-Image.jpg.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/04/02/AI-Tutor-Image.jpg.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/04/02/AI-Tutor-Image.jpg.jpeg?itok=Xnge4x3r]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Graphic showing the researchers in front of a computer screen]]></image_alt>             
       <created>1775161510</created>          <gmt_created>2026-04-02 20:25:10</gmt_created>          <changed>1775161510</changed>          <gmt_changed>2026-04-02 20:25:10</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50875"><![CDATA[School of Computer Science]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="42911"><![CDATA[Education]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="42911"><![CDATA[Education]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="193860"><![CDATA[Artifical Intelligence]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="194701"><![CDATA[go-resarchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="194394"><![CDATA[AI in Education]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689256">  <title><![CDATA[New Study Shows Explainability is a Must for Older Adults to Trust AI]]></title>  <uid>36530</uid>  
<body><![CDATA[<p>Voice-activated, conversational artificial intelligence (AI) agents must provide clear explanations for their suggestions, or older adults aren’t likely to trust them.</p><p>That’s one of the main findings from a study by AI Caring on what older adults expect from explainable AI (XAI).</p><p><a href="https://ai-caring.org/"><strong>AI Caring</strong></a> is one of three AI institutes led by Georgia Tech and funded by the National Science Foundation (NSF). The institute supports AI research that benefits older adults and their caregivers.</p><p>Niharika Mathur, a Ph.D. candidate in the School of Interactive Computing, was the lead author of a paper based on the study. The paper will be presented in April at the <a href="https://chi2026.acm.org/"><strong>2026 ACM Conference on Human Factors in Computing Systems (CHI) in Barcelona</strong></a>.</p><p>Mathur worked with the <a href="https://empowerment.emory.edu/"><strong>Cognitive Empowerment Program at Emory University</strong></a> to interview 23 older adults who live alone and use voice-activated AI assistants like Amazon’s Alexa and Google Home.</p><p>Many of them told her they feel excluded from the design of these products.</p><p>“The assumption is that all people want interactions the same way and across all kinds of situations, but that isn’t true,” Mathur said. “How older people use AI and what they want from it are different from what younger people prefer.”</p><p>One example she gave is that young people tend to be informal when talking with AI. Older people, on the other hand, talk to the agent like they would a person.</p><p>“If older adults are talking to their family members about Alexa, they usually refer to Alexa as ‘she’ instead of ‘it,’” Mathur said. 
“They tend to humanize these systems a lot more than young people.”</p><h4><strong>Good Explanations</strong></h4><p>The study evaluated AI explanations that drew information from four sources of data:</p><ul><li>User history (past conversations with the agent)</li><li>Environmental data (indoor temperature or the weather forecast)</li><li>Activity data (how much time a user spends in different areas of the home)</li><li>Internal reasoning (mathematical probabilities and likely outcomes)</li></ul><p>Mathur said older users trust the agent more when it bases its explanations on data from the first three sources. However, internal reasoning creates skepticism.</p><p>Internal reasoning means the AI doesn’t have enough data from the other sources to give an explanation. It provides a percentage to reflect its confidence based on what it knows.</p><p>“The overwhelming response was negative toward confidence scores,” Mathur said. “If the AI says it’s 92% confident, older adults want to know what that’s based on.”</p><p>This is another example that Mathur said points to generational preferences.</p><p>“There’s a lot of explainable AI research that shows younger people like to see numbers in explanations, and they also tend to rely too much on explanations that contain numerical confidence. Older adults are the opposite. It makes them trust it less.”</p><h4><strong>Knowing the Context</strong></h4><p>She discovered that in urgent situations, older users prefer the AI to be straightforward, while in casual settings, they desire more conversation.</p><p>“How people interact with technological systems is grounded in what the stakes of the situation are,” she said. “If it had anything to do with their immediate sense of safety, they did not want conversational elaboration. They want the AI to be very direct and factual.”</p><h4><strong>Not Just Checking Boxes</strong></h4><p>Mathur said AI agents that interact with older adults are ideally constructed with a dual purpose. They should provide companionship and autonomy for the users while alleviating the burden of caretaking that is often placed on their family members.&nbsp;</p><p>Some studies have shown that engineers have tended to favor caretakers in the design of these tools. They prioritize daily tasks and routines, leaving some older adults to feel like they are a box to be checked.</p><p>“They’re not being thought of as consumers,” Mathur said. “A lot of products are being made for them but not with them.”</p><p>She also said psychological well-being is one of the most important outcomes these tools should produce.&nbsp;</p><p>Showing older adults that they are listened to can significantly help in gaining their trust. Some interviewees told Mathur they want agents who are deliberate about understanding their preferences and don’t dismiss their questions.</p><p>Meeting these needs makes older adults less likely to resist the technology or come into conflict with family members.</p><p>“It highlights just how important well-designed explanations are,” she said. 
“We must go beyond a transparency checklist.”</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1774965667</created>  <gmt_created>2026-03-31 14:01:07</gmt_created>  <changed>1774965899</changed>  <gmt_changed>2026-03-31 14:04:59</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A Georgia Tech study finds older adults are more likely to trust voice-activated AI systems when those systems clearly explain how and why they make decisions.]]></teaser>  <type>news</type>  <sentence><![CDATA[A Georgia Tech study finds older adults are more likely to trust voice-activated AI systems when those systems clearly explain how and why they make decisions.]]></sentence>  <summary><![CDATA[<p>An AI Caring study led by Georgia Tech researchers shows that older adults are more likely to trust conversational AI systems that provide them with clear explanations for their decision-making. The study also shows that including older adults more in the design process benefits their well-being and reduces the caretaking burden of family members</p>]]></summary>  <dateline>2026-03-31T00:00:00-04:00</dateline>  <iso_dateline>2026-03-31T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-03-31 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679796</item>      </media>  <hg_media>          <item>          <nid>679796</nid>          <type>image</type>          <title><![CDATA[0A6A0355.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[0A6A0355.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/31/0A6A0355.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/31/0A6A0355.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/31/0A6A0355.jpg?itok=eU9yywHp]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[An older couple sitting on a couch as a man helps them use Amazon's Alexa]]></image_alt>                    <created>1774965687</created>          <gmt_created>2026-03-31 14:01:27</gmt_created>          <changed>1774965687</changed>          <gmt_changed>2026-03-31 14:01:27</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="193860"><![CDATA[Artifical Intelligence]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="14342"><![CDATA[older adults]]></keyword>          <keyword tid="148721"><![CDATA[Amazon Alexa]]></keyword>    
  </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689250">  <title><![CDATA[Researchers Look to Bolster Technology Support for Menopause]]></title>  <uid>36530</uid>  <body><![CDATA[<p>Women in need of supportive maternal and menstrual healthcare in patriarchal societies have increasingly found outlets for disclosure in online communities.</p><p>That support, however, begins to disappear in these restrictive cultures once women reach menopause, according to new research from Georgia Tech.</p><p>Naveena Karusala, an assistant professor in Georgia Tech’s School of Interactive Computing, and master’s student Umme Ammara are working toward improving existing technologies and designing new ones for a demographic they believe has been neglected.</p><p>Karusala and Ammara co-authored a paper based on a study they conducted with women in urban Pakistan experiencing menopause.</p><p>“Women’s health is understudied in general, but menopause is more neglected than other women’s health issues,” Karusala said. “Our choice to focus on menopause is motivated by expanding how we holistically think about women’s well-being across their lifespan.”</p><p>Karusala and Ammara will present their paper in April at the 2026 ACM Conference on Human Factors in Computing Systems (CHI) in Barcelona.</p><h4><strong>Masking Symptoms</strong></h4><p>Menopause is diagnosed after 12 consecutive months without a period, vaginal bleeding, or spotting. 
The transition to menopause, called perimenopause, usually happens over two to eight years.</p><p>Hormone changes may cause symptoms such as irregular periods, vaginal dryness, hot flashes, night sweats, trouble sleeping, mood swings, and brain fog.</p><p>These symptoms can be debilitating in some cases and affect daily life. However, Ammara said women are pressured to remain silent, maintain appearances, and regulate their emotions to meet social expectations.</p><p>“Understanding menopause is important because a woman would be experiencing all these symptoms, and people will not understand those as actual symptoms,” Ammara said. “There’s been resistance to the idea of the medicalization of menopause. People don’t view it as an illness, but as a life transition and something that happens naturally.”</p><h4><strong>Feeling Isolated</strong></h4><p>The women interviewed by Karusala and Ammara either stayed at home full-time or were part of the workforce.</p><p>The researchers discovered that trusted family members might be the only sources women who stay at home and do not work turn to for disclosure.&nbsp;</p><p>“Women at home have the flexibility to take breaks or work at their own pace, so a lot of their experience is shaped by the emotional barriers they face,” Ammara said.&nbsp;</p><p>“That could come from their husbands and family members. Some are supportive and some are not. 
They might weaponize it and use that term against them, or they might dismiss what they’re going through.”</p><p>Ammara said it might be easier for women in the workforce to confide in their coworkers, but explaining to an employer that they need sick leave for menopause symptoms can be intimidating.</p><p>Even in online communities that have enabled women to anonymously share their health experiences, menopause is seldom discussed.</p><h4><strong>Raising Awareness</strong></h4><p>Karusala and Ammara argue in their paper that a public health approach could be the most effective way to spark conversation about menopause in a patriarchal culture in which technology use varies.</p><p>They said the challenge in implementing technologies geared toward menopause support is that the condition isn’t well understood in public. Improving maternal health, for example, is easier to promote within these societies because of the general understanding that motherhood is important.</p><p>“There must be an existing infrastructure to build on,” Karusala said. “For example, menstrual and maternal health are taught in schools and regularly discussed in primary care. Cultural and social meaning and importance are placed on motherhood.</p><p>“A lot of that doesn’t exist for menopause. Primary care doctors are unprepared to talk about menopause compared to other health issues.”</p><h4><strong>Design Solutions</strong></h4><p>Ammara said that the most effective way for technologies to make an impact on women going through menopause is to directly address systemic power structures around women’s health within Pakistani culture.</p><p>It can start with the husbands.&nbsp;</p><p>“Framing the issue for husbands to understand menopause should be at the forefront of designing technology solutions,” she said.&nbsp;</p><p>“In Islamic contexts, we suggest using faith-based framings. 
This has been proposed for maternal health in prior works that draw on Islamic principles to engage expectant fathers in providing care and support. Framing it around religious responsibility to involve men in the journey can also be done for menopause.”</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1774958953</created>  <gmt_created>2026-03-31 12:09:13</gmt_created>  <changed>1774963087</changed>  <gmt_changed>2026-03-31 13:18:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers are looking at how technology can better support women experiencing menopause in urban Pakistan, where patriarchal norms leave them largely isolated and without resources for managing their symptoms.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers are looking at how technology can better support women experiencing menopause in urban Pakistan, where patriarchal norms leave them largely isolated and without resources for managing their symptoms.]]></sentence>  <summary><![CDATA[<p>Georgia Tech assistant professor Naveena Karusala and master's student Umme Ammara are researching how to improve existing technologies and design new ones to better support women experiencing menopause. 
Their work is based on a study conducted with women in urban Pakistan, where patriarchal social norms pressure women to stay silent about menopause symptoms and limit their ability to seek support, even in online communities that have otherwise helped women discuss other health issues.</p>]]></summary>  <dateline>2026-03-30T00:00:00-04:00</dateline>  <iso_dateline>2026-03-30T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-03-30 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:ndeen6@gatech.edu">Nathan Deen</a><br>College of Computing<br>Georgia Tech</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679788</item>      </media>  <hg_media>          <item>          <nid>679788</nid>          <type>image</type>          <title><![CDATA[Ammara-Umme_86A2210.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Ammara-Umme_86A2210.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/31/Ammara-Umme_86A2210.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/31/Ammara-Umme_86A2210.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/31/Ammara-Umme_86A2210.jpg?itok=CxqLrfAa]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Umme Ammara sits in a booth with a laptop in front of her]]></image_alt>                    <created>1774958961</created>          <gmt_created>2026-03-31 12:09:21</gmt_created>          <changed>1774958961</changed>          <gmt_changed>2026-03-31 12:09:21</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group 
id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="8900"><![CDATA[women&#039;s history month]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="3543"><![CDATA[women&#039;s health]]></keyword>          <keyword tid="171911"><![CDATA[women of pakistan]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71891"><![CDATA[Health and Medicine]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689175">  <title><![CDATA[Tech Swarms into Athens for Clean, Old-Fashioned Computing]]></title>  <uid>36319</uid>  <body><![CDATA[<p>The in-state rivalry between the Yellow Jackets and the Bulldogs usually heats up when Georgia Tech visits the University of Georgia. However, one Saturday last month, the focus shifted from competition to collaboration.&nbsp;</p><p>The Georgia Scientific Computing Symposium (GSCS) held its annual meeting on February 21 in Athens. 
Since 2009, the event has hosted researchers from across the Peach State to showcase homegrown advances in scientific computing.</p><p><a href="https://haoningwu.github.io/GSCS2026.html">The symposium</a> highlighted Georgia’s reputation as a computing innovation hub. People from around the world come to Georgia universities to lead computing research. By advancing science, engineering, medicine, and technology, their work improves communities at home and abroad.</p><p>Faculty and students from Georgia Tech, UGA, Georgia State University, and Emory University presented at the symposium. Georgia Tech participants came from the colleges of Computing, Engineering, and Sciences.</p><p>This year’s organizers agreed to meet in Atlanta for the 2027 symposium. Georgia Tech’s <a href="https://cse.gatech.edu/">School of Computational Science and Engineering (CSE)</a> will host the 19th GSCS.</p><p>“From healthcare to computer chip design, scientific computing underpins many of the technological advances we see in our lives,” said Professor&nbsp;<a href="https://faculty.cc.gatech.edu/~echow/">Edmond Chow</a>, associate chair of the School of CSE.</p><p>“Scientific computing provides the mathematical models, simulations, and data‑driven tools that make modern innovation possible. It allows people to analyze complex systems, test ideas virtually before building them, and make faster, more accurate decisions across nearly every sector of society.”</p><p>Professor&nbsp;<a href="https://hmzhou.math.gatech.edu/">Haomin Zhou</a> and Assistant Professor&nbsp;<a href="https://itshelenxu.github.io/">Helen Xu</a> delivered two of the symposium’s five plenary talks.&nbsp;</p><p>Zhou presented a new method for solving the Schrödinger equation, a landmark equation in quantum mechanics. 
Drawing inspiration from the mathematics used in generative artificial intelligence models, his approach develops an algorithm that more effectively simulates waves, particle motion, and other physical systems.</p><p>Xu focused on improving how computers move and organize data during complex calculations. Her work uses “cache-friendly” layouts that help computers access data more efficiently, boosting performance for scientific and engineering applications.</p><p>“Speaking at GSCS was a great opportunity,” Xu said. “The symposium fostered connections within the scientific computing community and gave us a chance to share exciting research.”</p><p>The symposium showcased student work through a poster blitz and a poster session. During the blitz, 36 students each had one minute to introduce their research to the full audience. They then shared more details about their research during the poster session.</p><p>The student projects showed the range of fields supported by scientific computing. The session also provided attendees with an opportunity to connect and expand their professional networks, helping grow the field’s future impact.</p><p>“As an aerospace engineer by training and aspiring computational scientist, GSCS gave me the platform to network with other researchers in the field while showcasing my own research,” said M.S. student <strong>Kashvi Mundra</strong>.&nbsp;</p><p>“I was able to connect with scientists across different disciplines whose work intersects with my own in unexpected ways. 
Those conversations pushed my thinking beyond my own lab's perspective, helping me see my work on physics-informed machine learning for inverse problems in a broader scientific computing context.”</p><p>Georgia Tech students who presented posters included:</p><p><strong>Abir Haque</strong> (CSE), <em>Massively Parallel Random Phase Approximation Correlation Energy via Lanczos Quadrature</em></p><p><strong>Antonio Varagnolo</strong> (CSE), <em>Physics-Enhanced Deep Surrogates for the Phonon Boltzmann Transport Equation</em></p><p><strong>Ben Burns</strong> (CSE), <em>Infinite-Dimensional Stein Variational Inference with Derivative-Informed Neural Operators</em></p><p><strong>Ben Wilfong</strong> (CSE), <em>Shocks without Shock Capturing; Compressible Flow at 1 quadrillion Degrees of Freedom without Loss of Accuracy</em></p><p><strong>Daniel Vickers</strong> (CSE), <em>Highly-Parallel Fluid-Solid Interactions for Compressible Flows</em></p><p><strong>Eric Fowler</strong> (CSE), <em>High-Performance Tensor Contractions in Computational Chemistry</em></p><p><strong>Haoran Yan</strong> (Math), <em>Understanding Denoising Autoencoders through the Manifold Hypothesis: A Geometric Perspective</em></p><p><strong>Kashvi Mundra</strong> (CSE), <em>Autoregressive Multifidelity Neural Surrogate Modeling under Scarce Data Regimes</em></p><p><strong>Sebastián Gutiérrez Hernández</strong> (Math/CSE), <em>PDPO: Parametric Density Path Optimization</em></p><p><strong>Vivian Zhang</strong> (AE), <em>Multifidelity Operator Inference: Non-Intrusive Reduced Order Modeling from Scarce Data</em></p><p><strong>Xian Mae Hadia</strong> (CSE), <em>Data Efficiency of Surrogate Models: Learning Physics Data from Full Field Data vs. 
Inductive Bias from Approximate PDE Solvers</em></p><p><strong>Xiangming Huang</strong> (CSE), <em>Neural Operator Accelerated Evolutionary Strategies for PDE-Constraint Optimization</em></p><p><strong>Zhaiming Shen</strong> (Math), <em>Understanding In-Context Learning on Structured Manifolds: Bridging Attention to Kernel Methods</em></p><p><strong>Zhongjie Shi</strong> (Math), <em>Towards Understanding Generalization in DP-GD: A Case Study in Training Two-Layer CNNs</em></p>]]></body>  <author>Bryant Wine</author>  <status>1</status>  <created>1774443853</created>  <gmt_created>2026-03-25 13:04:13</gmt_created>  <changed>1774467666</changed>  <gmt_changed>2026-03-25 19:41:06</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers from universities across Georgia, including Georgia Tech, set aside rivalry to collaborate at the 2026 Georgia Scientific Computing Symposium, highlighting the state’s growing role as a hub for innovation in scientific computing.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers from universities across Georgia, including Georgia Tech, set aside rivalry to collaborate at the 2026 Georgia Scientific Computing Symposium, highlighting the state’s growing role as a hub for innovation in scientific computing.]]></sentence>  <summary><![CDATA[<p>The in-state rivalry between the Yellow Jackets and the Bulldogs usually heats up when Georgia Tech visits the University of Georgia. However, one Saturday last month, the focus shifted from competition to collaboration.&nbsp;</p><p>The Georgia Scientific Computing Symposium (GSCS) held its annual meeting on February 21 in Athens. Since 2009, the event has hosted researchers from across the Peach State to showcase homegrown advances in scientific computing.</p><p><a href="https://haoningwu.github.io/GSCS2026.html">The symposium</a> highlighted Georgia’s reputation as a computing innovation hub. 
People from around the world come to Georgia universities to lead computing research. By advancing science, engineering, medicine, and technology, their work improves communities at home and abroad.</p>]]></summary>  <dateline>2026-03-25T00:00:00-04:00</dateline>  <iso_dateline>2026-03-25T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-03-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Bryant Wine, Communications Officer<br><a href="mailto:bryant.wine@cc.gatech.edu">bryant.wine@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679732</item>          <item>679733</item>      </media>  <hg_media>          <item>          <nid>679732</nid>          <type>image</type>          <title><![CDATA[GSCS-2026-Head-Image.jpeg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[GSCS-2026-Head-Image.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/25/GSCS-2026-Head-Image.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/25/GSCS-2026-Head-Image.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/25/GSCS-2026-Head-Image.jpeg?itok=epVOcqtb]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[2026 Georgia Scientific Computing Symposium]]></image_alt>                    <created>1774443866</created>          <gmt_created>2026-03-25 13:04:26</gmt_created>          <changed>1774443866</changed>          <gmt_changed>2026-03-25 13:04:26</gmt_changed>      </item>          <item>          <nid>679733</nid>          <type>image</type>          <title><![CDATA[Kashvi-Mundra-Poster.jpeg]]></title>          
<body><![CDATA[]]></body>                      <image_name><![CDATA[Kashvi-Mundra-Poster.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/25/Kashvi-Mundra-Poster.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/25/Kashvi-Mundra-Poster.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/25/Kashvi-Mundra-Poster.jpeg?itok=RJv8HI6y]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[2026 Georgia Scientific Computing Symposium]]></image_alt>                    <created>1774443901</created>          <gmt_created>2026-03-25 13:05:01</gmt_created>          <changed>1774443901</changed>          <gmt_changed>2026-03-25 13:05:01</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.cc.gatech.edu/news/tech-swarms-athens-clean-old-fashioned-computing]]></url>        <title><![CDATA[Tech Swarms into Athens for Clean, Old-Fashioned Computing]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50877"><![CDATA[School of Computational Science and Engineering]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="194611"><![CDATA[State Impact]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term 
tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="194611"><![CDATA[State Impact]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="166983"><![CDATA[School of Computational Science and Engineering]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="168681"><![CDATA[scientific computing]]></keyword>          <keyword tid="194970"><![CDATA[2026 Georgia Scientific Computing Symposium]]></keyword>      </keywords>  <core_research_areas>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="39541"><![CDATA[Systems]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689185">  <title><![CDATA[Researchers Find Training Gaps Impacting Maritime Cybersecurity Readiness]]></title>  <uid>36253</uid>  <body><![CDATA[<p>Whether it’s a fire or a flood, a ship’s crew can only rely on itself and its training in emergencies at sea. 
The same is true for crews facing digital threats on oil tankers, cargo ships, and other commercial vessels.</p><p>New cybersecurity research from the Georgia Institute of Technology, however, revealed that crews aboard commercial vessels were often not adequately prepared to manage cyberattacks due to systemic training gaps.</p><p>The findings are based on interviews with more than 20 officer-level mariners, conducted to assess the maritime industry’s readiness to handle cybersecurity attacks at sea.</p><p>“Historically, cybersecurity research has focused heavily on cyber-physical systems like cars, factories, and industrial plants, but ships have largely been overlooked,” said <a href="https://annaraymaker.dad/"><strong>Anna Raymaker</strong></a>, Ph.D. student and lead researcher.</p><p>“That gap is concerning when more than 90% of the world’s goods travel by sea. Recent incidents, from GPS spoofing to ships linked to subsea cable disruptions, show that maritime systems are increasingly part of the global cyber threat landscape.”</p><p>The researchers proposed four practical strategies to strengthen maritime cyber defenses and close the training gaps. Their findings were presented recently at the <a href="https://www.sigsac.org/ccs/CCS2025/call-for-papers/">ACM SIGSAC Conference on Computer and Communications Security (CCS)</a>.</p><h6>1. Make Cybersecurity Training Actually Maritime</h6><p>Many of those interviewed for the study described current cybersecurity training as “boilerplate” — generic modules that don’t reflect real shipboard risks.&nbsp;</p><p>Researchers recommend:</p><ul><li>Role-specific instruction: Navigation officers should learn to detect and identify GPS spoofing. 
Engineers should focus on vulnerabilities in remotely monitored systems.</li><li>Bridging IT and Operational Technology: Crews need to understand how attacks on IT systems can trigger physical consequences in operational technology — including collisions, groundings, or explosions.</li><li>Hands-on delivery: Replace passive PowerPoints with drills and in-person exercises that build muscle memory.</li><li>Accessible standards: Training must account for the wide range of educational backgrounds across crews and be standardized across ranks.</li></ul><h6>2. Move Beyond “Call IT”</h6><p>At sea, crews can’t simply escalate a cyber incident to a shore-based IT department and wait. Operational resilience requires onboard readiness.</p><p>Researchers recommend:</p><ul><li>Vessel-specific response plans: Ships need clear, actionable protocols for threats such as AIS jamming or radar manipulation.</li><li>Military-style drills: Adopting EMCON (emission control) exercises — used by the U.S. Military Sealift Command — can train crews to operate safely without electronic systems.</li><li>Stronger connectivity controls: High-bandwidth satellite systems like Starlink introduce new risks. Clear policies and network segregation are essential to prevent new entry points for attackers.</li></ul><blockquote><h6>Related Article: <a href="https://theconversation.com/when-gps-lies-at-sea-how-electronic-warfare-is-threatening-ships-and-their-crews-278181"><strong>When GPS lies at sea: How electronic warfare is threatening ships and their&nbsp;crews</strong></a><strong> by Anna Raymaker</strong></h6></blockquote><h6>3. Create Unified, Ship-Specific Regulations</h6><p>Maritime cybersecurity regulations are often reactive and fragmented. 
Researchers argue the industry needs a cohesive, domain-specific framework.</p><p>Key recommendations include:</p><ul><li>A unified global model: Like the energy sector’s NERC CIP standards, a maritime framework could mandate baseline controls such as encryption, network segmentation, and anonymous incident reporting.</li><li>Rules built for real crews: Regulations designed for large naval operations don’t translate well to smaller merchant or research vessels. Standards must reflect actual shipboard conditions.</li><li>Future-proofing requirements: Autonomous ships and remotely operated vessels expand the cyber-physical attack surface. Regulations must proactively address these emerging technologies.</li></ul><h6>4. Invest in Maritime-Specific Cyber Research</h6><p>Finally, the researchers stress that long-term resilience requires deeper technical research focused on maritime systems.</p><p>Priority areas include:</p><ul><li>Real-time intrusion detection systems tailored to shipboard protocols.</li><li>Proactive security risk assessments of interconnected onboard systems.</li><li>Cyber-physical modeling to better understand cascading failures in complex maritime environments.</li></ul><h6>The Bottom Line</h6><p>Cyber threats at sea are no longer hypothetical. Mariners report real-world incidents ranging from GPS spoofing to ransomware that disrupts global trade.</p><p>“Through our interviews with mariners, I saw firsthand how much dedication and pride they take in their work,” said Raymaker. “Our goal is for this research to serve as a call to action for researchers, policymakers, and industry to invest more attention in maritime cybersecurity and support the people who risk their lives every day to keep global trade, food, and energy moving.”</p><p><a href="https://dl.acm.org/doi/10.1145/3719027.3744816"><em>A Sea of Cyber Threats: Maritime Cybersecurity from the Perspective of Mariners</em></a><em>&nbsp;</em>was presented at CCS 2025. 
It was written by Raymaker and her colleagues: Ph.D. students <strong>Akshaya Kumar</strong>, <strong>Miuyin Yong Wong</strong>, and <strong>Ryan Pickren</strong>; Research Scientist <strong>Animesh Chhotaray</strong>; Associate Professors <strong>Frank Li</strong> and <strong>Saman Zonouz</strong>; and Georgia Tech Provost and Executive Vice President for Academic Affairs <strong>Raheem Beyah</strong>.</p>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1774457240</created>  <gmt_created>2026-03-25 16:47:20</gmt_created>  <changed>1774461690</changed>  <gmt_changed>2026-03-25 18:01:30</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Research from the Georgia Institute of Technology shows that commercial ship crews are often unprepared for cyberattacks due to inadequate, generic training, despite rising threats like GPS spoofing and ransomware.]]></teaser>  <type>news</type>  <sentence><![CDATA[Research from the Georgia Institute of Technology shows that commercial ship crews are often unprepared for cyberattacks due to inadequate, generic training, despite rising threats like GPS spoofing and ransomware.]]></sentence>  <summary><![CDATA[<p>Research from the Georgia Institute of Technology shows that commercial ship crews are often unprepared for cyberattacks due to inadequate, generic training, despite rising threats like GPS spoofing and ransomware. Because ships must handle incidents independently at sea, researchers recommend more practical, maritime-specific training, stronger onboard response plans, unified global cybersecurity regulations, and increased investment in ship-focused cyber research. 
These steps are critical to protecting maritime operations, which carry over 90% of global trade.</p>]]></summary>  <dateline>2026-03-25T00:00:00-04:00</dateline>  <iso_dateline>2026-03-25T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-03-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jpopham3@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Popham</p><p>Communications Officer II<br>School of Cybersecurity and Privacy</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679738</item>      </media>  <hg_media>          <item>          <nid>679738</nid>          <type>image</type>          <title><![CDATA[Cyber Navy]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[AdobeStock_1936842040.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/25/AdobeStock_1936842040.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/25/AdobeStock_1936842040.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/25/AdobeStock_1936842040.jpeg?itok=7woleQVR]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A graphic of a boat sailing across the globe with a cyber shield at its front. 
]]></image_alt>                    <created>1774461240</created>          <gmt_created>2026-03-25 17:54:00</gmt_created>          <changed>1774461240</changed>          <gmt_changed>2026-03-25 17:54:00</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="145171"><![CDATA[Cybersecurity]]></term>          <term tid="39461"><![CDATA[Manufacturing, Trade, and Logistics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689184">  <title><![CDATA[Cybersecurity and Privacy Faculty Earns Promotion and Tenure]]></title>  <uid>36253</uid>  <body><![CDATA[<p>The Georgia Institute of Technology recently announced that <strong>Frank Li</strong> has been promoted from Assistant Professor to Associate Professor and has been granted tenure.</p><p>Li, an accomplished computer security and privacy researcher, joined Georgia Tech 
in 2020 as the Institute was launching the School of Cybersecurity and Privacy (SCP). He holds a joint appointment with the School of Electrical and Computer Engineering (ECE).&nbsp;</p><p>“While tenure may be an individual's milestone, in reality, it reflects the help, support, and hard work of countless others,” Li said.</p><p>He credits his accomplishments to the ongoing mentorship and support he has received from faculty and staff at SCP, ECE, and Georgia Tech.</p><p>“I'm also extremely thankful to work with such amazing students at Georgia Tech, especially the Ph.D. students in my research lab, and the BS and MS students in my classes, who help our research efforts. Georgia Tech has been an amazing place to start my faculty career,” said Li.</p><p>Li advises five Ph.D. students at his Better Empirically Established Security (<a href="https://faculty.cc.gatech.edu/~frankli/beeslab.html">BEES</a>) lab in SCP. They take a data-driven approach to understanding how security and privacy concerns manifest in practice, and use the insights gained to drive improvements in real-world security.</p><p>Their research examines how users, security operators, and attackers behave in various security and privacy-sensitive situations, often using internet-wide measurements, network traffic analysis, user studies and experiments, and large-scale data mining.</p><p>“The tenure and promotion to associate professor rank is in recognition of the outstanding research program Frank has developed at SCP,” said <strong>Mustaque</strong> <strong>Ahamad</strong>, interim chair and Regents’ Entrepreneur.</p><p>“He is an award-winning educator. 
We look forward to his continued leadership in the important areas of usable security and network security in the future.”</p><p>Li was among nine College of Computing faculty members who received promotion and <a href="https://www.cc.gatech.edu/news/institute-announcement-recognizes-faculty-achievement-and-excellence">tenure this year</a>.</p><p>John P. Imlay Jr. Dean of Computing <strong>Vivek</strong> <strong>Sarkar</strong> emailed the College community with the good news.</p><p>“We are truly thrilled to celebrate this moment with you, as we recognize your contributions to our students and to the advancement of our College and Institute in so many ways,” he said.</p><p>In 2025, Li received the prestigious <a href="https://www.cc.gatech.edu/news/new-research-will-move-us-closer-passwordless-society">CAREER Award</a> from the National Science Foundation (NSF). His CAREER project will investigate real-world uses of FIDO2/passkeys and address security and usability issues that can arise. A goal of his research is to identify and resolve problems before they become widespread and more difficult to solve.</p>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1774456890</created>  <gmt_created>2026-03-25 16:41:30</gmt_created>  <changed>1774456962</changed>  <gmt_changed>2026-03-25 16:42:42</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Institute of Technology promoted Frank Li to associate professor with tenure, recognizing his impactful research and teaching since joining in 2020 in the School of Cybersecurity and Privacy and ECE. Li leads the BEES Lab, where he and his student]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Institute of Technology promoted Frank Li to associate professor with tenure, recognizing his impactful research and teaching since joining in 2020 in the School of Cybersecurity and Privacy and ECE. 
Li leads the BEES Lab, where he and his student]]></sentence>  <summary><![CDATA[<p><strong>Georgia Institute of Technology</strong> promoted <strong>Frank Li</strong> to associate professor with tenure, recognizing his impactful research and teaching since joining in 2020 in the School of Cybersecurity and Privacy and ECE. Li leads the BEES Lab, where he and his students use data-driven methods to study real-world security and privacy challenges, including user behavior and network activity, to improve practical systems. Praised for his leadership in usable and network security, he was also among nine faculty honored this year and received a 2025 CAREER Award from the <strong>National Science Foundation</strong> to study FIDO2/passkeys and address emerging security and usability issues.</p>]]></summary>  <dateline>2026-03-25T00:00:00-04:00</dateline>  <iso_dateline>2026-03-25T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-03-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jpopham3@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Popham&nbsp;</p><p>Communications Officer II&nbsp;School of Cybersecurity and Privacy&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679737</item>      </media>  <hg_media>          <item>          <nid>679737</nid>          <type>image</type>          <title><![CDATA[Frank-Li-Story-Graphic-web-copy.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Frank-Li-Story-Graphic-web-copy.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/25/Frank-Li-Story-Graphic-web-copy.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/25/Frank-Li-Story-Graphic-web-copy.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/25/Frank-Li-Story-Graphic-web-copy.jpg?itok=bIVE2C_Z]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A graphic showing Frank Li's promotion to associate professor. ]]></image_alt>                    <created>1774456919</created>          <gmt_created>2026-03-25 16:41:59</gmt_created>          <changed>1774456919</changed>          <gmt_changed>2026-03-25 16:41:59</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="145171"><![CDATA[Cybersecurity]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="689007">  <title><![CDATA[New Mobile App Turns Phones into At-Home Fetal Heart Monitors]]></title>  <uid>36530</uid>  <body><![CDATA[<div><p>A new mobile app will soon put the ability to monitor a baby’s prenatal heartbeat in the hands of pregnant women who may worry about their baby’s health in between doctor’s visits.&nbsp;</p></div><div><p>Studies show that one in five pregnant women experiences <a 
href="https://theconversation.com/perinatal-anxiety-one-in-five-women-experience-it-but-many-still-suffer-alone-before-or-after-childbirth-133667" rel="noreferrer noopener" target="_blank">perinatal anxiety</a>, which is characterized by intense negative thoughts about their pregnancy.&nbsp;</p></div><div><p>DopFone turns any smartphone speaker into a Doppler radar by emitting a low-pitched ultrasound and detecting reflected signals of abdominal surface vibrations caused by a fetal heartbeat.&nbsp;</p></div><div><p><a href="https://www.alexandertadams.com/" rel="noreferrer noopener" target="_blank"><strong>Alex Adams</strong></a>, an assistant professor in Georgia Tech’s School of Interactive Computing, said he came up with the idea for DopFone as he and his wife, Elise, experienced two miscarriages. At the time, she couldn’t reliably measure the fetal heart rate with a standard fetal Doppler monitor.&nbsp;</p></div><div><p>Those experiences exposed gaps in the maternal healthcare process.&nbsp;</p></div><div><p>“There are a lot of great devices in hospitals and clinics, but there’s not much outside of those venues, even for high-risk pregnancies,” Adams said. “This is about filling the gaps between checkups.”&nbsp;</p></div><div><p><a href="https://www.poojitagarg.com/" rel="noreferrer noopener" target="_blank"><strong>Poojita Garg</strong></a> joined Adams to work on DopFone while completing her master’s degree at Georgia Tech. She is now pursuing her Ph.D. at the University of Washington and is co-advised by Professor Shwetak Patel, who earned his Ph.D.
from Georgia Tech in 2008.&nbsp;</p></div><div><p>Garg is working with the University of Washington School of Medicine to conduct DopFone’s first clinical trials.&nbsp;</p></div><div><p>Garg tested DopFone on 23 patients and achieved an accuracy of plus or minus 4.9 beats per minute, well within the clinical standard of plus or minus eight beats per minute for reliable fetal heart rate measurement.&nbsp;</p></div><div><p>Adams said it measured within two beats per minute in most cases, with an error rate of less than one percent.&nbsp;</p></div><div><p>About one million pregnancies in the U.S. end in miscarriage, <a href="https://medicine.yale.edu/news-article/dr-harvey-kliman-study-finds-the-placenta-holds-answers-to-many-unexplained-pregnancy-losses/" rel="noreferrer noopener" target="_blank">according to a study from the Yale School of Medicine</a>, and doctors know little about what causes them. Adams said that number is probably higher because many go unreported.&nbsp;</p></div><div><p>Adams and Garg said it’s unclear whether the innovation could reduce the number of miscarriages. However, consistent fetal heart rate data collection outside of the doctor’s office could provide a better idea of what happens leading up to a miscarriage.&nbsp;</p></div><div><p>“From there, we can take preventative action,” Adams said. “If nothing else, we can give a sense of comfort to those who may be worried.”&nbsp;</p></div><div><p><strong>Expanding Access</strong>&nbsp;</p></div><div><p>While couples can purchase portable fetal heart rate monitors, Adams and Garg see DopFone as a low-cost alternative for those who live in areas with limited or inaccessible healthcare systems.&nbsp;&nbsp;</p></div><div><p>“There’s a lot of potential for using it in what doctors like to call maternity deserts,” Garg said. “These are areas where a pregnant person, at the time of delivery, would have to travel long distances to reach a hospital.
This technology will be useful globally in underdeveloped areas of the world.”&nbsp;</p></div><div><p>The researchers also mentioned that external add-ons and attachments aren’t part of their design goals. They prefer to rely on the phone’s built-in features to keep the technology accessible.&nbsp;</p></div><div><p>“The real value is that 96% of America already has the technology in their pocket, along with 60% of the world’s population,” Adams said. “Half of the battle is having the right tools. The more we can get from what’s already in the phone, the more we can guarantee people have access to it.”&nbsp;</p></div><div><p><strong>Not a Substitute</strong>&nbsp;</p></div><div><p>Some patients may feel a constant need to check their unborn child’s heart rate, and Garg acknowledged that a tool like DopFone could increase that anxiety. She and Adams said a future version of the app will tell the parent if the heart rate is within a healthy range.&nbsp;</p></div><div><p>“There’s a lot of tradeoffs between a tool that could provide reassurance or create anxiety,” she said. 
“We want the use of this tool to be recommended by a doctor and for doctors and their care teams to be kept in the loop.”&nbsp;</p></div><div><p>She also said DopFone is not meant to replace anything that is done in a clinic.&nbsp;</p></div><div><p>“There are devices that make the whole process possible at home, but this is something that should be done in a clinic, so that’s the line we want to draw,” she said.&nbsp;&nbsp;</p></div>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1773840199</created>  <gmt_created>2026-03-18 13:23:19</gmt_created>  <changed>1774271766</changed>  <gmt_changed>2026-03-23 13:16:06</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A new app will allow pregnant women to conduct an ultrasound and receive an accurate fetal heart rate from their mobile phones.]]></teaser>  <type>news</type>  <sentence><![CDATA[A new app will allow pregnant women to conduct an ultrasound and receive an accurate fetal heart rate from their mobile phones.]]></sentence>  <summary><![CDATA[<p>DopFone uses smartphone speakers to emit a low-pitched ultrasound that detects reflected signals of abdominal surface vibrations caused by fetal cardiac activity.</p><p><a href="https://www.alexandertadams.com/"><strong>Alex Adams</strong></a>, an assistant professor in Georgia Tech’s School of Interactive Computing, said he came up with the idea for DopFone as he and his wife, Elise, suffered through two miscarriages.</p><p><a href="https://www.poojitagarg.com/"><strong>Poojita Garg</strong></a> joined Adams to work on DopFone while completing her master’s at Georgia Tech. She is now pursuing her Ph.D. at the University of Washington and is co-advised by Professor Shwetak Patel, who earned his Ph.D.
from Georgia Tech in 2008.</p><p>Garg is working with the University of Washington School of Medicine to conduct DopFone’s first clinical trials.</p><p>Garg tested DopFone on 23 patients and achieved an accuracy of plus or minus 4.9 beats per minute, well within the clinical standard of plus or minus 8 beats per minute for reliable fetal heart rate measurement.</p>]]></summary>  <dateline>2026-03-18T00:00:00-04:00</dateline>  <iso_dateline>2026-03-18T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-03-18 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679666</item>      </media>  <hg_media>          <item>          <nid>679666</nid>          <type>image</type>          <title><![CDATA[DopFone-PR-Photo-with-blur.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[DopFone-PR-Photo-with-blur.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/18/DopFone-PR-Photo-with-blur.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/18/DopFone-PR-Photo-with-blur.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/18/DopFone-PR-Photo-with-blur.jpg?itok=onZXN-9m]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Woman holds mobile phone to the belly of a pregnant woman]]></image_alt>                    <created>1773840209</created>          <gmt_created>2026-03-18 13:23:29</gmt_created>          <changed>1773840209</changed>          <gmt_changed>2026-03-18 13:23:29</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group 
id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="181431"><![CDATA[maternal]]></keyword>          <keyword tid="7677"><![CDATA[ultrasound]]></keyword>          <keyword tid="34741"><![CDATA[mobile app]]></keyword>          <keyword tid="29561"><![CDATA[pregnancy]]></keyword>          <keyword tid="190383"><![CDATA[pregnant women]]></keyword>          <keyword tid="168908"><![CDATA[smartphone]]></keyword>          <keyword tid="188420"><![CDATA[babies]]></keyword>          <keyword tid="178046"><![CDATA[fetal monitoring]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71891"><![CDATA[Health and Medicine]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="688391">  <title><![CDATA[Robot Pollinator Could Produce More, Better Crops for Indoor Farms]]></title>  <uid>36530</uid>  <body><![CDATA[<p>A new robot could solve one of the biggest challenges facing indoor farmers: manual pollination.</p><p>Indoor farms, also known as vertical farms, are popular among agricultural researchers and are expanding across the agricultural industry. 
Some benefits they have over outdoor farms include:</p><ul><li>Year-round production of food crops</li><li>Lower water and land requirements</li><li>No need for pesticides</li><li>Reduced carbon emissions from shipping</li><li>Reduced food waste</li></ul><p>Additionally,&nbsp;<a href="https://www.agritecture.com/blog/2021/7/20/5-ways-vertical-farming-is-improving-nutrition"><strong>some studies</strong></a> indicate that indoor farms produce more nutritious food for urban communities.&nbsp;</p><p>However, these farms are often inaccessible to birds, bees, and other natural pollinators, leaving the pollination process to humans. The tedious process must be completed by hand for each flower to ensure the indoor crop flourishes.</p><p><a href="https://research.gatech.edu/people/ai-ping-hu"><strong>Ai-Ping Hu</strong></a>, a principal research engineer at the Georgia Tech Research Institute (GTRI), has spent years exploring methods to efficiently pollinate flowering plants and food crops in indoor farms.</p><p>Hu,&nbsp;<a href="https://research.gatech.edu/people/shreyas-kousik"><strong>Assistant Professor Shreyas Kousik of the George W. Woodruff School of Mechanical Engineering</strong></a>, and a rotating group of student interns have developed a robot prototype that may be up to the task.</p><p>The robot can efficiently pollinate plants that have both male and female reproductive parts. These plants only require pollen to be transferred from one part to the other rather than externally from another flower.</p><p>Natural pollinators perform this task outdoors, but Hu said indoor farmers often use a paintbrush or electric toothbrush to ensure these flowers are pollinated.&nbsp;</p><h4><strong>Knowing the Pose</strong></h4><p>An early challenge the research team addressed was teaching the robot to identify the “pose” of each flower. 
Pose refers to a flower’s orientation, shape, and symmetry. Knowing these details ensures precise delivery of the pollen to maximize reproductive success.&nbsp;</p><p>“It’s crucial to know exactly which way the flowers are facing,” Hu said.</p><p>“You want to approach the flower from the front because that’s where all the biological structures are. Knowing the pose tells you where the stem is. Our device grasps the stem and shakes it to dislodge the pollen.</p><p>“Every flower is going to have its own pose, and you need to know what that is within at least 10 degrees.”</p><h4><strong>Computer Vision Breakthrough</strong></h4><p><strong>Harsh Muriki</strong>, a robotics master’s student in Georgia Tech’s School of Interactive Computing, used computer vision to solve the pose problem while interning with Hu at GTRI.</p><p>Muriki attached a camera to a FarmBot to capture images of strawberry plants from dozens of angles in a small garden in front of Georgia Tech’s Food Processing Technology Building. The&nbsp;<a href="https://farm.bot/?srsltid=AfmBOoqh1Z8vSs3WflZisgw5DsOUSo8shD4VtY0Y8_VmVpVyt0Iwalxo"><strong>FarmBot</strong></a> is an XYZ-axis robot that waters and sprays pesticides on outdoor gardens, though it is not capable of pollination.</p><p>“We reconstruct the images of the flower into a 3D model and use a technique that converts the 3D model into multiple 2D images with depth information,” Muriki said. “This enables us to send them to object detectors.”</p><p>Muriki said he used a real-time object detection system called YOLO (You Only Look Once) to classify objects. YOLO is known for identifying and classifying objects in a single pass.</p><p><strong>Ved Sengupta</strong>, a computer engineering major who interned with Muriki, fine-tuned the algorithms that converted 3D images into 2D.</p><p>“This was a crucial part of making robot pollination possible,” Sengupta said. 
“There is a big gap between 3D and 2D image processing.</p><p>“There’s not a lot of data on the internet for 3D object detection, but there’s a ton for 2D. We were able to get great results from the converted images, and I think any sector of technology can take advantage of that.”</p><p>Sengupta, Muriki, and Hu co-authored a paper about their work that was accepted to the 2025 International Conference on Robotics and Automation (ICRA) in Atlanta.</p><h4><strong>Measuring Success</strong></h4><p>The pollination robot, built in Kousik’s Safe Robotics Lab, is now in the prototype phase.&nbsp;</p><p>Hu said the robot can do more than pollinate. It can also analyze each flower to determine how well it was pollinated and whether the chances for reproduction are high.</p><p>“It has an additional capability of microscopic inspection,” Hu said. “It’s the first device we know of that provides visual feedback on how well a flower was pollinated.”</p><p>For more information about the robot, visit the&nbsp;<a href="https://saferoboticslab.me.gatech.edu/research/towards-robotic-pollination/"><strong>Safe Robotics Lab project page</strong></a>.</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1771527492</created>  <gmt_created>2026-02-19 18:58:12</gmt_created>  <changed>1774011241</changed>  <gmt_changed>2026-03-20 12:54:01</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A research team spanning GTRI, the College of Engineering, and the College of Computing has developed a robot capable of pollinating flowers in indoor farms.]]></teaser>  <type>news</type>  <sentence><![CDATA[A research team spanning GTRI, the College of Engineering, and the College of Computing has developed a robot capable of pollinating flowers in indoor farms.]]></sentence>  <summary><![CDATA[<p>Manual pollination is one of the biggest challenges for indoor farmers. 
These farms are often inaccessible to birds, bees, and other natural pollinators, leaving the pollination process to humans. The tedious process must be completed by hand for each flower to ensure the indoor crop flourishes.</p><p>A Georgia Tech research team led by Ai-Ping Hu and Shreyas Kousik is working to solve that. A robot they've developed can efficiently pollinate plants that have both male and female reproductive parts. These plants only require pollen to be transferred from one part to the other rather than externally from another flower.</p>]]></summary>  <dateline>2026-02-19T00:00:00-05:00</dateline>  <iso_dateline>2026-02-19T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-02-19 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:ndeen6@gatech.edu">Nathan Deen</a><br>College of Computing<br>Georgia Tech</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679370</item>      </media>  <hg_media>          <item>          <nid>679370</nid>          <type>image</type>          <title><![CDATA[Harsh-Muriki_86A0006.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Harsh-Muriki_86A0006.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/02/19/Harsh-Muriki_86A0006.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/02/19/Harsh-Muriki_86A0006.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/02/19/Harsh-Muriki_86A0006.jpg?itok=WJg8YQi9]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Harsh Muriki]]></image_alt>                    <created>1771527500</created>          <gmt_created>2026-02-19 
18:58:20</gmt_created>          <changed>1771527500</changed>          <gmt_changed>2026-02-19 18:58:20</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="187991"><![CDATA[go-robotics]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="11506"><![CDATA[computer vision]]></keyword>          <keyword tid="180840"><![CDATA[computer vision systems]]></keyword>          <keyword tid="669"><![CDATA[agriculture]]></keyword>          <keyword tid="194392"><![CDATA[AI in Agriculture]]></keyword>          <keyword tid="170254"><![CDATA[urban gardening]]></keyword>          <keyword tid="94111"><![CDATA[farming]]></keyword>          <keyword tid="14913"><![CDATA[urban farming]]></keyword>          <keyword tid="23911"><![CDATA[bees]]></keyword>          <keyword tid="6660"><![CDATA[flowers]]></keyword>       
   <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="193653"><![CDATA[Georgia Tech Research Institute]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71911"><![CDATA[Earth and Environment]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="688478">  <title><![CDATA[Student Getting Research Boost Through Google Ph.D. Fellowship]]></title>  <uid>36530</uid>  <body><![CDATA[<p>A Georgia Tech Ph.D. candidate is getting a boost to his research into developing more efficient multi-tasking artificial intelligence (AI) models without fine-tuning.</p><p>George Stoica is one of 38 Ph.D. students worldwide researching machine learning who were named a<a href="https://research.google/programs-and-events/phd-fellowship/recipients/"><strong> 2025 Google Ph.D. Fellow</strong></a>.</p><p>Stoica is designing AI training methods that bypass fine-tuning, which is the process of adapting a large pre-trained model to perform new tasks. Fine-tuning is one of the most common ways engineers update large language models like ChatGPT, Gemini, and Claude to add new capabilities.&nbsp;</p><p>If an AI company wants to give a model a new capability, it could create a new model from scratch for that specific purpose. However, if the model already has relevant training and knowledge of the new task, fine-tuning is cheaper.</p><p>Stoica argues that fine-tuning still uses large amounts of data, and that other methods can help models learn more effectively and efficiently.</p><p>“Full fine-tuning yields strong performance, but it can be costly, and it risks catastrophic forgetting,” Stoica said. 
“My research asks if we can extend a model’s capabilities by imbuing it with the expertise of others, without fine-tuning.</p><p>“Reducing cost and improving efficiency is more important than ever. We have so many publicly available models that have been trained to solve a variety of tasks. It’s redundant to train a new model from scratch. It’s much more efficient to leverage the information that already exists to get a model up to speed.”</p><p>Stoica said the solution is a cost-effective method called model merging. This method combines two or more AI models into a single model, improving performance without fine-tuning.</p><p>On a basic level, Stoica said an example would be combining a model that is good at classifying cats with one that is good at classifying dogs.</p><p>“Merging is cheap because you just take the parameters, the weights of your existing models, and combine them,” he said. “You could take the average of the weights to create a new model, but that sometimes doesn’t work. My work has aimed to rearrange the weights so they can communicate easily with each other.”</p><p>Through his Google fellowship, Stoica seeks to apply model merging to create a cutting-edge vision encoder. A vision encoder converts image or video data into numerical representations that computers can understand. This enables tasks such as image or facial recognition and generative image captioning.</p><p>“I want to be at the frontier of the field, and Google is clearly part of that,” Stoica said. “The vision encoder is very large-scale, and Google has the infrastructure to accommodate it.”</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1771868634</created>  <gmt_created>2026-02-23 17:43:54</gmt_created>  <changed>1774011185</changed>  <gmt_changed>2026-03-20 12:53:05</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[George Stoica is one of 38 Ph.D. students worldwide researching machine learning who were named a 2025 Google Ph.D. 
Fellow.]]></teaser>  <type>news</type>  <sentence><![CDATA[George Stoica is one of 38 Ph.D. students worldwide researching machine learning who were named a 2025 Google Ph.D. Fellow.]]></sentence>  <summary><![CDATA[<p>George Stoica is one of 38 Ph.D. students worldwide researching machine learning who were named a<a href="https://research.google/programs-and-events/phd-fellowship/recipients/"><strong> 2025 Google Ph.D. Fellow</strong></a>.</p><p>Stoica is designing AI training methods that bypass fine-tuning, which is the process of adapting a large pre-trained model to perform new tasks. Fine-tuning is one of the most common ways engineers update large language models like ChatGPT, Gemini, and Claude to add new capabilities.&nbsp;</p>]]></summary>  <dateline>2026-02-23T00:00:00-05:00</dateline>  <iso_dateline>2026-02-23T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-02-23 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679394</item>      </media>  <hg_media>          <item>          <nid>679394</nid>          <type>image</type>          <title><![CDATA[IMG_2942-copy-2.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[IMG_2942-copy-2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/02/23/IMG_2942-copy-2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/02/23/IMG_2942-copy-2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/02/23/IMG_2942-copy-2.jpg?itok=uDAIb90H]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[George Stoica]]></image_alt>                    
<created>1771868657</created>          <gmt_created>2026-02-23 17:44:17</gmt_created>          <changed>1771868657</changed>          <gmt_changed>2026-02-23 17:44:17</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>      </news_terms>  <keywords>          <keyword tid="3165"><![CDATA[google]]></keyword>          <keyword tid="9143"><![CDATA[Graduate Research Fellowship]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="688487">  <title><![CDATA[New Study Could Show How TikTok’s Algorithm Affects Youth Mental Health]]></title>  <uid>36530</uid>  <body><![CDATA[<div><div><p>Meta CEO Mark Zuckerberg&nbsp;<a href="https://www.latimes.com/california/story/2026-02-18/mark-zuckerberg-tesimony-la-social-media-trial?utm_source=chatgpt.com"><strong>took the witness stand</strong></a> last week in Los 
Angeles County Superior Court to defend his company from accusations that social media harms children.</p><p>A lawsuit filed by a 20-year-old plaintiff alleges Instagram and other social media apps are designed to make young users addicted to their platforms.</p><p>Meanwhile, social media experts believe the algorithms that drive content on these platforms play a role in hooking users and keeping them scrolling for extensive periods of time.</p><p>A new study led by Georgia Tech might confirm this suspicion.</p><p>Using recently acquired data from more than 10,000 adolescent users,&nbsp;<a href="http://www.munmund.net/"><strong>Munmun De Choudhury</strong></a> will audit TikTok’s recommendation algorithm and study its impact on young people’s behavior and mental health.</p><p>De Choudhury is leading a multi-institutional research team on a four-year, $1.7 million grant from the Huo Family Foundation.</p><p>“We hope to learn the different types of negative exposures that young people experience when using TikTok,” De Choudhury said. “This can help us characterize what they’re watching and build computational methods to understand the consumption behaviors of these participants and how they’re affected by the algorithm.”</p><p>De Choudhury, a professor in Georgia Tech’s School of Interactive Computing, is collaborating with Amy Orben, a professor at the University of Cambridge, and Homa Hosseinmardi, an assistant professor at UCLA, on the project.</p><p>Social media platforms have become increasingly reluctant to share their data in recent years, posing a challenge for researchers like De Choudhury.</p><p>“We can’t do the type of studies we did 10 years ago with X (formerly Twitter) because the API is much more restrictive,” she said. “There are limited ways to programmatically access people’s data now.</p><p>“We must go through a tedious, manual process to get around declining access to social media data. 
This data-gathering process is essential given the sensitive nature of mental health research. You want data that is shared with consent.”</p><p>Orben collected TikTok data from more than 10,000 young people in the UK who consented to provide their personal data archives in accordance with the European Union’s General Data Protection Regulation (GDPR).</p><p>The collected data includes watch histories, which De Choudhury said distinguishes this research from other social media studies that focus on what users post.</p><p>“We don’t understand passive social media consumption very well, so we hope to close that gap and learn what that looks like,” she said. “That could complement or contrast what we know about people’s active engagement on these platforms. Is what they’re consuming directly related to what they’re posting? How does passive consumption affect young people’s mental health?”</p><p>A clearer picture of how algorithm-based content affects young people could result in design interventions to minimize negative effects. De Choudhury said studying data from young people is critical because it’s not too late to steer them away from unhealthy behavioral patterns.</p><p>“Some of the earliest signs or symptoms of mental health conditions appear in adolescence,” she said. “If appropriate care and support are provided, maybe it’s possible to prevent these symptoms from becoming full-blown in the future.”</p><h4><strong>Beyond TikTok</strong></h4><p>What the research team learns about TikTok could also provide broader insight into other social media platforms.</p><p>TikTok has been influential in how social media platforms display video content. Competitors like Instagram and X modeled their video presentation after TikTok’s, which can easily lead to doomscrolling.</p><p>“Our hope is that our findings can be generalized, with the caveat the data we have is exclusively from TikTok,” De Choudhury said. 
“Other platforms have similar video-sharing and consumption features where the video automatically plays from one to the next. We hope what we learn from TikTok will be applicable to people’s activities elsewhere, though it will require future work beyond this project to draw concrete conclusions.”</p><h4><strong>Simulating Feeds with AI</strong></h4><p>De Choudhury said an additional part of the study will be using artificial intelligence (AI) to simulate video feeds.</p><p>In 2024, Hosseinmardi led a study at the University of Pennsylvania on YouTube’s recommendation algorithm and used bots that either followed or ignored the recommendations.</p><p>De Choudhury said they will use a similar method for TikTok.</p><p>“The feeds will be realistic but generated by AI to see the potential pathways to consumption rabbit holes,” she said. “This should give us some insight into how algorithms influence the negative and positive exposures people might be having on TikTok.”</p><h4><strong>Foundation Expands Reach</strong></h4><p>Based in the UK and established in 2009, the Huo Family Foundation supports community education initiatives in the UK, the U.S., and China.</p><p>The organization announced in January its launch of the Huo Family Foundation Science Programme.&nbsp;<a href="https://huofamilyfoundation.org/news/updates/huo-family-foundation-awards-17-6m-for-groundbreaking-research/"><strong>The new program is committing $17.6 million to fund 20 new multi-year research grants</strong></a> that explore the impact of digital technology on the brain development, social behavior, and mental health of young people.</p><p>“Digital technology is profoundly shaping childhood and young adulthood, yet there is limited causal evidence of its effects,”&nbsp;said Yan Huo, founder of the Huo Family Foundation, in a press release.&nbsp;“We are proud to support exceptional researchers advancing vital scientific understanding.”</p></div></div>]]></body>  <author>Nathan Deen</author>  
<status>1</status>  <created>1771943368</created>  <gmt_created>2026-02-24 14:29:28</gmt_created>  <changed>1774011172</changed>  <gmt_changed>2026-03-20 12:52:52</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A Georgia Tech-led research team is conducting a multi-year study using data from more than 10,000 adolescents to investigate how TikTok’s recommendation algorithm and passive content consumption impact youth mental health.]]></teaser>  <type>news</type>  <sentence><![CDATA[A Georgia Tech-led research team is conducting a multi-year study using data from more than 10,000 adolescents to investigate how TikTok’s recommendation algorithm and passive content consumption impact youth mental health.]]></sentence>  <summary><![CDATA[<div><div dir="ltr"><p>Led by Georgia Tech professor Munmun De Choudhury, a multi-institutional research team is launching a $1.7 million study to examine how TikTok’s recommendation algorithm influences the mental health of adolescent users. The project focuses on passive consumption by analyzing the watch histories of over 10,000 young participants and using AI to simulate content "rabbit holes." 
By identifying patterns of negative exposure, the researchers aim to develop design interventions that can steer teenagers away from unhealthy behavioral patterns and support early mental health care.</p></div></div>]]></summary>  <dateline>2026-02-24T00:00:00-05:00</dateline>  <iso_dateline>2026-02-24T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-02-24 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679406</item>      </media>  <hg_media>          <item>          <nid>679406</nid>          <type>image</type>          <title><![CDATA[208A9267-2.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[208A9267-2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/02/24/208A9267-2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/02/24/208A9267-2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/02/24/208A9267-2.jpg?itok=EzUbj3qp]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Munmun De Choudhury]]></image_alt>                    <created>1771943377</created>          <gmt_created>2026-02-24 14:29:37</gmt_created>          <changed>1771943377</changed>          <gmt_changed>2026-02-24 14:29:37</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category 
tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="143"><![CDATA[Digital Media and Entertainment]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="143"><![CDATA[Digital Media and Entertainment]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="167543"><![CDATA[social media]]></keyword>          <keyword tid="190947"><![CDATA[tiktok]]></keyword>          <keyword tid="10343"><![CDATA[mental health]]></keyword>          <keyword tid="10824"><![CDATA[Children And Adolescents]]></keyword>          <keyword tid="5660"><![CDATA[algorithms]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71901"><![CDATA[Society and Culture]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="688516">  <title><![CDATA[ Is This Your AI? Researchers Crack AI Blackbox]]></title>  <uid>36253</uid>  <body><![CDATA[<div><div><p>Artificial intelligence (AI) systems power everything from chatbots to security cameras, yet many of the most advanced models operate as “black boxes.” Companies can use them, but outsiders can’t see how they were built, where they came from, or whether they contain hidden flaws.</p><p>This lack of transparency creates real risks. 
A model could contain security vulnerabilities or hidden backdoors. It could also be a lightly modified version of an open-source system — repackaged in violation of its license — with no easy way to prove it.</p><p>Researchers at the Georgia Institute of Technology have developed a new framework, ZEN, to help solve this problem. The tool can recover a model’s unique “fingerprint” directly from its memory, allowing experts to trace its origins and reconstruct how it was assembled.</p><p>“Analyzing a proprietary AI model without identifying where it came from and how it is constructed is like trying to fix a car engine with the hood welded shut,” said <a href="https://davidoygenblik.github.io/"><strong>David Oygenblik</strong></a>, a Ph.D. student at Georgia Tech and the study’s lead author.</p><p>“ZEN not only X-rays the engine but also provides the complete wiring diagram.”</p><p>ZEN works by taking a snapshot of a running AI system and extracting information about both its mathematical structure and the code that defines it. It compares that fingerprint against a database of known open-source models to determine the system’s origin.</p><p>If it finds a match, ZEN identifies the exact changes and generates software patches that allow investigators to recreate a working replica of the proprietary model for testing.</p><p>That capability has major implications for both security and intellectual property protection.</p><p>“With ZEN, a security analyst can finally test a black-box model for hidden backdoors, and a company can gather concrete evidence to prove its software license was infringed,” Oygenblik said.</p><p>To evaluate the system, the research team tested ZEN on 21 state-of-the-art AI models, including Llama 3, YOLOv10, and other well-known systems.</p><p>ZEN correctly traced every customized model back to its original open-source foundation — achieving 100% attribution accuracy. 
Even when models had been heavily modified — differing by more than 83% from their original versions — ZEN successfully identified the changes and enabled full reconstruction for security testing.</p><p>The researchers will present their findings at the 2026 <a href="https://www.ndss-symposium.org/">Network and Distributed System Security (NDSS) Symposium</a>. The paper, <a href="https://www.ndss-symposium.org/ndss-paper/achieving-zen-combining-mathematical-and-programmatic-deep-learning-model-representations-for-attribution-and-reuse/"><em>Achieving Zen: Combining Mathematical and Programmatic Deep Learning Model Representations for Attribution and Reuse</em></a>, was authored by Oygenblik, master’s student <strong>Dinko Dermendzhiev</strong>, Ph.D. students <strong>Filippos Sofias</strong>, <strong>Mingxuan Yao</strong>, <strong>Haichuan Xu</strong>, and <strong>Runze Zhang</strong>, postdoctoral scholars <strong>Jeman Park</strong> and <strong>Amit Kumar Sikder</strong>, as well as Associate Professor <strong>Brendan Saltaformaggio</strong>.</p></div></div>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1772040800</created>  <gmt_created>2026-02-25 17:33:20</gmt_created>  <changed>1774011162</changed>  <gmt_changed>2026-03-20 12:52:42</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have developed a technique to identify the origins of proprietary “black-box” AI models, even when their internal structure and training data are hidden.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have developed a technique to identify the origins of proprietary “black-box” AI models, even when their internal structure and training data are hidden.]]></sentence>  <summary><![CDATA[<div><div><div><div><div><div><p>Researchers have developed a technique to identify the origins of proprietary “black-box” AI models, even when their internal structure and training data are hidden. 
Because many commercial AI systems cannot be externally inspected, it is difficult to detect security vulnerabilities, intellectual property theft, licensing violations, or trace a model’s lineage. The new approach enables researchers to attribute models, determine whether one was derived from another, and identify potential misuse of protected data. By improving transparency and enabling verification of model provenance, the work strengthens accountability and trust in AI systems.</p></div></div></div></div></div></div>]]></summary>  <dateline>2026-02-25T00:00:00-05:00</dateline>  <iso_dateline>2026-02-25T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-02-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jpopham3@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Popham</p><p>Communications Officer II&nbsp;School of Cybersecurity and Privacy&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679429</item>      </media>  <hg_media>          <item>          <nid>679429</nid>          <type>image</type>          <title><![CDATA[Is-this-your-AI.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Is-this-your-AI.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/02/25/Is-this-your-AI.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/02/25/Is-this-your-AI.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/02/25/Is-this-your-AI.jpg?itok=6Ayh_YfB]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A graphic showing an AI model in an outstretched hand. 
]]></image_alt>                    <created>1772040810</created>          <gmt_created>2026-02-25 17:33:30</gmt_created>          <changed>1772040810</changed>          <gmt_changed>2026-02-25 17:33:30</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.ndss-symposium.org/wp-content/uploads/2026-s1628-paper.pdf]]></url>        <title><![CDATA[Read the Paper]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="2835"><![CDATA[ai]]></keyword>          <keyword tid="193860"><![CDATA[Artifical Intelligence]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="365"><![CDATA[Research]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="145171"><![CDATA[Cybersecurity]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="688223">  <title><![CDATA[Department of Energy Award to Power Nuclear Research With Machine Learning]]></title>  <uid>36319</uid>  
<body><![CDATA[<p>The future of clean energy depends on algorithms as much as it does atoms.</p><p>Georgia Tech’s&nbsp;<a href="https://cse.gatech.edu/people/qi-tang"><strong>Qi Tang</strong></a> is building machine learning (ML) models to accelerate nuclear fusion research, making it more affordable and more accurate. Backed by a grant from the U.S. Department of Energy (DOE), Tang’s work brings clean, sustainable energy closer to reality.</p><p>Tang has received an&nbsp;<a href="https://science.osti.gov/early-career"><strong>Early Career Research Program (ECRP) award</strong></a> from the DOE Office of Science. The grant supports Tang with $875,000 disbursed over five years to craft ML and data processing tools that help scientists analyze massive datasets from nuclear experiments and simulations.</p><p>Tang is the first faculty member from Georgia Tech’s College of Computing and School of Computational Science and Engineering (CSE) to receive the ECRP. He is the seventh Georgia Tech researcher to earn the award and the only GT awardee among this year’s 99 recipients.</p><p>More than a milestone, the award reflects a shift in how nuclear research is done. Today, progress depends on computing and data science as much as on physics and engineering.</p><p>“I am honored and excited to receive the ECRP award through DOE’s Advanced Scientific Computing Research program, an organization I care about deeply,” said Tang, an assistant professor in the School of CSE.&nbsp;</p><p>“I am grateful to my former colleagues at Los Alamos National Laboratory and collaborators at other national laboratories, including Lawrence Livermore, Sandia, and Argonne. I am also thankful for my Ph.D. 
students at Georgia Tech, whose dedication and creativity make this award possible.”</p><p>[Related:&nbsp;<a href="https://www.cc.gatech.edu/news/new-faculty-applies-high-performance-computing-scientific-machine-learning-interests-studies"><strong>New Faculty Applies High-Performance Computing, Scientific Machine Learning Interests to Studies in Plasma Physics</strong></a>]</p><p>A problem in nuclear research is that fusion simulations are challenging to understand and use. These simulations generate enormous datasets that are too large to store, move, and analyze efficiently.</p><p><a href="https://pamspublic.science.energy.gov/WebPAMSExternal/Interface/Common/ViewPublicAbstract.aspx?rv=a756f612-3409-44b8-89ea-7421bf0840e5&amp;rtc=24&amp;PRoleId=10"><strong>In his ECRP proposal to DOE</strong></a>, Tang introduced new ML methods to improve the analysis and storage of particle data.</p><p>Tang’s approach balances shrinking data, making it easier to store and transfer, against preserving its most important scientific features. His multiscale ML models are informed by physics, so the reduced data still reflects how fusion systems really behave.</p><p>With Tang’s research, scientists can run larger, more realistic fusion models and analyze results more quickly. This accelerates progress toward practical fusion energy.</p><p>“In contrast to generic black-box-type compression tools, we aim at preserving the intrinsic structures of the particle dataset during the data reduction processes,” Tang said.&nbsp;</p><p>“Taking this approach, we can meet our goal of achieving high-fidelity preservation of critical physics with minimum loss of information.”</p><p>Computing is essential in modern research because of the amount of data produced and captured from experiments and simulations. In the era of exascale supercomputers, data movement is a greater bottleneck than actual computation.</p><p>DOE operates three of the world’s four exascale supercomputers. 
These machines can calculate one quintillion (a billion billion) operations per second.</p><p>The exascale era began in 2022 with the launch of Frontier at Oak Ridge National Laboratory. Aurora followed in 2023 at Argonne National Laboratory. El Capitan arrived in 2024 at Lawrence Livermore National Laboratory.</p><p>With Tang’s data reduction approaches, all of DOE’s supercomputers can spend more time on science and less time waiting for data transfers.</p><p>“Qi’s work in computational plasma physics and nuclear fusion modeling has been groundbreaking,” said <strong>Haesun Park</strong>, Regents’ Professor and Chair of the School of CSE.&nbsp;</p><p>“We are proud of Qi and what this award means for him, Georgia Tech, and the Department of Energy toward leveraging computation to solve challenges in science and engineering, such as sustainable energy."</p><p>&nbsp;</p><h6><strong>Previous Georgia Tech recipients of DOE Early Career Research Program awards include:</strong></h6><p><a href="https://www.gatech.edu/news/2024/09/26/doe-recognizes-georgia-tech-researchers-prestigious-early-career-awards"><strong>Itamar Kimchi</strong></a>, assistant professor, School of Physics</p><p><a href="https://www.gatech.edu/news/2024/09/26/doe-recognizes-georgia-tech-researchers-prestigious-early-career-awards"><strong>Sourabh Saha</strong></a>, assistant professor, George W. Woodruff School of Mechanical Engineering</p><p><a href="https://cos.gatech.edu/news/wenjing-liao-awarded-doe-early-career-award-model-simplification-deep-learning"><strong>Wenjing Liao</strong></a>, associate professor, School of Mathematics</p><p><a href="https://chbe.gatech.edu/news/2018/06/professor-lively-receives-does-early-career-award"><strong>Ryan Lively</strong></a>, Thomas C. 
DeLoach Professor, School of Chemical &amp; Biomolecular Engineering</p><p><a href="https://www.mse.gatech.edu/people/josh-kacher"><strong>Josh Kacher</strong></a>, associate professor, School of Materials Science and Engineering</p><p><a href="https://khabar.com/community-newsmakers/devesh-ranjan-receives-early-career-award-from-u-s-department-of-energy/"><strong>Devesh Ranjan</strong></a>, Eugene C. Gwaltney Jr. School Chair and professor, Woodruff School of Mechanical Engineering</p>]]></body>  <author>Bryant Wine</author>  <status>1</status>  <created>1770909115</created>  <gmt_created>2026-02-12 15:11:55</gmt_created>  <changed>1774011151</changed>  <gmt_changed>2026-03-20 12:52:31</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech's Qi Tang has received an Early Career Research Program award from the Department of Energy's Office of Science. The $875,000 grant supports Tang for five years to craft ML tools that analyze data from nuclear experiments and simulations. ]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech's Qi Tang has received an Early Career Research Program award from the Department of Energy's Office of Science. The $875,000 grant supports Tang for five years to craft ML tools that analyze data from nuclear experiments and simulations. ]]></sentence>  <summary><![CDATA[<p>Georgia Tech’s&nbsp;<a href="https://cse.gatech.edu/people/qi-tang">Qi Tang</a> is building machine learning (ML) models to accelerate nuclear fusion research, making it more affordable and more accurate. Backed by a grant from the U.S. Department of Energy (DOE), Tang’s work brings clean, sustainable energy closer to reality.</p><p>Tang has received an&nbsp;<a href="https://science.osti.gov/early-career">Early Career Research Program (ECRP) award</a> from the DOE Office of Science. 
The grant supports Tang with $875,000 disbursed over five years to craft ML and data processing tools that help scientists analyze massive datasets from nuclear experiments and simulations.</p><p>Tang is the first faculty member from Georgia Tech’s College of Computing and School of Computational Science and Engineering (CSE) to receive the ECRP. He is the seventh Georgia Tech researcher to earn the award and the only GT awardee among this year’s 99 recipients.</p>]]></summary>  <dateline>2026-02-12T00:00:00-05:00</dateline>  <iso_dateline>2026-02-12T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-02-12 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Bryant Wine, Communications Officer<br><a href="mailto:bryant.wine@cc.gatech.edu">bryant.wine@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679267</item>      </media>  <hg_media>          <item>          <nid>679267</nid>          <type>image</type>          <title><![CDATA[Qi-TangStory-Cover.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Qi-TangStory-Cover.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/02/12/Qi-TangStory-Cover.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/02/12/Qi-TangStory-Cover.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/02/12/Qi-TangStory-Cover.jpg?itok=b0qDlm0w]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[DOE ECRP Qi Tang]]></image_alt>                    <created>1770909124</created>          <gmt_created>2026-02-12 15:12:04</gmt_created>          <changed>1770909124</changed>          
<gmt_changed>2026-02-12 15:12:04</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.cc.gatech.edu/news/department-energy-award-power-nuclear-research-machine-learning]]></url>        <title><![CDATA[Department of Energy Award to Power Nuclear Research with Machine Learning]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50877"><![CDATA[School of Computational Science and Engineering]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="144"><![CDATA[Energy]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="144"><![CDATA[Energy]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="166983"><![CDATA[School of Computational Science and Engineering]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="9167"><![CDATA[machine learning]]></keyword>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword 
tid="663"><![CDATA[Department of Energy]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="39531"><![CDATA[Energy and Sustainable Infrastructure]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="688648">  <title><![CDATA[New ‘Touchable Sound’ Museum Display Makes Data More Accessible]]></title>  <uid>36530</uid>  <body><![CDATA[<p>Blind and low vision (BLV) people may soon have access to and more easily understand scientific data in museum exhibits through new “touchable sound” displays.</p><p>Associate Professor Jessica Roberts and Ph.D. student Emily Amspoker of Georgia Tech’s School of Interactive Computing are working with the <a href="https://gacoast.uga.edu/"><strong>University of Georgia’s Marine Extension and Georgia Sea Grant in Savannah</strong></a>. Together, they’ve developed a prototype display that uses sonification and texture to convey sea floor habitat information from <a href="https://graysreef.noaa.gov/"><strong>Gray’s Reef National Marine Sanctuary</strong></a> off the coast of Georgia.</p><p>Sonification is the process of translating data points into sound.</p><p>The display functions as a map that BLV users can follow to learn about each habitat. It is made from a wooden board with laser-cut patterns engraved into the surface. Each pattern represents information about the four types of habitats found in Gray’s Reef. 
Each pattern has a distinct sound that corresponds to a legend on the board, which provides an audio description of each habitat.</p><p>The four habitats are:</p><ul><li>Flat sand — smooth sandy seafloor with little topographic variation that provides habitat for burrowing organisms such as worms, clams, and sand dollars.</li><li>Rippled sand — sandy bottom shaped into small wave-like ridges by currents and wave action; supports microhabitats of small invertebrates and attracts fish feeding on buried prey.</li><li>Sparse live bottom — areas of exposed hard surfaces with scattered attached organisms like sponges, corals, and algae, offering structure and shelter for reef-associated fish and invertebrates.</li><li>Dense live bottom — hard-bottom reef areas with abundant attached marine life, providing high biodiversity and offering food and breeding sites for numerous species.</li></ul><p>By allowing learners to explore these habitats, the team hopes to emphasize the importance of protecting diverse ocean habitats.&nbsp;</p><p>“Our job was to figure out how we can use sounds and touch to represent each of the four habitat types so our visitors can explore the ocean without being able to see it,” she said.</p><p>Roberts said the project is critical to advancing understanding of how science and informal learning can be more inclusive of those who have difficulty processing visual data displays.</p><div><div><p>“This was particularly exciting to figure out how we could broaden accessibility to data sets because just like so much other scientific data, it’s out there and available, but when it’s presented to the public, it’s usually in visual form,” she said. “There are many open questions about how to do this well within a museum with complex scientific data. We’re moving the needle on that, but there’s a long way to go.”</p><h4><strong>Right Combination</strong></h4><p>Amspoker and Roberts created three different versions of the prototype. 
One was sound-only, one was texture-only, and the other was a combination of sound and texture.</p><p>“We expected the multimodal version would work best,” Amspoker said. “We found people used sound and texture in different ways when interacting with it. In cases where people relied on texture, it was still difficult to tell when they crossed the barrier from one texture to another. Sound was very useful in that case.”</p><p>Amspoker said computer vision and an app she designed allow the technology to be deployed on any surface, whether a mobile device, a wooden board, or even a classroom floor. A camera set up above the display tracks the user’s hand movements.</p><p>“It figures out where you are on the board, and then our code uses the location of your finger to decide what sound should play from the computer,” she said. “What’s nice about our system is it only needs a computer and a webcam, and you can use whatever materials you have on hand for the map.”</p><h4><strong>Building on a Legacy</strong></h4><p>Roberts said she is building on the work of a previous NSF-funded collaboration with Dr. Amy Bower, a senior scientist at the Woods Hole Oceanographic Institute in Massachusetts who is blind.</p><p>Bower lost her vision in graduate school, but because of her lifelong interest in oceanography, she set out to create ways to learn about ocean data through sound.&nbsp;</p><p>In 2021, she launched the <a href="https://accessibleoceans.whoi.edu/"><strong>Accessible Oceans</strong></a> project through the National Science Foundation’s Advancing Informal STEM Learning program. 
The interdisciplinary team, including Roberts and collaborators Leslie Smith of Your Ocean Consulting and Jon Bellona of the University of Oregon, created auditory displays of sonified data for museums.</p><p>In 2023, the team published <a href="https://tos.org/oceanography/article/expanding-access-to-ocean-science-through-inclusively-designed-data-sonifications"><strong>an article in </strong><em><strong>Oceanography,</strong></em><strong> the official magazine of the Oceanography Society</strong></a>.</p><p>“Informal learning environments are increasingly recognizing the importance of employing multiple modalities to engage all learners and are leveraging sound to enhance visitor experience,” the authors wrote.</p><p>“While sonic additions of music, soundscapes, and field recordings add qualitative value, there is a need to explore the potential of sound to facilitate engagement with quantitative information. Data sonification is a promising avenue for increasing accessibility to data within the museum context.”</p></div></div>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1772550783</created>  <gmt_created>2026-03-03 15:13:03</gmt_created>  <changed>1774011129</changed>  <gmt_changed>2026-03-20 12:52:09</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers have developed a prototype “touchable sound” museum display that uses sonification and tactile maps to make complex scientific data about ocean habitats more accessible to blind and low-vision visitors.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers have developed a prototype “touchable sound” museum display that uses sonification and tactile maps to make complex scientific data about ocean habitats more accessible to blind and low-vision visitors.]]></sentence>  <summary><![CDATA[<p>Georgia Tech researchers have created a prototype “touchable sound” museum exhibit that helps blind and low-vision visitors explore
scientific data by combining tactile maps with sonification of seafloor habitats. The display translates information about different ocean environments into distinctive textures and sounds so users can follow a physical map of Gray’s Reef National Marine Sanctuary and hear data-driven audio cues. The team hopes this multimodal approach will make complex visual data more inclusive and broaden access to informal science learning.</p>]]></summary>  <dateline>2026-03-03T00:00:00-05:00</dateline>  <iso_dateline>2026-03-03T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-03-03 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679503</item>      </media>  <hg_media>          <item>          <nid>679503</nid>          <type>image</type>          <title><![CDATA[2026-Jessica-Roberts-Reef-Data-Sonification-2.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[2026-Jessica-Roberts-Reef-Data-Sonification-2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/03/2026-Jessica-Roberts-Reef-Data-Sonification-2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/03/2026-Jessica-Roberts-Reef-Data-Sonification-2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/03/2026-Jessica-Roberts-Reef-Data-Sonification-2.jpg?itok=js9WCZEU]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Jessica Roberts]]></image_alt>                    <created>1772550793</created>          <gmt_created>2026-03-03 15:13:13</gmt_created>          <changed>1772550793</changed>          <gmt_changed>2026-03-03 
15:13:13</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>      </news_terms>  <keywords>          <keyword tid="360"><![CDATA[accessibility]]></keyword>          <keyword tid="194701"><![CDATA[go-resarchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="9092"><![CDATA[museums]]></keyword>          <keyword tid="181370"><![CDATA[oceanography]]></keyword>          <keyword tid="176552"><![CDATA[data sonification]]></keyword>          <keyword tid="1102"><![CDATA[blind]]></keyword>          <keyword tid="2751"><![CDATA[visually impaired]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="688916">  <title><![CDATA[ Undergrads Earn National Recognition for Computing Research]]></title>  <uid>36530</uid>  <body><![CDATA[<p>Two Georgia Tech undergraduates are being recognized for their contributions to computing research.&nbsp;</p><p><strong>Ryan&nbsp;Punamiya</strong>&nbsp;(CS 2025)&nbsp;and <strong>Summer Abramson</strong>, a third-year&nbsp;computational&nbsp;media student, have been honored by the Computing Research Association (CRA) through its 2025–2026 <a 
href="https://cra.org/about/awards/outstanding-undergraduate-researcher-award/"><strong>Outstanding Undergraduate Researcher Award (URA) program.&nbsp;</strong></a></p><p>Punamiya&nbsp;was named a runner-up for the prestigious award, while Abramson received an honorable mention among hundreds of applicants from universities across North America.&nbsp;</p><p>The&nbsp;<a href="https://cra.org/about/awards/outstanding-undergraduate-researcher-award/"><strong>CRA Outstanding Undergraduate Researcher Award program</strong></a>&nbsp;recognized eight awardees in 2026, along with eight runners-up, nine finalists, and over 200 honorable mentions from thousands of applications.&nbsp;&nbsp;</p><h4><strong>Advancing&nbsp;Robotics Research&nbsp;</strong></h4><p>Punamiya&nbsp;knew early on that he&nbsp;didn’t&nbsp;want to wait until starting his Ph.D. to do meaningful and impactful robotics research.&nbsp;&nbsp;</p><p>Punamiya&nbsp;joined the Robot Learning and Reasoning Lab (RL2) directed by Assistant Professor&nbsp;Danfei&nbsp;Xu. 
While there, he contributed to the lab’s Meta-sponsored&nbsp;<a href="https://www.cc.gatech.edu/news/new-algorithm-teaches-robots-through-human-perspective"><strong>EgoMimic</strong></a>&nbsp;project, which trains robots to perform human tasks using recordings captured by Meta’s Project Aria research glasses.&nbsp;</p><p>Punamiya&nbsp;is&nbsp;also the first author of a paper accepted to the 2025 Conference on Neural Information Processing Systems (NeurIPS),&nbsp;one of the world’s most prestigious artificial intelligence (AI) and machine learning conferences.&nbsp;</p><p>“Ryan is the strongest undergraduate I've worked with,” Xu said, “including students who went on to Stanford, Berkeley, and leadership roles in major tech companies.&nbsp;He’s&nbsp;already&nbsp;operating&nbsp;at the level of a strong&nbsp;third-year Ph.D.&nbsp;student.”&nbsp;</p><p>Punamiya&nbsp;said it was a challenge to balance his undergraduate coursework with his research in Xu’s lab.&nbsp;</p><p>“You get out how much you put in,”&nbsp;he&nbsp;said.&nbsp;“I built my class schedule to give myself as much time to do research as possible. It also boils down to having the right research mentors.&nbsp;</p><p>“(Xu) never saw me as an&nbsp;undergrad&nbsp;who’s&nbsp;just there to do grunt work. I was&nbsp;fortunate&nbsp;he saw my curiosity and cultivated me as a researcher.&nbsp;That’s&nbsp;really how&nbsp;you get more&nbsp;undergrads&nbsp;motivated to research — giving them the chance to be independent and explore ideas of their own.”&nbsp;</p><p>Punamiya&nbsp;said his work in Xu’s lab has already helped him identify the research areas he wants to focus on as he considers his next steps. He will continue developing generalized training models for robots using human data so they can perform tasks instantly upon deployment.&nbsp;</p><p>"The amount of data needed to train a robot is difficult to obtain even for top industry companies," he said. 
"We have embodied robot data available in billions of humans. With the advent of extended reality devices, we can get a scalable source of diverse interactions within environments."</p><p>Punamiya&nbsp;graduated in December and recently started an internship at Nvidia. He mentioned he has been accepted into several Ph.D. programs, including Georgia Tech, and he is choosing where to continue his research.&nbsp;</p><p>“It’s the first time my research has been&nbsp;acknowledged&nbsp;externally by the robotics community,” he said. “It’s&nbsp;good to&nbsp;know&nbsp;the problem&nbsp;I’m&nbsp;working on is important, and that motivates me. Robotics is an exciting field. We are doing things now that two years ago were difficult to do.”&nbsp;</p><h4><strong>Researching Inclusion in Computing Education&nbsp;</strong></h4><p>Abramson conducts research in the People-Agents Research for Computing Education (PARCE) Laboratory under the mentorship of&nbsp;Pedro Guillermo Feijóo-García, a faculty member&nbsp;in the School of Computing Instruction. He and the Associate Dean for Undergraduate Education, Olufisayo Omojokun, nominated her for the award.&nbsp;</p><p>Her work focuses on the intersection of computing education and human-AI interaction, where she’s been exploring ways to create more equitable technology.&nbsp;</p><p>“This is such a huge milestone, and I couldn't be prouder of Summer,” Feijóo-García said. “Mentoring her for almost two years has been an amazing experience.”&nbsp;</p><p>Abramson has received the Georgia Tech President’s Undergraduate Research Award (PURA) twice, which supports her research exploring how user-centered design curricula can help address attrition among women in computing.</p><p>“I’ve had the amazing opportunity to pursue research at the intersection of student identity, community belonging, and how we can build tools that support our diverse student population,” Abramson said.&nbsp;</p><p>“Dr. 
Pedro and I have a goal to build community through a human-first approach, and I could not be more grateful for his support and guidance in my own journey. The CRA highlights the best of what the computing discipline has to offer, and I am incredibly honored for our work to be recognized.”</p><p>Abramson will spend the summer researching how user-centered design curricula can help promote confidence, belonging, and retention for women in computing.</p><p>Nominees for the CRA award program were recognized for contributing to multiple research projects, authoring or coauthoring papers, presenting at conferences, developing widely used software artifacts, and supporting their communities as teaching assistants, tutors, and mentors.&nbsp;</p><p><em>School of Computing Instruction Communications Officer Emily Smith contributed to this story.</em></p><p><em>Main Photo: Ryan Punamiya works with a robot during the 2025 International Conference on Robotics and Automation in Atlanta. Photo by Terence Rushin/College of Computing.</em></p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1773413846</created>  <gmt_created>2026-03-13 14:57:26</gmt_created>  <changed>1774011081</changed>  <gmt_changed>2026-03-20 12:51:21</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Ryan Punamiya (CS 2025) and Summer Abramson, a third-year computational media student, have been honored by the Computing Research Association (CRA) through its 2025–2026 Outstanding Undergraduate Researcher Award (URA) program. ]]></teaser>  <type>news</type>  <sentence><![CDATA[Ryan Punamiya (CS 2025) and Summer Abramson, a third-year computational media student, have been honored by the Computing Research Association (CRA) through its 2025–2026 Outstanding Undergraduate Researcher Award (URA) program.
]]></sentence>  <summary><![CDATA[<p><strong>Ryan&nbsp;Punamiya</strong>&nbsp;(CS 2025)&nbsp;and <strong>Summer Abramson</strong>, a third-year&nbsp;computational&nbsp;media student, have been honored by the Computing Research Association (CRA) through its 2025–2026 <a href="https://cra.org/about/awards/outstanding-undergraduate-researcher-award/"><strong>Outstanding Undergraduate Researcher Award (URA) program.&nbsp;</strong></a></p><p>Punamiya&nbsp;was named a runner-up for the prestigious award, while Abramson received an honorable mention among hundreds of applicants from universities across North America.&nbsp;</p><p>The&nbsp;<a href="https://cra.org/about/awards/outstanding-undergraduate-researcher-award/"><strong>CRA Outstanding Undergraduate Researcher Award program</strong></a>&nbsp;recognized eight awardees in 2026, along with eight runners-up, nine finalists, and over 200 honorable mentions from thousands of applications.&nbsp;</p>]]></summary>  <dateline>2026-03-13T00:00:00-04:00</dateline>  <iso_dateline>2026-03-13T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-03-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679613</item>      </media>  <hg_media>          <item>          <nid>679613</nid>          <type>image</type>          <title><![CDATA[ICRA-2025_P9A0421-Enhanced-NR.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ICRA-2025_P9A0421-Enhanced-NR.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/13/ICRA-2025_P9A0421-Enhanced-NR.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/13/ICRA-2025_P9A0421-Enhanced-NR.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/13/ICRA-2025_P9A0421-Enhanced-NR.jpg?itok=vnBCPFhq]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ryan Punamiya]]></image_alt>                    <created>1773413856</created>          <gmt_created>2026-03-13 14:57:36</gmt_created>          <changed>1773413856</changed>          <gmt_changed>2026-03-13 14:57:36</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="193158"><![CDATA[Student Competition Winners (academic, innovation, and research)]]></category>          <category tid="193157"><![CDATA[Student Honors and Achievements]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="193158"><![CDATA[Student Competition Winners (academic, innovation, and research)]]></term>          <term tid="193157"><![CDATA[Student Honors and Achievements]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="101271"><![CDATA[Computing Research Association]]></keyword>          <keyword 
tid="22861"><![CDATA[undergraduate research awards]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="688716">  <title><![CDATA[New Research Priorities Chart Course Toward Impactful, Energy-Efficient Computing]]></title>  <uid>36319</uid>  <body><![CDATA[<p>Georgia Tech researchers applied their expertise to a national research program that will shape the future of computing. Their work may yield more energy-efficient computers and better predictions for environmental challenges like carbon storage, tsunamis, wildfires, and sustainable energy.&nbsp;</p><p>The Department of Energy Office of Science recently released two reports through its Advanced Scientific Computing Research (<a href="https://www.energy.gov/science/ascr/advanced-scientific-computing-research">ASCR</a>) program. The&nbsp;<a href="https://science.osti.gov/ascr/Community-Resources/Program-Documents">reports</a> were produced by workshops that brought together researchers from universities, national labs, government, and industry to set priorities for scientific computing.</p><p>Professor&nbsp;<a href="https://slim.gatech.edu/people/felix-j-herrmann">Felix Herrmann</a> served on the organizing committee for the Workshop on Inverse Methods for Complex Systems under Uncertainty. Assistant Professor&nbsp;<a href="https://faculty.cc.gatech.edu/~pchen402/group.html">Peng Chen</a> joined Herrmann as a workshop participant, contributing expertise in data science and machine learning.</p><p>Inverse methods work backward from outcomes to find their causes. 
Scientists use these tools to study complex systems, such as designing new materials with targeted properties and using data from past wildfires to map vulnerable areas and predict the behavior of future fires.</p><p>The&nbsp;<a href="https://www.osti.gov/biblio/2583339">ASCR report</a> highlighted Herrmann’s work on seismic exploration and monitoring through digital twins. Founded on inverse methods, digital twins move beyond static models to become virtual systems that accurately mirror their physical counterparts.&nbsp;</p><p>Digital twins integrate real-time data sources, including fluid flows, monitoring and control systems, risk assessments, and human decisions. These models also account for uncertainty and address data gaps or limitations.&nbsp;</p><p>The DOE organized the workshop to support the growing role of inverse modeling. The group identified four priority research directions (PRDs) to guide future work. The PRDs are:</p><ul><li>PRD 1: Discovering, exploiting, and preserving structure</li><li>PRD 2: Identifying and overcoming model limitations</li><li>PRD 3: Integrating disparate multimodal and/or dynamic data</li><li>PRD 4: Solving goal-oriented inverse problems for downstream tasks</li></ul><p>“A digital twin is a system you can control, like to optimize operations or to minimize risk,” said Herrmann, who holds joint appointments in the Schools of Earth and Atmospheric Sciences, Electrical and Computer Engineering, and Computational Science and Engineering.</p><p>“Digital twins give you a principled way to consider uncertainties, of which there are a lot in subsurface monitoring. If you inject carbon dioxide too fast, you will increase the pressure and may fracture the rock. If you inject too slow, then the process may become too costly. Digital twins help us make balanced decisions under uncertainty.”</p><p>Supercomputers, algorithms, and artificial intelligence now power modern science. However, these tools consume enormous amounts of energy.
This raises concerns about how to sustain computing and scientific research as we know them in the decades ahead.</p><p>Professors&nbsp;<a href="https://vuduc.org/v2/">Rich Vuduc</a> and&nbsp;<a href="https://hyesoon.github.io/">Hyesoon Kim</a> co-authored&nbsp;<a href="https://www.osti.gov/biblio/2476961">the report</a> from the Workshop on Energy-Efficient Computing for Science. At the three-day ASCR workshop, participants identified five key research directions:</p><ul><li>PRD 1: Co-design energy-efficient hardware devices and architectures for important workloads</li><li>PRD 2: Define the algorithmic foundations of energy-efficient scientific computing</li><li>PRD 3: Reconceptualize software ecosystems for energy efficiency</li><li>PRD 4: Enable energy-efficient data management for data centers, instruments, and users</li><li>PRD 5: Develop integrated, scalable energy measurement and modeling capabilities for next-generation computing systems</li></ul><p>“I’m cautiously optimistic about the future of energy-efficient computing. The ASCR report says, from a technological point of view, there are things we can do,” said Vuduc.</p><p>“The report lays out paths for how we might design better apps, hardware systems, and algorithms that will use less energy. This is recognition that we should think about how architectures and software work together to drive down energy usage for systems.”</p>]]></body>  <author>Bryant Wine</author>  <status>1</status>  <created>1772630984</created>  <gmt_created>2026-03-04 13:29:44</gmt_created>  <changed>1772658078</changed>  <gmt_changed>2026-03-04 21:01:18</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech faculty members contributed to two DOE Advanced Scientific Computing Research program workshops. 
Recently published reports of their work may yield more energy-efficient computers and better predictions for environmental challenges.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech faculty members contributed to two DOE Advanced Scientific Computing Research program workshops. Recently published reports of their work may yield more energy-efficient computers and better predictions for environmental challenges.]]></sentence>  <summary><![CDATA[<p>Georgia Tech researchers applied their expertise to a national research program that will shape the future of computing. Their work may yield more energy-efficient computers and better predictions for environmental challenges like carbon storage, tsunamis, wildfires, and sustainable energy.&nbsp;</p><p>The Department of Energy Office of Science recently released two reports through its Advanced Scientific Computing Research (<a href="https://www.energy.gov/science/ascr/advanced-scientific-computing-research">ASCR</a>) program. 
The&nbsp;<a href="https://science.osti.gov/ascr/Community-Resources/Program-Documents">reports</a> were produced by workshops that brought together researchers from universities, national labs, government, and industry to set priorities for scientific computing.</p>]]></summary>  <dateline>2026-02-27T00:00:00-05:00</dateline>  <iso_dateline>2026-02-27T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-02-27 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Bryant Wine, Communications Officer<br><a href="mailto:bryant.wine@cc.gatech.edu">bryant.wine@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679513</item>          <item>679514</item>          <item>679515</item>      </media>  <hg_media>          <item>          <nid>679513</nid>          <type>image</type>          <title><![CDATA[ASCR-Report-Authors.png]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ASCR-Report-Authors.png]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/04/ASCR-Report-Authors.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/04/ASCR-Report-Authors.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/04/ASCR-Report-Authors.png?itok=TI8M78es]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[DOE Office of Science ASCR Reports]]></image_alt>                    <created>1772630996</created>          <gmt_created>2026-03-04 13:29:56</gmt_created>          <changed>1772630996</changed>          <gmt_changed>2026-03-04 13:29:56</gmt_changed>      </item>          <item>          <nid>679514</nid>          <type>image</type>        
  <title><![CDATA[ASCR-Report-Inverse-methods.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ASCR-Report-Inverse-methods.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/04/ASCR-Report-Inverse-methods.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/04/ASCR-Report-Inverse-methods.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/04/ASCR-Report-Inverse-methods.jpg?itok=Id4-FQxK]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[ASCR Workshop on Inverse Methods for Complex Systems under Uncertainty]]></image_alt>                    <created>1772631052</created>          <gmt_created>2026-03-04 13:30:52</gmt_created>          <changed>1772631052</changed>          <gmt_changed>2026-03-04 13:30:52</gmt_changed>      </item>          <item>          <nid>679515</nid>          <type>image</type>          <title><![CDATA[ASCR-Report-Energy-Efficient-Computing.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ASCR-Report-Energy-Efficient-Computing.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/04/ASCR-Report-Energy-Efficient-Computing.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/04/ASCR-Report-Energy-Efficient-Computing.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/04/ASCR-Report-Energy-Efficient-Computing.jpg?itok=FG7IdP7N]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[ASCR Workshop on Energy-Efficient Computing for Science]]></image_alt>                    <created>1772631087</created>          
<gmt_created>2026-03-04 13:31:27</gmt_created>          <changed>1772631087</changed>          <gmt_changed>2026-03-04 13:31:27</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.cc.gatech.edu/news/new-research-priorities-chart-course-toward-impactful-energy-efficient-computing]]></url>        <title><![CDATA[New Research Priorities Chart Course Toward Impactful, Energy-Efficient Computing]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="144"><![CDATA[Energy]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="144"><![CDATA[Energy]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="166983"><![CDATA[School of Computational Science and Engineering]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword 
tid="663"><![CDATA[Department of Energy]]></keyword>          <keyword tid="179230"><![CDATA[digital twin]]></keyword>          <keyword tid="15030"><![CDATA[high-performance computing]]></keyword>          <keyword tid="9167"><![CDATA[machine learning]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="39531"><![CDATA[Energy and Sustainable Infrastructure]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="688502">  <title><![CDATA[Understanding the Data Center Building Boom ]]></title>  <uid>27338</uid>  <body><![CDATA[<p><em>Written by: Anne Wainscott-Sargent</em></p><p>As artificial intelligence (AI) drives explosive growth in data centers, communities across the U.S. are facing rising electricity costs, new industrial development, and mounting strain on an aging power grid.</p><p>At Georgia Tech, several faculty members are approaching these sustainability challenges from different but complementary angles: examining how data center policy affects local communities, modeling how AI-driven demand reshapes regional energy systems, and building tools that help the public understand the tradeoffs embedded in grid planning. Together, their work highlights how better data, thoughtful policy, and public engagement can guide more resilient and equitable decisions in an AI-powered future.</p><p><strong>AI’s Hidden Footprint: How Data Centers Reshape Communities</strong></p><p>Ahmed Saeed studies the infrastructure most people never see. 
An assistant professor in the School of Computer Science and a Brook Byers Institute for Sustainable Systems (BBISS) Faculty Fellow, Saeed focuses on how data centers — the backbone of modern AI — are built, operated, and regulated, and what their growth means for host communities.</p><p>“Data centers are the infrastructure for our digital life, so more of them are necessary to keep doing what we’re doing,” he said.</p><p>Data center energy consumption could double or triple by 2028, accounting for up to 12% of U.S. electricity use, according to a <a href="https://escholarship.org/uc/item/32d6m0d1">report by Lawrence Berkeley National Laboratory</a>. U.S. spending on data center construction jumped nearly 70% between May 2023 and May 2024, according to the <a href="https://americanedgeproject.org/wp-content/uploads/2025/12/Americas-AI-Surge-Powering-Growth-in-Every-State.pdf">American Edge Project</a>.</p><p>Georgia is an AI data center hub, ranked fourth globally, with $4.6 billion in AI-related venture capital invested across 368 deals, the American Edge Project reported. At a recent <a href="https://www.cc.gatech.edu/news/sustainability-fellowship-supports-professors-data-center-research">town hall in DeKalb County, Georgia</a>, Saeed helped residents connect AI’s promise to its local consequences. Training large AI models can require tens of thousands of graphics processing units (GPUs) running for days or weeks, driving an unprecedented wave of data center construction. AI-focused chips, he noted, can consume 10 to 14 times more power than traditional processors.</p><p>That demand often shows up as pressure on local infrastructure. Communities are increasingly concerned about electricity and water use, grid upgrades, and who ultimately pays. 
In Virginia, Saeed pointed to a legal dispute in which consumer advocates warned that data centers could raise electricity bills by 5% in the short term and up to 50% over time, while utilities argued those investments were inevitable and could benefit customers in the long run.</p><p>Environmental concerns add another layer. Saeed cited controversies over water use and backup diesel generators in states including Georgia and Tennessee, alongside a recent Environmental Protection Agency (EPA) ruling that tightened generator regulations. While diesel generators are clearly harmful, he cautioned that long-term, rigorous evidence linking data centers to regional health impacts remains limited.</p><p>Saeed’s research aims to reduce those impacts directly. By optimizing how workloads are scheduled across large server fleets, his team has demonstrated power savings of 4–12%, a meaningful gain if U.S. data centers approach projected levels of up to 12% of national electricity use by 2028.</p><p>For Saeed, data centers are akin to highways: essential to modern life, disruptive to nearby communities, and shaped by policy choices. The question, he argues, is not whether AI infrastructure should exist, but how transparently and fairly it is built.</p><p><strong>Economist Probes the Energy Costs of the AI Boom</strong></p><p>While headlines often frame AI as an energy crisis, Georgia Tech environmental and energy economist and BBISS Faculty Fellow Tony Harding is focused on measuring its real — and uneven — impacts. Harding, an assistant professor in the Jimmy and Rosalynn Carter School of Public Policy, uses economic modeling to examine how AI adoption affects energy use, emissions, and local communities.</p><p>In <a href="https://iopscience.iop.org/article/10.1088/1748-9326/ae0e3b">recent work</a> published in <em>Environmental Research Letters</em>, Harding and his co-author analyzed how productivity gains from AI could influence national energy demand.
Their findings suggest that, at a macro level, AI-related activity may increase annual U.S. energy use by about 0.03% and CO₂ emissions by roughly 0.02%.</p><p>“Those numbers are small in the context of the overall economy,” Harding said. “But the impacts are highly uneven.”</p><p>That unevenness is evident in where data centers are built. While Northern Virginia remains the country’s top data center hub, with 343 operational data centers, states like Georgia, which currently has 94 operational data centers, are rapidly attracting facilities due to reliable power and favorable tax policies.&nbsp;</p><p>Harding’s latest research focuses on local effects, asking why data centers cluster in urban areas, how they influence housing markets, what happens to electricity prices, and whether they exacerbate water stress. Early evidence suggests large facilities can increase local electricity rates, contributing to public backlash and regulatory response. In Georgia, the <a href="https://psc.ga.gov/site/assets/files/8617/media_advisory_data_centers_rule_1-23-2025.pdf">Public Service Commission</a> has begun requiring new high-power-draw customers (like data centers) to cover more of the costs associated with grid expansion.</p><p>Harding’s goal is to give policymakers better evidence to design incentives and guardrails. “To manage these technologies responsibly,” he said, “we need a clear picture of their intended and unintended consequences.”</p><p><strong>Gamifying a Strained and Aging Power Grid</strong></p><p>Daniel Molzahn is tackling another side of the problem: how to modernize an aging power grid under growing demand. Electricity demand is expected to rise about 25% by 2030, driven by data centers, electric vehicles, and broadscale electrification. At the same time, much of the U.S. 
electricity grid is nearing the end of its lifespan, and many of its transformers are decades old.</p><p>To make these challenges tangible, Molzahn, an associate professor in the School of Electrical and Computer Engineering, developed a browser-based game called <a href="https://currentcrisis.itch.io/current-crisis">Current Crisis</a> with a group of students through Georgia Tech’s <a href="https://vip.gatech.edu/frm_display/team-listings/entry/1303/">Vertically Integrated Projects</a> program. Players take on the role of a utility decision-maker, balancing reliability, wildfire risk, renewable integration, and affordability.</p><p>The game grew out of Molzahn’s National Science Foundation CAREER award and reflects his belief that complex systems are best understood experientially. Its initial focus is wildfire resilience, modeling how grid infrastructure can both spark and suffer damage from fires.</p><p>But resilience comes at a cost. Burying power lines, for example, reduces wildfire risk but dramatically increases expenses. Players must confront the same tradeoffs utilities face: improve reliability or keep rates low.</p><p>Molzahn hopes the game will help students and the public grapple with the realities of planning future power systems. “These choices aren’t abstract,” he said. 
“They shape affordability, resilience, and our path toward a cleaner grid.”</p><p>The project now involves nearly 40 students from across campus, supported by Sustainability NEXT funding and a collaboration with Jessica Roberts, former BBISS Faculty Fellow and director of the <a href="https://tiles.cc.gatech.edu/">Technology-Integrated Learning Environments (TILES) Lab</a> in the School of Interactive Computing.</p><p>“As a learning scientist, I look at how to engage people with science and scientific data and get people having conversations they might not otherwise have,” says Roberts, who hopes the seed grant helps the team determine first that they are going in the right direction and, second, how to broaden the impact.</p><p>One student, Stella Quinto Lima, a graduate research assistant in Human-Centered Computing, has made the game the focus of her doctoral thesis. Through the game, she wants players to notice their misconceptions about the power grid, energy use, and AI, and to use critical thinking to identify, question, and possibly undo those misconceptions.</p><p>&nbsp;“I hope that we can really engage adults and help them see it’s not black and white. 
The game is not only about power grids, but how AI affects the grid, how it affects our lives, and how it will impact our future.”</p><p>The team plans to expand the game’s features, use it in outreach programs, and analyze player decisions as a source of data to study energy-system decision-making.</p><p>“We want to change the conversation about power and power grid stability, reliability, and sustainability,” Roberts said, “and find a way to get this message to a larger public.”</p>]]></body>  <author>Brent Verrill</author>  <status>1</status>  <created>1771964950</created>  <gmt_created>2026-02-24 20:29:10</gmt_created>  <changed>1772037822</changed>  <gmt_changed>2026-02-25 16:43:42</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Explosive data center growth requires research to inform policies which manage the building of this critical infrastructure.]]></teaser>  <type>news</type>  <sentence><![CDATA[Explosive data center growth requires research to inform policies which manage the building of this critical infrastructure.]]></sentence>  <summary><![CDATA[<p>As artificial intelligence (AI) drives explosive growth in data centers, communities across the U.S. 
are facing rising electricity costs, new industrial development, and mounting strain on an aging power grid.</p>]]></summary>  <dateline>2026-02-24T00:00:00-05:00</dateline>  <iso_dateline>2026-02-24T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-02-24 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[brent.verrill@research.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:brent.verrill@research.gatech.edu">Brent Verrill</a>, Research Communications Program Manager, BBISS</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679428</item>      </media>  <hg_media>          <item>          <nid>679428</nid>          <type>image</type>          <title><![CDATA[Giarusso_Saeed_Molzhan_Headshots_Collage_Sized]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Giarusso_Saeed_Molzhan_Headshots_Collage_Sized.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/02/25/Giarusso_Saeed_Molzhan_Headshots_Collage_Sized.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/02/25/Giarusso_Saeed_Molzhan_Headshots_Collage_Sized.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/02/25/Giarusso_Saeed_Molzhan_Headshots_Collage_Sized.jpg?itok=LtgNnP32]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Three men's individual portrait-style photos are arranged side by side, each showing a person from the shoulders up. 
The individuals wear collared shirts and appear in different lighting settings, including a dark background, a neutral studio backdrop, and a bright white background.]]></image_alt>                    <created>1772037433</created>          <gmt_created>2026-02-25 16:37:13</gmt_created>          <changed>1772037615</changed>          <gmt_changed>2026-02-25 16:40:15</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="244191"><![CDATA[Brook Byers Institute for Sustainable Systems]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="660398"><![CDATA[Sustainability Hub]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="131"><![CDATA[Economic Development and Policy]]></category>          <category tid="144"><![CDATA[Energy]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="194611"><![CDATA[State Impact]]></category>          <category tid="194836"><![CDATA[Sustainability]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="131"><![CDATA[Economic Development and Policy]]></term>          <term tid="144"><![CDATA[Energy]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="194611"><![CDATA[State Impact]]></term>          <term tid="194836"><![CDATA[Sustainability]]></term>      </news_terms>  <keywords>          <keyword tid="188360"><![CDATA[go-bbiss]]></keyword>          <keyword 
tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="194566"><![CDATA[Sustainable Systems]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="687708">  <title><![CDATA[ Researchers Warn AI ‘Blind Spot’ Could Allow Attackers to Hijack Self-Driving Vehicles]]></title>  <uid>36253</uid>  <body><![CDATA[<div><div><p>A newly discovered vulnerability could allow cybercriminals to silently hijack the artificial intelligence (AI) systems in self-driving cars, raising concerns about the security of autonomous systems increasingly used on public roads.</p><p>&nbsp;Georgia Tech cybersecurity researchers discovered the vulnerability, dubbed VillainNet, and found it can remain dormant in a self-driving vehicle’s AI system until triggered by specific conditions.</p><p>Once triggered, VillainNet is almost certain to succeed, giving attackers control of the targeted vehicle.</p><p>The research finds that attackers could program almost any action within a self-driving vehicle’s AI super network to trigger VillainNet. In one possible scenario, it could be triggered when a self-driving taxi’s AI responds to rainfall and changing road conditions.</p><p>Once in control, hackers could hold the passengers hostage and threaten to crash the taxi.</p><p>The researchers discovered this new backdoor attack threat in the AI super networks that power autonomous driving systems.&nbsp;</p><p>“Super networks are designed to be the Swiss Army knife of AI, swapping out tools, or in this case sub networks, as needed for the task at hand," said <a href="https://davidoygenblik.github.io/"><strong>David Oygenblik</strong></a>, Ph.D. student at Georgia Tech and the lead researcher on the project.&nbsp;</p><p>"However, we found that an adversary can exploit this by attacking just one of those tiny tools. 
The attack remains completely dormant until that specific subnetwork is used, effectively hiding across billions of other benign configurations."&nbsp;</p><p>According to Oygenblik, the backdoor attack is nearly guaranteed to work. The blind spot is almost undetectable with current tools and can affect any autonomous vehicle that runs on AI. It can also be hidden at any stage of development and span billions of scenarios.</p><p>“With VillainNet, the attacker forces defenders to find a single needle in a haystack that can be as large as 10 quintillion straws,” said Oygenblik.&nbsp;</p><p>“Our work is a call to action for the security community. As AI systems become more complex and adaptive, we must develop new defenses capable of addressing these novel, hyper-targeted threats.”&nbsp;</p><p>One hypothetical fix would be to add security measures to the super networks themselves. These networks contain billions of specialized subnetworks that can be activated on the fly, and Oygenblik wanted to see what would happen if he attacked a single subnetwork tool.</p><p>In experiments, the VillainNet attack proved highly effective, achieving a 99% success rate when activated while remaining invisible within the AI system.&nbsp;</p><p>The research also shows that verifying an AI system is free of a VillainNet backdoor would require 66 times more computing power and time. That dramatically expands the search space for detection and makes it infeasible in practice, according to the researchers.</p><p>The project was <a href="https://www.youtube.com/watch?v=H1fyPD8vWDo">presented</a> at the ACM Conference on Computer and Communications Security (CCS) in October 2025. 
The paper, <a href="https://davidoygenblik.github.io/pdfs/VNET.pdf"><em>VillainNet: Targeted Poisoning Attacks Against SuperNets Along the Accuracy-Latency Pareto Frontier</em></a>, was co-authored by Oygenblik, master's students <strong>Abhinav Vemulapalli </strong>and <strong>Animesh Agrawal</strong>, Ph.D. student <strong>Debopam Sanyal</strong>, Associate Professor <strong>Alexey Tumanov</strong>, and Associate Professor <strong>Brendan Saltaformaggio</strong>.&nbsp;</p></div></div>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1769525518</created>  <gmt_created>2026-01-27 14:51:58</gmt_created>  <changed>1771522498</changed>  <gmt_changed>2026-02-19 17:34:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A newly discovered vulnerability could allow cybercriminals to silently hijack the artificial intelligence (AI) systems in self-driving cars, raising concerns about the security of autonomous systems increasingly used on public roads.]]></teaser>  <type>news</type>  <sentence><![CDATA[A newly discovered vulnerability could allow cybercriminals to silently hijack the artificial intelligence (AI) systems in self-driving cars, raising concerns about the security of autonomous systems increasingly used on public roads.]]></sentence>  <summary><![CDATA[<p>A newly discovered vulnerability could allow cybercriminals to silently hijack the artificial intelligence (AI) systems in self-driving cars, raising concerns about the security of autonomous systems increasingly used on public roads.</p><p>&nbsp;Georgia Tech cybersecurity researchers discovered the vulnerability, dubbed VillainNet, and found it can remain dormant in a self-driving vehicle’s AI system until triggered by specific conditions.</p><p>Once triggered, VillainNet is almost certain to succeed, giving attackers control of the targeted vehicle.</p>]]></summary>  <dateline>2026-01-27T00:00:00-05:00</dateline>  
<iso_dateline>2026-01-27T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-01-27 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:jpopham3@gatech.edu">John Popham</a><br>Communications Officer II&nbsp;<br>School of Cybersecurity and Privacy</p><p>&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679102</item>      </media>  <hg_media>          <item>          <nid>679102</nid>          <type>image</type>          <title><![CDATA[Car-Blind-Spot.jpeg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Car-Blind-Spot.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/01/27/Car-Blind-Spot.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/01/27/Car-Blind-Spot.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/01/27/Car-Blind-Spot.jpeg?itok=pckjSeql]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A car's side view mirror with an alert in the center of the mirror. 
]]></image_alt>                    <created>1769525530</created>          <gmt_created>2026-01-27 14:52:10</gmt_created>          <changed>1769525530</changed>          <gmt_changed>2026-01-27 14:52:10</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="182941"><![CDATA[cc-research; ic-cybersecurity; ic-hcc]]></keyword>          <keyword tid="175307"><![CDATA[Brendan Saltaformaggio]]></keyword>          <keyword tid="365"><![CDATA[Research]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="188667"><![CDATA[go-]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="145171"><![CDATA[Cybersecurity]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="687813">  <title><![CDATA[From 
Fusion to Self-Driving Cars, High Performance Computing and AI are Everywhere in 2026]]></title>  <uid>36319</uid>  <body><![CDATA[<p>While not as highlight-reel worthy as the Winter Olympics and the World Cup, experts expect high-performance computing (HPC) to have an even bigger impact on daily life in 2026.</p><p>Georgia Tech researchers say HPC and artificial intelligence (AI) advances this year are poised to improve how people power their homes, design safer buildings, and travel through cities.</p><p>According to&nbsp;<a href="https://tangqi.github.io/">Qi Tang</a>, scientists will take progressive steps toward cleaner, sustainable energy through nuclear fusion in 2026.&nbsp;</p><p>“I am very hopeful about the role of advanced computing and AI in making fusion a clean energy source,” said Tang, an assistant professor in the&nbsp;<a href="https://cse.gatech.edu/">School of Computational Science and Engineering (CSE)</a>.&nbsp;</p><p>“Fusion systems involve many interconnected processes happening across different scales. Modern simulations, combined with data-driven methods, allow us to bring these pieces together into a unified picture.”</p><p>Tang’s research connects HPC and machine learning with fusion energy and plasma physics. This year, Tang is continuing work on large-scale nuclear fusion models.</p><p>Only a few experimental fusion reactors exist worldwide compared to more than 400 nuclear fission reactors. Tang’s work supports a broader effort to turn fusion from a promising idea into a practical energy source.</p><p>Nuclear fusion occurs in plasma, the fourth state of matter, where gas is heated to millions of degrees. In this extreme state, electrons are stripped from atoms, creating a hot soup of fast-moving ions and free electrons. In plasma, hydrogen atoms overcome their natural electrical repulsion, collide, and fuse together. 
This releases energy that can power cities and homes.</p><p>Computers interpret extreme temperatures, densities, pressures, and plasma particle motion as massive datasets. Tang works to assimilate these data types from computer models and real-world experiments.</p><p>To do this, he and other researchers rely on machine learning approaches to analyze data across models and experiments more quickly and to produce more accurate predictions. Over time, this will allow scientists to test and improve fusion reactor designs toward commercial use.&nbsp;</p><p>Beyond energy and nuclear engineering,&nbsp;<a href="https://pk.linkedin.com/in/umarkhayaz">Umar Khayaz</a> sees broader impacts for HPC in 2026.</p><p>“HPC is the need of the day in every field of engineering sciences, physics, biology, and economics,” said Khayaz, a CSE Ph.D. student in the&nbsp;<a href="https://ce.gatech.edu/">School of Civil and Environmental Engineering</a>.&nbsp;</p><p>“HPC is important enough to say that we need to employ resources to also solve social problems.”</p><p>Khayaz studies dynamic fracture and phase-field modeling. These areas explore how materials break under sudden, rapid loads.&nbsp;</p><p>Like nuclear fusion, Khayaz says dynamic fracture problems are complex and data-intensive. In 2026, he expects to see more computing resources and computational capabilities devoted to understanding these problems and other emerging civil engineering challenges.</p><p>CSE Ph.D. student&nbsp;<a href="https://ahren09.github.io/">Yiqiao (Ahren) Jin</a> sees a similar relationship between infrastructure and self-driving vehicles. He believes AI will innovate this area in 2026.</p><p>At Georgia Tech, Jin develops efficient multimodal AI systems. 
An autonomous vehicle is a multimodal system that uses camera video, laser sensors, language instructions, and other inputs to navigate city streets under changing scenarios like traffic and weather patterns.</p><p>Jin says multimodal research will move beyond performance benchmarks this year. This shift will lead to computer systems that can reason despite uncertainty and explain their decisions. As a result, engineers will redefine how they evaluate and deploy autonomous systems in safety-critical settings.</p><p>“Many foundational problems in perception, multimodal reasoning, and agent coordination are being actively addressed in 2026. These advances enable a transition from isolated autonomous systems to safer, coordinated autonomous vehicle fleets,” Jin said.&nbsp;</p><p>“As these systems scale, they have the potential to fundamentally improve transportation safety and efficiency.”</p>]]></body>  <author>Bryant Wine</author>  <status>1</status>  <created>1769697057</created>  <gmt_created>2026-01-29 14:30:57</gmt_created>  <changed>1771516409</changed>  <gmt_changed>2026-02-19 15:53:29</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers say HPC and artificial intelligence (AI) advances this year are poised to improve how people power their homes, design safer buildings, and travel through cities.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers say HPC and artificial intelligence (AI) advances this year are poised to improve how people power their homes, design safer buildings, and travel through cities.]]></sentence>  <summary><![CDATA[<p>While not as highlight-reel worthy as the Winter Olympics and the World Cup, experts expect high-performance computing (HPC) to have an even bigger impact on daily life in 2026.</p><p>Georgia Tech researchers say HPC and artificial intelligence (AI) advances this year are poised to improve how people power their homes, design safer buildings, and travel 
through cities.</p>]]></summary>  <dateline>2026-01-29T00:00:00-05:00</dateline>  <iso_dateline>2026-01-29T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-01-29 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Bryant Wine, Communications Officer<br><a href="mailto:bryant.wine@cc.gatech.edu">bryant.wine@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679125</item>      </media>  <hg_media>          <item>          <nid>679125</nid>          <type>image</type>          <title><![CDATA[CSE-in-2026_2.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[CSE-in-2026_2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/01/29/CSE-in-2026_2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/01/29/CSE-in-2026_2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/01/29/CSE-in-2026_2.jpg?itok=0wuKznLw]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[CSE in 2026]]></image_alt>                    <created>1769704332</created>          <gmt_created>2026-01-29 16:32:12</gmt_created>          <changed>1769704332</changed>          <gmt_changed>2026-01-29 16:32:12</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.cc.gatech.edu/news/fusion-self-driving-cars-high-performance-computing-and-ai-are-everywhere-2026]]></url>        <title><![CDATA[From Fusion to Self-Driving Cars, High Performance Computing and AI are Everywhere in 2026]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of 
Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50877"><![CDATA[School of Computational Science and Engineering]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="142"><![CDATA[City Planning, Transportation, and Urban Growth]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="144"><![CDATA[Energy]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="142"><![CDATA[City Planning, Transportation, and Urban Growth]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="144"><![CDATA[Energy]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="172288"><![CDATA[School of Computational Science Engineering]]></keyword>          <keyword tid="167864"><![CDATA[School of Civil and Environmental Engineering]]></keyword>          <keyword tid="594"><![CDATA[college of engineering]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="15030"><![CDATA[high-performance computing]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword 
tid="9167"><![CDATA[machine learning]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="194384"><![CDATA[Tech AI]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="39531"><![CDATA[Energy and Sustainable Infrastructure]]></term>          <term tid="39541"><![CDATA[Systems]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="687824">  <title><![CDATA[Cyber Risk is Business Risk: A Georgia Tech Alum on What Leaders Must Learn in 2026]]></title>  <uid>36253</uid>  <body><![CDATA[<p>When <strong>Christopher Craig</strong> arrived at Georgia Tech as an undergraduate in 1995, the campus and the field of cybersecurity looked very different.</p><p>“It was the era of look left and look right, and one of you will not be here at graduation,” Craig said.</p><p>Craig worked hard and graduated with his computer science (CS) bachelor’s degree in 2000, just as the dot-com bubble burst. He returned to Georgia Tech about a year later and has been here ever since.</p><p>Craig is the enterprise cybersecurity architect in the <a href="https://www.oit.gatech.edu/">Office of Information Technology</a> and has spent nearly three decades at Tech as a student, employee, and instructor.</p><p>Along the way, he has earned three degrees from the Institute and helped shape how Georgia Tech approaches cybersecurity in an increasingly complex digital landscape.</p><p>Craig began his career at Tech supporting student registration and other core IT systems. He moved fully into cybersecurity about 15 years ago. 
His technical background was strong, but he saw a gap in his experience.</p><p>“I had a lot of technical background and work experience, but not much policy experience,” he said.</p><p>Craig enrolled in Georgia Tech’s Master of Science in Information Security to fill in this gap. He said his decision to enroll in the policy track was intentional.</p><p>“If you’ve been doing the technical work for 10 years, a technical master’s helps some,” Craig said. “But it is much more useful to study the areas you do not already know well.”</p><p>Craig moved into management as his GT career progressed. This path led him once again to the classroom. This time, he pursued an MBA from Georgia Tech’s <a href="https://www.scheller.gatech.edu/index.html">Scheller College of Business</a>.</p><p>Craig believes the combination of cybersecurity and business education is increasingly important for leaders and others.</p><p>“There is a big gap in the industry,” he said. “You need people who understand cybersecurity and the business side, and people in business leadership who understand cybersecurity risk.”</p><p>Craig is an instructor in the online Master of Science in Cybersecurity program. He teaches incident response and often sees this gap among his students.</p><p>“Many business professionals do not know how to respond to a cybersecurity incident,” Craig said. “They are not trained in it. At the same time, many cybersecurity professionals are learning business impacts on the job.”</p><p>Craig said business knowledge is essential for aspiring chief information security officers.</p><p>“At that level, understanding how cybersecurity supports business goals is more important than deep technical detail,” he said. “You still need the basics, but you also need to talk to the CFO.”</p><p>At Georgia Tech, Craig focuses on cybersecurity architecture. His work centers on the design and protection of enterprise systems.</p><p>“For example, student information systems have a design,” he said. 
“We look at how firewalls and other controls fit into that design to protect the data.”</p><p>His role continues to evolve as the Institute’s cybersecurity needs change. That evolution mirrors the field itself, especially with the rise of artificial intelligence (AI).</p><p>“AI has impacted cybersecurity for longer than people want to admit,” Craig said. “Understanding what is unusual is a big part of security, and AI can be very good at that. It can also be very good at avoiding detection.”</p><p>Craig said AI introduces new architectural risks, particularly around data privacy. Tools that analyze student or employee data must be carefully designed to prevent sensitive information from leaking through training or outputs.</p><p>“You have to understand the inputs and outputs,” he said. “Otherwise, you can accidentally release data you really care about.”</p><p>Privacy has been a recurring theme throughout Craig’s career. He credits courses such as the privacy policy class taught by Professor <a href="https://peterswire.net/"><strong>Peter Swire</strong></a>, the J.Z. Liang Chair in the <a href="https://scp.cc.gatech.edu/">School of Cybersecurity and Privacy</a>, with shaping his thinking.</p><p>“So much of security is about personal data,” Craig said. “Understanding what actually makes data anonymous or not is critical.”</p><p>Craig believes that privacy protection depends on training and system design within an institution as large and decentralized as Georgia Tech.</p><p>“Training can only get you so far,” Craig said. “People make mistakes. Strong processes limit exposure even when human error happens.”</p><p>Looking back, Craig describes his time at Georgia Tech as one of constant growth.</p><p>“The industry has massively changed,” he said. “What you learn becomes outdated quickly. 
You have to keep growing.”</p><p>From undergraduate student to cybersecurity leader, Craig’s career reflects both the evolution of Georgia Tech and the fast-changing world of cybersecurity. For him, the learning never stops.</p>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1769704785</created>  <gmt_created>2026-01-29 16:39:45</gmt_created>  <changed>1771516387</changed>  <gmt_changed>2026-02-19 15:53:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech alum Christopher Craig’s nearly three-decade journey as a student, employee, and instructor shows how combining cybersecurity, policy, and business education is essential for leaders navigating evolving risks—from incident response to AI and ]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech alum Christopher Craig’s nearly three-decade journey as a student, employee, and instructor shows how combining cybersecurity, policy, and business education is essential for leaders navigating evolving risks—from incident response to AI and ]]></sentence>  <summary><![CDATA[<p>Georgia Tech alum Christopher Craig’s nearly three-decade journey as a student, employee, and instructor shows how combining cybersecurity, policy, and business education is essential for leaders navigating evolving risks—from incident response to AI and data privacy—in an increasingly complex digital landscape.</p>]]></summary>  <dateline>2026-01-28T00:00:00-05:00</dateline>  <iso_dateline>2026-01-28T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-01-28 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:jpopham3@gatech.edu">John Popham</a><br>Communications Officer II&nbsp;<br>School of Cybersecurity and Privacy&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          
<item>679126</item>      </media>  <hg_media>          <item>          <nid>679126</nid>          <type>image</type>          <title><![CDATA[Christopher-Craig_1.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Christopher-Craig_1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/01/29/Christopher-Craig_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/01/29/Christopher-Craig_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/01/29/Christopher-Craig_1.jpg?itok=osts0quc]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A man looks up from his laptop computer and into a camera. There is a whiteboard with illegible writing on it behind him. ]]></image_alt>                    <created>1769704813</created>          <gmt_created>2026-01-29 16:40:13</gmt_created>          <changed>1769704813</changed>          <gmt_changed>2026-01-29 16:40:13</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="130"><![CDATA[Alumni]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="129"><![CDATA[Institute and Campus]]></category>      </categories>  <news_terms>          <term tid="130"><![CDATA[Alumni]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="129"><![CDATA[Institute and Campus]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term 
tid="145171"><![CDATA[Cybersecurity]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="687406">  <title><![CDATA[Apple Vision Pro Powers New Wave of Immersive Education]]></title>  <uid>35272</uid>  <body><![CDATA[<div><div><div><div><div><p>Learning electrical and computer engineering has always come with a unique challenge: many of its foundational concepts — electric fields, magnetic forces, semiconductor behavior — are invisible to the naked eye and difficult to visualize.&nbsp;&nbsp;</p><p>To make these invisible principles tangible, students in the <a href="https://ece.gatech.edu/"><strong>School of Electrical and Computer Engineering</strong></a> have long used specialized tools and software. Circuit simulators model voltage and current, electromagnetic tools visualize fields, and semiconductor design platforms reveal transistor behavior. These tools turn abstract theory into interactive experiences that prepare students for real-world engineering challenges.</p></div></div></div></div></div><div><div><div><div><div><p>Now, Apple Vision Pro is joining this ecosystem.</p><p>The technology introduces spatial computing to learning environments, blending digital content with the physical world.</p><p>At the <a href="https://matter-systems.gatech.edu/"><strong>Institute for Matter and Systems</strong></a>, infrastructure lead <a href="https://research.gatech.edu/people/alex-gallmon"><strong>Alex Gallmon</strong></a> is collaborating with students and industry partners to create immersive digital twins—virtual models that replicate real-world systems—of semiconductor cleanroom equipment.&nbsp;&nbsp;</p><p>“These machines are complex and costly, with parts that can run tens of thousands of dollars,” he said. 
“Even minor mistakes during operation can lead to expensive damage or downtime.”&nbsp;</p><p>Gallmon's team built a virtual replica of a cleanroom vacuum training system. The project serves as a prototype for a workforce development program aimed at high school and college students interested in careers in the semiconductor or vacuum technology fields.&nbsp;</p><p><a href="https://ece.gatech.edu/news/2026/01/apple-vision-pro-powers-new-wave-immersive-education">Read the full story from the School of Electrical and Computer Engineering</a></p></div></div></div></div></div>]]></body>  <author>aneumeister3</author>  <status>1</status>  <created>1768601610</created>  <gmt_created>2026-01-16 22:13:30</gmt_created>  <changed>1770143946</changed>  <gmt_changed>2026-02-03 18:39:06</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Spatial computing is transforming engineering education at Georgia Tech and opening new paths for entrepreneurship and technical training.]]></teaser>  <type>news</type>  <sentence><![CDATA[Spatial computing is transforming engineering education at Georgia Tech and opening new paths for entrepreneurship and technical training.]]></sentence>  <summary><![CDATA[<div><div><p>Spatial computing is transforming engineering education at Georgia Tech and opening new paths for entrepreneurship and technical training.</p></div></div>]]></summary>  <dateline>2026-01-12T00:00:00-05:00</dateline>  <iso_dateline>2026-01-12T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-01-12 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[dwatson@ece.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:dwatson@ece.gatech.edu">Dan Watson </a>| School of Electrical and Computer Engineering</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679037</item>          <item>679038</item>      
</media>  <hg_media>          <item>          <nid>679037</nid>          <type>image</type>          <title><![CDATA[Apple-VR-Headset-002.jpeg]]></title>          <body><![CDATA[<p>Georgia Tech student Yash Rajgure using an Apple Vision Pro headset device to demo his team's project in ECE 6001 Technology Entrepreneurship: Teaming, Ideation, and Entrepreneurship. <em>Photo: Allison Carter, Georgia Tech</em></p>]]></body>                      <image_name><![CDATA[Apple-VR-Headset-002.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/01/16/Apple-VR-Headset-002.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/01/16/Apple-VR-Headset-002.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/01/16/Apple-VR-Headset-002.jpeg?itok=4oJ4Rpb7]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Georgia Tech student Yash Rajgure using an Apple Vision Pro headset device to demo his team's project.]]></image_alt>                    <created>1768601620</created>          <gmt_created>2026-01-16 22:13:40</gmt_created>          <changed>1768601620</changed>          <gmt_changed>2026-01-16 22:13:40</gmt_changed>      </item>          <item>          <nid>679038</nid>          <type>image</type>          <title><![CDATA[Gammon-Vision-Pro_1.jpeg]]></title>          <body><![CDATA[<div><div><div><div><div><div><p>Gallmon showing how Apple Vision Pro can be utilized to train students and workers on sensitive and expensive technical equipment, in this case a cleanroom vacuum system.</p></div></div></div></div></div></div>]]></body>                      <image_name><![CDATA[Gammon-Vision-Pro_1.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/01/16/Gammon-Vision-Pro_1.jpeg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/01/16/Gammon-Vision-Pro_1.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/01/16/Gammon-Vision-Pro_1.jpeg?itok=iAy04qBz]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Alex Gallmon showing how Apple Vision Pro can be utilized]]></image_alt>                    <created>1768601620</created>          <gmt_created>2026-01-16 22:13:40</gmt_created>          <changed>1768601620</changed>          <gmt_changed>2026-01-16 22:13:40</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="660369"><![CDATA[Matter and Systems]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="149"><![CDATA[Nanotechnology and Nanoscience]]></category>          <category tid="194612"><![CDATA[Workforce Development]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="149"><![CDATA[Nanotechnology and Nanoscience]]></term>          <term tid="194612"><![CDATA[Workforce Development]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="193652"><![CDATA[Matter and Systems]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="687358">  <title><![CDATA[New LLMs Could Provide Strength-based Job Coaching for Autistic People]]></title>  <uid>36530</uid>  <body><![CDATA[<p>People with autism seeking employment 
may soon have access to a new AI-based job-coaching tool thanks to a six-figure grant from the National Science Foundation (NSF).</p><p><a href="https://www.cc.gatech.edu/people/jennifer-kim"><strong>Jennifer Kim</strong></a> and&nbsp;<a href="https://eilab.gatech.edu/mark-riedl.html"><strong>Mark Riedl</strong></a> recently received a $500,000 NSF grant to develop large language models (LLMs) that provide strength-based job coaching for autistic job seekers.&nbsp;</p><p>The two Georgia Tech researchers work with&nbsp;<a href="https://excel.gatech.edu/excel-staff/heather-dicks"><strong>Heather Dicks</strong></a>, a career development advisor in Georgia Tech’s EXCEL program, as well as nonprofit organizations to provide job-seeking resources to autistic people.</p><p>Dicks said the average job search for people with autism can take three to six months in a good economy. It can take up to 18 months in a bad one. However, the new LLMs from Georgia Tech could help reduce stress and fast-track these job seekers into employment.</p><p>Kim is an assistant professor who specializes in human-computer interaction technology that benefits neurodivergent people. Riedl is a professor and an expert in the development of artificial intelligence (AI) and machine learning technologies.</p><p>The team’s goal is to identify job-search pain points and understand how job coaches create better employment prospects for their autistic clients.</p><p>“Large-language models have an opportunity to support this kind of work if we can have more data about each different individual strength,” Kim said.</p><p>“We want to know what worked for them in specific settings at work, what didn’t work, and what kind of accommodations can better help them. That includes how they should prepare for interviews, how they can better represent their skills, how they can address accommodations they need, and how to write a cover letter. 
It’s a broad range.”</p><p>Dicks has advocated for neurodivergent people and helped them find employment for 20 years. She worked at the Center for the Visually Impaired in Atlanta before coming to Georgia Tech in 2017.</p><p>She said most nonprofits that support neurodivergent people offer career development programs and many contract job coaches, but limited coach availability often leads to long waitlists. However, LLMs could fill this availability gap to address the immediate needs of job seekers who may not have access to a job coach.</p><p>“These organizations often run at a slow pace, and there’s high turnover,” Dicks said. “An AI tool could get the job seeker quicker support. Maybe they don’t even need to wait on the government system.</p><p>“If they’re on a waitlist, it can help the user put together a resume and practice general interview questions. When the job coach is ready to work with them, they’re able to hit the ground running.”</p><h4><strong>Nailing the Interview</strong></h4><p>Dicks said the job interview is one of the biggest challenges for people with autism.</p><p>“They have trouble picking up on visual and nonverbal cues — the tone of the interview, figuring out the nuances that a question is hinting at,” she said. “They’re not giving the warm and fuzzy vibes that allow them to connect on a personal level.”</p><p>That’s why Kim wants the models to reflect a strength-based coaching approach. Strength-based coaching is particularly effective for individuals with autism. Many possess traits that employers value. These include:</p><ul><li>Close attention to detail</li><li>Strong technical proficiency</li><li>Unique problem-solving perspectives</li></ul><p>“The issue is that they don’t know how these strengths can be applied in the workplace,” Kim said. 
“Once they understand this, they can communicate with employers about their strengths and the accommodations employers should provide to the job seeker so they can successfully apply their skills at work.”</p><h4><strong>Handling Rejection</strong></h4><p>Still, Kim understands that candidates will need to handle rejection to make it through the search process. She envisions LLMs that help them refocus their energy and regain their confidence after being turned down.</p><p>“When you get a lot of rejection emails, it’s easy to feel you’re not good enough,” she said. “Being constantly reminded about your strengths and their prior successes can get them through the stressful job-seeking process.”</p><p>Dicks said the models should also be able to provide feedback so that candidates don’t repeat mistakes.</p><p>“It can tell them what would’ve been a better answer or a better way to say it,” Dicks said. “It can also encourage them with reminders that you get 100 noes before you get a yes.”</p><h4><strong>You’re Hired, Now What?</strong></h4><p>Dicks said the role of a job coach doesn’t end the moment a client is hired. Government-contracted job coaches may work with their clients for up to 90 days after they start a new job to support their transition.</p><p>However, she said, sometimes that isn’t enough. Many companies have probationary periods exceeding three months. 
Autistic individuals may struggle with on-the-job training or communicating what accommodations they need from their new employer.&nbsp;</p><p>These are just a few gaps an AI tool can fill for these individuals after they’re hired.</p><p>“I could see these models evolving to being supportive at those critical junctures of the probationary period being over or the one-year job review or the annual evaluation that everyone dreads,” she said.</p><p>Dicks has an average caseload of 15 students, whom she assists in landing jobs and internships through the EXCEL program.</p><p>EXCEL provides a mentorship program for students with intellectual and developmental disabilities from the time they set foot on campus through graduation and beyond.</p><p>For more information and to apply, visit EXCEL’s&nbsp;<a href="https://excel.gatech.edu/home"><strong>website</strong></a>.</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1768503844</created>  <gmt_created>2026-01-15 19:04:04</gmt_created>  <changed>1769089269</changed>  <gmt_changed>2026-01-22 13:41:09</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers are using an NSF grant to create new large-language models that help autistic job seekers understand their strengths and how to leverage them during the application process.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers are using an NSF grant to create new large-language models that help autistic job seekers understand their strengths and how to leverage them during the application process.]]></sentence>  <summary><![CDATA[<p>Georgia Tech researchers are using an NSF grant to create new large-language models that help autistic job seekers understand their strengths and how to leverage them during the application process.</p>]]></summary>  <dateline>2026-01-15T00:00:00-05:00</dateline>  <iso_dateline>2026-01-15T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-01-15 
00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679012</item>      </media>  <hg_media>          <item>          <nid>679012</nid>          <type>image</type>          <title><![CDATA[Jennifer-Kim_86A4154-copy.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Jennifer-Kim_86A4154-copy.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/01/15/Jennifer-Kim_86A4154-copy.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/01/15/Jennifer-Kim_86A4154-copy.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/01/15/Jennifer-Kim_86A4154-copy.jpg?itok=yyxFubXO]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Jennifer Kim]]></image_alt>                    <created>1768503854</created>          <gmt_created>2026-01-15 19:04:14</gmt_created>          <changed>1768503854</changed>          <gmt_changed>2026-01-15 19:04:14</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="42901"><![CDATA[Community]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial 
Intelligence]]></term>          <term tid="42901"><![CDATA[Community]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>      </news_terms>  <keywords>          <keyword tid="6053"><![CDATA[Autism]]></keyword>          <keyword tid="191680"><![CDATA[neurodiverse]]></keyword>          <keyword tid="780"><![CDATA[employment]]></keyword>          <keyword tid="174112"><![CDATA[excel program]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="193556"><![CDATA[large language models]]></keyword>          <keyword tid="7011"><![CDATA[NSF grant]]></keyword>          <keyword tid="6957"><![CDATA[Job Search]]></keyword>          <keyword tid="13786"><![CDATA[job search strategies]]></keyword>          <keyword tid="194701"><![CDATA[go-resarchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71901"><![CDATA[Society and Culture]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="687371">  <title><![CDATA[Georgia Tech Wins Fifth Straight NSA Codebreaker Challenge]]></title>  <uid>36253</uid>  <body><![CDATA[<div><div><p>The United States Air Force's Cyber Operations Squadron was in a crisis. 
A sophisticated foreign adversary was threatening national security, and it was up to the National Security Agency to help.&nbsp;</p><p>This was the fictional <a href="https://nsa-codebreaker.org/challenge">scenario</a> of the <a href="https://nsa-codebreaker.org/leaderboard">2025 NSA Codebreaker Challenge</a>, which was once again dominated by Georgia Tech students, faculty, and alumni. With a score of nearly 300,000 points, they took first among Division I schools.&nbsp;</p><p>“Georgia Tech continues to win this highly challenging competition each year because of our outstanding students and the excellence of the cybersecurity and privacy curriculum that has been developed by SCP faculty,” said Mustaque Ahamad, Interim Chair for the School of Cybersecurity and Privacy.</p><p>“Our courses provide not only foundational knowledge of the discipline, but also give students experience with tools and techniques that help them shine at this competition.”</p><p>One of the keys to Georgia Tech’s success is that it integrates the challenge into students’ coursework. Professor Taesoo Kim has included it in his <a href="https://omscs.gatech.edu/cs-6265-information-security-lab"><em>CS 6265: Information Security Lab</em></a> every year to give students real-life experience.&nbsp;</p><p>“The NSA Codebreaker Challenge highlights the strength of Georgia Tech’s cybersecurity program and the hands-on, mission-driven training our students receive. Through courses like CS 6265 and others like it, students apply advanced security concepts to real-world problems, reinforcing Georgia Tech’s long-standing excellence and leadership in cybersecurity education.”</p><p>This year was the first time the NSA broke the Codebreaker Challenge for colleges and universities into divisions based on the number of participants. The winners of divisions one, two, and three were considered the winners of the challenge. 
Georgia Tech was in the top division with 272 students, four instructors, 27 alumni, and two participants in the “other” category. The Institute had a total of 305 participants, the second largest in the competition.&nbsp;</p><p>The NSA Codebreaker Challenge is open to anyone with an email address from a recognized U.S. school or university. All players register and log in individually. Students, professors, and alumni can participate, but only students earn points and awards.</p></div></div>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1768571402</created>  <gmt_created>2026-01-16 13:50:02</gmt_created>  <changed>1769089245</changed>  <gmt_changed>2026-01-22 13:40:45</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The 2025 NSA Codebreaker Challenge was once again dominated by Georgia Tech students, faculty, and alumni.]]></teaser>  <type>news</type>  <sentence><![CDATA[The 2025 NSA Codebreaker Challenge was once again dominated by Georgia Tech students, faculty, and alumni.]]></sentence>  <summary><![CDATA[<p>The 2025 NSA Codebreaker Challenge was once again dominated by Georgia Tech students, faculty, and alumni. 
With a score of nearly 300,000 points, they took first among Division I schools.&nbsp;</p>]]></summary>  <dateline>2026-01-15T00:00:00-05:00</dateline>  <iso_dateline>2026-01-15T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-01-15 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:jpopham3@gatech.edu">John Popham</a>&nbsp;<br>Communications Officer II&nbsp;<br>School of Cybersecurity and Privacy</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>      </media>  <hg_media>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="193158"><![CDATA[Student Competition Winners (academic, innovation, and research)]]></category>          <category tid="193157"><![CDATA[Student Honors and Achievements]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="193158"><![CDATA[Student Competition Winners (academic, innovation, and research)]]></term>          <term tid="193157"><![CDATA[Student Honors and Achievements]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="145171"><![CDATA[Cybersecurity]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="687534">  <title><![CDATA[New Cryogenic Vacuum Chamber Cuts Noise 
for Quantum Ion Trapping]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Even very slight environmental noise, such as microscopic vibrations or magnetic field fluctuations a hundred times smaller than the Earth’s magnetic field, can be catastrophic for quantum computing experiments with trapped ions.<br>&nbsp;</p><p>To address that challenge, researchers at the Georgia Tech Research Institute (GTRI) have developed an improved cryogenic vacuum chamber that helps reduce some common noise sources by isolating ions from vibrations and shielding them from magnetic field fluctuations. The new chamber also incorporates an improved imaging system and a radio frequency (RF) coil that can be used to drive ion transitions from within the chamber.&nbsp;<br>&nbsp;</p><p>“There’s a lot of excitement around quantum computing today, and trapped ions are just one of the research platforms available, each with their own benefits and drawbacks,” explained Darian Hartsell, a GTRI research scientist who leads the project. “We are trying to mitigate multiple sources of noise in this chamber and make other improvements with one robust new design.”<br>&nbsp;</p><p>The chamber design is described in a paper published January 20, 2026 in the journal <em>Applied Physics Letters</em>. Some of the technical improvements developed for the project are already being applied at GTRI and collaborating organizations. This work was done in collaboration with Los Alamos National Laboratory.<br>&nbsp;</p><p>The goal of the vibration isolation is to reduce the laser amplitude and phase noise when addressing the ions, increasing operation fidelity. 
The goal of the magnetic field noise reduction is to preserve the coherence of qubits for longer periods of time so researchers can use them for more complex algorithms.</p><p><a href="https://www.gtri.gatech.edu/newsroom/new-cryogenic-vacuum-chamber-cuts-noise-quantum-ion-trapping">See the complete article on the GTRI news site</a></p><p><br>&nbsp;</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1769010999</created>  <gmt_created>2026-01-21 15:56:39</gmt_created>  <changed>1769011387</changed>  <gmt_changed>2026-01-21 16:03:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have developed an improved vacuum chamber that reduces noise for quantum ion trapping research.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have developed an improved vacuum chamber that reduces noise for quantum ion trapping research.]]></sentence>  <summary><![CDATA[<p>Researchers have developed an improved vacuum chamber that reduces noise for quantum ion trapping research.</p>]]></summary>  <dateline>2026-01-21T00:00:00-05:00</dateline>  <iso_dateline>2026-01-21T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-01-21 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Chamber also incorporates improved imaging]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[gtri.media@gtri.gatech.edu]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679046</item>      </media>  <hg_media>          <item>          <nid>679046</nid>          <type>image</type>          <title><![CDATA[Researcher tests improved vacuum chamber for ion trapping]]></title>          <body><![CDATA[<p>GTRI Research Scientist Darian Hartsell makes adjustments to an improved cryogenic vacuum chamber that helps reduce some common noise sources by isolating ions from vibrations and shielding them from magnetic field 
fluctuations. (Credit: Sean McNeil, GTRI)</p>]]></body>                      <image_name><![CDATA[Vacuum-Chamber_06.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/01/21/Vacuum-Chamber_06.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/01/21/Vacuum-Chamber_06.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/01/21/Vacuum-Chamber_06.jpg?itok=1sLg1m0_]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Researcher tests improved vacuum chamber]]></image_alt>                    <created>1769010196</created>          <gmt_created>2026-01-21 15:43:16</gmt_created>          <changed>1769010565</changed>          <gmt_changed>2026-01-21 15:49:25</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="193653"><![CDATA[Georgia Tech Research Institute]]></term>          <term tid="193652"><![CDATA[Matter and Systems]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686615">  <title><![CDATA[Researchers Look to Make Safer AI Through Google Awards]]></title>  <uid>36530</uid>  <body><![CDATA[<p>People seeking mental health support are increasingly turning to large language models (LLMs) for 
advice.&nbsp;</p><p>However, most popular AI-powered chatbots are not trained to recognize when someone is in crisis. LLMs also cannot determine when to refer someone to a human specialist.</p><p>New Georgia Tech research projects that address these issues may soon provide people seeking mental health support with safer experiences.&nbsp;</p><p>Google has awarded research grants to three faculty members from the School of Interactive Computing to study artificial intelligence (AI), trust, safety, and security. The grants were among dozens awarded by the company to researchers across the country.</p><p>Professor <a href="http://www.munmund.net/"><strong>Munmun De Choudhury</strong></a>, Associate Professor <a href="https://sites.google.com/view/riarriaga/home"><strong>Rosa Arriaga</strong></a>, and Associate Professor <a href="https://aritter.github.io/"><strong>Alan Ritter</strong></a> are among the recipients of the <a href="https://research.google/programs-and-events/google-academic-research-awards/google-academic-research-award-program-recipients/"><strong>2025 Google Academic Research Awards</strong></a>.&nbsp;</p><p>Their projects will explore questions like:</p><ul><li>What harms could occur if people consult LLMs for mental health advice?</li><li>Which groups are most at risk of receiving harmful guidance?</li><li>When should an LLM stop responding and refer someone to a human professional?</li></ul><p>De Choudhury and Arriaga will examine how LLMs might harm people seeking mental health care.</p><p>De Choudhury’s work focuses on spotting when chatbot conversations go wrong and lead users toward self-harm. She is also studying design changes that could prevent these situations.</p><p>Her project,&nbsp;<em>Exiting Harmful Reliance: Identifying Crises &amp; Care Escalation Needs</em>, is in partnership with Angel Hsing-Chi Hwang from the University of Southern California. 
Together, they will review real and synthetic chat transcripts with clinicians to find language patterns that signal risk.</p><p>“A chatbot will always give a response and keep talking to you for however long you want,” De Choudhury said. “That may not be a good thing for someone in crisis. We need to know when the right response is to stop and suggest talking to a human.”</p><p>&nbsp;</p><h4><strong>Understanding Risks for Low-Income Users</strong></h4><p>Arriaga’s project,&nbsp;<em>Dull, Dirty, Dangerous: Investigating Trust of Digital Resources Among Low-SES Mental Health Care Seekers</em>, looks at how LLMs affect people with low socioeconomic status (SES).</p><p>Dull, dirty, and dangerous is a phrase used to describe tasks that are well-suited for robot automation because they are repetitive, physically taxing, or hazardous for humans. Arriaga said she adapted these terms for her research to create a taxonomy of the harms AI can cause to people seeking mental health care.</p><p>Arriaga also wants to identify the trust factors that attract low-SES users to chatbots for advice, and how these may differ for adults and adolescents across contexts.&nbsp;</p><p>“We know one of the reasons some users go to LLMs is because they aren’t insured and can’t afford a therapist,” she said. “LLMs are available 24-7. Maybe it doesn’t start as a trust issue. Maybe it starts with availability.&nbsp;</p><p>“Some of these human-AI conversations that result in harmful mental health advice didn’t begin on the topic of mental health. In one case, the person started going to the machine for help with homework.</p><p>“Then this relationship evolved into personal matters. 
Should we constrain the system to limit itself to helping someone with their homework and not wander off that subject into mental health matters?”</p><p>&nbsp;</p><h4><strong>Managing Privacy Risks for Social Media</strong></h4><p>Ritter will use the Google award to advance research on social media privacy tools, including interactive AI agents that help people make more informed decisions about what they share online.</p><p>His project, <em>AI Tools to Help Users Make Informed Decisions About Online Information Sharing</em>, focuses on reducing privacy risks in both text and images by identifying when posts reveal more than users intend.</p><p>“We’ve been developing methods to assess risks in text, and now we’re extending that work to images,” Ritter said. “People post photos without realizing how easily they can be geolocated by advanced AI systems. A casual selfie near home might contain subtle cues about where you live, like a street sign, that reveal private details.”</p><p>The project aims to create AI agents that review content within user posts, flag elements that pose risk, and suggest safer alternatives. Ritter said he wants people to maintain control over their privacy without limiting freedom of expression.</p><p>Ritter will deploy advanced reasoning models capable of probabilistic privacy estimation. 
These systems can infer how identifiable a piece of text might be or how likely an image is to reveal a user’s location.</p><p>For images, Ritter and his collaborators will use models that identify geolocatable features, allowing users to edit or hide them before posting.</p><p>For more on Ritter’s research,&nbsp;<a href="https://www.cc.gatech.edu/news/new-large-language-model-can-protect-social-media-users-privacy"><strong>read how an LLM he co-developed protects the privacy of users on social media.</strong></a></p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1764016112</created>  <gmt_created>2025-11-24 20:28:32</gmt_created>  <changed>1767965901</changed>  <gmt_changed>2026-01-09 13:38:21</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Three Georgia Tech faculty members received Google Academic Research Awards to study how to make AI safer.]]></teaser>  <type>news</type>  <sentence><![CDATA[Three Georgia Tech faculty members received Google Academic Research Awards to study how to make AI safer.]]></sentence>  <summary><![CDATA[<p>Three Georgia Tech faculty members from the School of Interactive Computing received Google Academic Research Awards to study how to make AI safer, focusing on minimizing harm to users seeking <strong>mental health support</strong> from large language models (LLMs) and improving <strong>social media privacy</strong> tools.</p>]]></summary>  <dateline>2025-11-24T00:00:00-05:00</dateline>  <iso_dateline>2025-11-24T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-11-24 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678716</item>      </media>  <hg_media>          <item>          <nid>678716</nid>          <type>image</type>          
<title><![CDATA[437249_Google-Research-Award-Graphic.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[437249_Google-Research-Award-Graphic.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/11/24/437249_Google-Research-Award-Graphic.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/11/24/437249_Google-Research-Award-Graphic.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/11/24/437249_Google-Research-Award-Graphic.jpg?itok=qXR59Azs]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Google Research Awards]]></image_alt>                    <created>1764016128</created>          <gmt_created>2025-11-24 20:28:48</gmt_created>          <changed>1764016128</changed>          <gmt_changed>2025-11-24 20:28:48</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="194701"><![CDATA[go-resarchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          
<keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="193860"><![CDATA[Artifical Intelligence]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="192524"><![CDATA[ChatGPT]]></keyword>          <keyword tid="184554"><![CDATA[Google Research Award]]></keyword>          <keyword tid="167007"><![CDATA[health &amp; well-being]]></keyword>          <keyword tid="10343"><![CDATA[mental health]]></keyword>          <keyword tid="169137"><![CDATA[chatbot]]></keyword>          <keyword tid="167543"><![CDATA[social media]]></keyword>          <keyword tid="114791"><![CDATA[Data Privacy]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>          <topic tid="71901"><![CDATA[Society and Culture]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686197">  <title><![CDATA[New Software Center Director to Lead Next Wave of Scientific Discovery]]></title>  <uid>36319</uid>  <body><![CDATA[<p>Scientists across Georgia Tech rely on powerful software tools to propel breakthroughs in fields ranging from physics to biology. Now, software experts who make that research possible are gaining a new leader.&nbsp;</p><p>The College of Computing named Professor&nbsp;<a href="https://vuduc.org/v2/">Rich Vuduc</a> as director of the Center for Scientific Software Engineering (<a href="https://ssecenter.cc.gatech.edu/">CSSE</a>). The Georgia Tech hub is dedicated to building reliable, high-performance software for scientists. &nbsp;</p><p>Under Vuduc’s leadership, CSSE strives to accelerate the pace and increase the quality of scientific discovery by developing custom software tools and best practices tailored to researchers’ needs.</p><p>“There is a reproducibility and reliability problem right now with scientific software,” Vuduc said. 
“The promise of CSSE is to leverage capabilities shared between Georgia Tech, Schmidt Sciences, and industry experts to address this problem.”&nbsp;</p><p>Issues arise because scientists often need to develop their own software for experiments or data analysis. However, troubleshooting coding issues and other bugs can slow down research.</p><p>To assist these scientists, CSSE receives their input to create custom software tools and best practices. The center employs professional software engineers who build and deliver products tailor-made to the needs of researchers at Georgia Tech and broader scientific communities.</p><p>Beyond its research focus, CSSE helps Georgia Tech fulfill its educational mission. The center provides students with direct access and exposure to real-world software engineering.</p><p>As the center enters its third year, Vuduc wants to better prepare students for employment by enhancing their hands-on experience while learning from CSSE engineers.</p><p>To achieve this goal, Vuduc is working to establish a <a href="https://gatech.infoready4.com/#competitionDetail/1999204">Ph.D. fellowship program</a> in which CSSE engineers mentor students. This program would connect academic inquiry with industry expertise, creating the next generation of dynamic leaders in computational science. &nbsp;</p><p>Vuduc also envisions pairing CSSE with Georgia Tech’s&nbsp;<a href="https://vip.gatech.edu/">Vertically Integrated Projects (VIP) program</a>. This approach would allow undergraduate students to earn class credit while working with CSSE engineers on large software engineering projects spanning multiple semesters.</p><p>“The center gives our students access to something that is very unique to find in a university environment,” Vuduc said.&nbsp;</p><p>“The software engineers in CSSE mostly come from industry. 
They have over 65 years of combined experience doing real-world software engineering that students can learn from.”</p><p>Vuduc is a 2010 recipient of the&nbsp;<a href="https://awards.acm.org/bell">Gordon Bell Prize</a> and a leading expert in high-performance computing (HPC). He was a finalist for the award in 2020 and 2022.</p><p>The Gordon Bell Prize, often referred to as the Nobel Prize in supercomputing due to the scope and magnitude of research it recognizes, celebrates achievement in HPC research and application.&nbsp;</p><p>Vuduc joined Georgia Tech in 2007 as one of the first faculty hired for the new Division of Computational Science and Engineering (CSE). No stranger to leading new units, he saw CSE begin offering M.S. and Ph.D. degrees in 2008 and&nbsp;<a href="https://cse.gatech.edu/founding-school">attain school status in 2010</a>. &nbsp;</p><p>Since 2021, Vuduc has served as co-director of the Center for Research into Novel Computing Hierarchies (<a href="https://crnch.gatech.edu/">CRNCH</a>).&nbsp;</p><p>CRNCH is an interdisciplinary research center at Georgia Tech that explores technologies and approaches that will usher in the next generation of computing. Areas CRNCH studies include quantum computing, brain-inspired computing, and approximate computing.&nbsp;</p><p>Vuduc will step down as CRNCH co-director to fulfill his role as CSSE director. The College of Computing will lead a search for CRNCH’s next co-director.</p><p>“In a sense, the CRNCH to CSSE transition was partly a natural one because one thing that contributes to software challenges is that hardware platforms are also changing and evolving very rapidly,” said Vuduc.&nbsp;</p><p>“People are exploring radically new hardware systems and we will have to write software configured for those too. 
Centers, like CRNCH and CSSE, strongly position Georgia Tech to lead these endeavors.”&nbsp;</p><p><strong>Alessandro (Alex) Orso</strong>, the previous CSSE director, departed Georgia Tech earlier this year to become&nbsp;<a href="https://news.uga.edu/alex-orso-named-dean-of-ugas-college-of-engineering/">dean of the University of Georgia’s College of Engineering</a>. Orso and Distinguished Professor <strong>Irfan Essa</strong> wrote the proposal to bring CSSE to Georgia Tech.</p><p>Georgia Tech formed CSSE in 2022 after securing an $11 million grant from&nbsp;<a href="https://www.schmidtsciences.org/">Schmidt Sciences</a>. Former Google CEO Eric Schmidt and his spouse, Wendy Schmidt, founded the philanthropic venture that funds science and technology research and talent networking programs.&nbsp;</p><p>Georgia Tech’s CSSE is part of Schmidt Sciences’&nbsp;<a href="https://www.schmidtsciences.org/viss/">Virtual Institute for Scientific Software (VISS) program</a>. This network helps scientists obtain more robust, flexible, scalable open-source software.&nbsp;</p><p>Schmidt Sciences is investing $40 million in VISS over five years at four universities: Georgia Tech, University of Washington, Johns Hopkins University, and University of Cambridge.</p><p>CSSE uses the funding to employ a software engineering lead, three senior and two junior software engineers. The Schmidt Sciences grant equips these engineers with computing resources to build scientific software. Along with the director, an advisory board guides the group’s work to meet the point of need for scientists in the field.&nbsp;</p><p>“I am grateful to Schmidt Sciences for their support of CSSE. It aligns with our college’s strategic goals and expertise in scientific software, and I am delighted that Rich has agreed to take on this important role,” said Vivek Sarkar, Dean and John P. Imlay Jr. 
Chair of Computing.</p><p>“I know that Rich is committed to growing CSSE's internal and external visibility and long-term sustainability. I am confident that he will also help further socialize CSSE among internal stakeholders across Georgia Tech.”</p>]]></body>  <author>Bryant Wine</author>  <status>1</status>  <created>1762351306</created>  <gmt_created>2025-11-05 14:01:46</gmt_created>  <changed>1767965887</changed>  <gmt_changed>2026-01-09 13:38:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The College of Computing named Professor Rich Vuduc as director of the Center for Scientific Software Engineering (CSSE). The Georgia Tech hub is dedicated to building reliable, high-performance software for scientists.  ]]></teaser>  <type>news</type>  <sentence><![CDATA[The College of Computing named Professor Rich Vuduc as director of the Center for Scientific Software Engineering (CSSE). The Georgia Tech hub is dedicated to building reliable, high-performance software for scientists.  ]]></sentence>  <summary><![CDATA[<p>Scientists across Georgia Tech rely on powerful software tools to propel breakthroughs in fields ranging from physics to biology. Now, software experts who make that research possible are gaining a new leader.&nbsp;</p><p>The College of Computing named Professor&nbsp;<a href="https://vuduc.org/v2/">Rich Vuduc</a> as director of the Center for Scientific Software Engineering (<a href="https://ssecenter.cc.gatech.edu/">CSSE</a>). The Georgia Tech hub is dedicated to building reliable, high-performance software for scientists. 
&nbsp;</p><p>Under Vuduc’s leadership, CSSE strives to accelerate the pace and increase the quality of scientific discovery by developing custom software tools and best practices tailored to researchers’ needs.</p>]]></summary>  <dateline>2025-11-03T00:00:00-05:00</dateline>  <iso_dateline>2025-11-03T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-11-03 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Bryant Wine, Communications Officer<br><a href="mailto:bryant.wine@cc.gatech.edu">bryant.wine@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678546</item>      </media>  <hg_media>          <item>          <nid>678546</nid>          <type>image</type>          <title><![CDATA[Vuduc-CSSE-Director.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Vuduc-CSSE-Director.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/11/05/Vuduc-CSSE-Director.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/11/05/Vuduc-CSSE-Director.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/11/05/Vuduc-CSSE-Director.jpg?itok=FlGBpo2o]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Rich Vuduc CSSE Director]]></image_alt>                    <created>1762351373</created>          <gmt_created>2025-11-05 14:02:53</gmt_created>          <changed>1762351373</changed>          <gmt_changed>2025-11-05 14:02:53</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.cc.gatech.edu/news/new-software-center-director-lead-next-wave-scientific-discovery]]></url>        <title><![CDATA[New 
Software Center Director to Lead Next Wave of Scientific Discovery]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50877"><![CDATA[School of Computational Science and Engineering]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="172288"><![CDATA[School of Computational Science Engineering]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="183717"><![CDATA[Center for Research into Novel Computing Hierarchies]]></keyword>          <keyword tid="15030"><![CDATA[high-performance computing]]></keyword>          <keyword tid="170965"><![CDATA[software engineering]]></keyword>          <keyword tid="194841"><![CDATA[Center for Scientific Software Engineering]]></keyword>      </keywords>  <core_research_areas>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686843">  <title><![CDATA[NSF Grant Funds Protein Research for Drug Discovery and Personalized Medicine]]></title>  <uid>36319</uid>  
<body><![CDATA[<p>Proteins, including antibodies, hemoglobin, and insulin, power nearly every vital aspect of life. Breakthroughs in protein research are producing vaccines, resilient crops, bioenergy sources, and other innovative technologies.</p><p>Despite their importance, most of what scientists know about proteins only comes from a small sample size. This stands in the way of fully understanding how most proteins work and unlocking their full potential.</p><p>Georgia Tech’s <a href="https://faculty.cc.gatech.edu/~yunan/">Yunan Luo</a> believes artificial intelligence (AI) could fill this knowledge gap. The National Science Foundation agrees. Luo is the recipient of an NSF Faculty Early Career Development (<a href="https://www.nsf.gov/funding/opportunities/career-faculty-early-career-development-program">CAREER</a>) award.&nbsp;</p><p>“So much of biology depends on knowing what proteins do, but decades of research have concentrated on a relatively small set of well-studied proteins. This imbalance in scientific attention leads to a distorted view of the biological landscape that&nbsp;quietly shapes our data and our algorithms,” Luo said.</p><p>“My group’s goal is to build machine learning (ML) models that actively close this gap by generating trustworthy&nbsp;function predictions for the many proteins that remain understudied.”</p><p>[Related: <a href="https://www.cc.gatech.edu/news/faculty-use-ai-protein-design-and-discovery-support-18-million-nih-grant">Yunan Luo to use AI for Protein Design and Discovery with Support of $1.8 Million NIH Grant</a>]</p><p>In his <a href="https://www.nsf.gov/awardsearch/show-award/?AWD_ID=2442063&amp;HistoricalAwards=false">proposal to NSF</a>, Luo coined this rich-get-richer effect “annotation inequality.”&nbsp;</p><p>One problem of annotation inequality is that it slows progress in disease prognosis, drug discovery, and other critical biomedical areas. 
It is difficult to keep innovating with the few proteins that scientists already know so much about.&nbsp;</p><p>A cascading effect of annotation inequality is that it diminishes the effectiveness of studying proteins with&nbsp;AI. &nbsp;</p><p>AI methods learn from existing experimental data. Datasets skewed toward well-known proteins propagate and become entrenched in models. Over time, this makes it harder for computers to research understudied proteins.&nbsp;</p><p>“Protein annotation inequality creates an effect analogous to a vast library where 95% of patrons only read the top 5% popular books, leaving the rest of the collection to gather dust,” Luo said.</p><p>“This has resulted in knowledge disparities across proteins in current literature and databases, biasing our understanding of protein functions.”</p><p>The NSF CAREER award will fund Luo with over $770,000 for the next five years to tackle head-on the problem of protein annotation inequality.</p><p>Luo will use the grant to build an accurate, unbiased protein function prediction framework at scale. His project aims to:</p><ul><li>Reveal how annotation inequality affects protein function prediction systems</li><li>Create ML techniques suited for biological data, which is often noisy, incomplete, and imbalanced &nbsp;</li><li>Integrate data and ML models into a scalable framework to accelerate discoveries involving understudied proteins</li></ul><p>Beyond the ML framework itself, Luo will leverage the NSF award to support educational and outreach programs. His goal is to prepare the next generation of researchers to study other challenges in computational biology, not just the annotation inequality problem.</p><p>Luo teaches graduate and undergraduate courses focused on computational biology and ML. 
Problems and methods developed through the CAREER project can be used as course material in his classes.</p><p>Luo also championed collaboration with Georgia Tech’s Center for Education Integrating Science, Mathematics, and Computing (<a href="https://www.ceismc.gatech.edu/">CEISMC</a>) in his proposal.&nbsp;</p><p>Through this partnership, local high school teachers and students would gain access to his data and models. This promotes deeper learning of biology and data science through hands-on experience with real-world tools. &nbsp;</p><p>Luo sees reaching students and the community as a way of paying forward the support he received from Georgia Tech colleagues.&nbsp;</p><p>“I am incredibly grateful for this recognition from the NSF,” said Luo, an assistant professor in the <a href="https://cse.gatech.edu/">School of Computational Science and Engineering</a> (CSE).&nbsp;</p><p>“This would not have been possible without my students and collaborators, whose hard work laid the groundwork for this proposal.”</p><p>Luo praised CSE faculty members <a href="https://faculty.cc.gatech.edu/~badityap/">B. Aditya Prakash</a>, <a href="https://xiuweizhang.wordpress.com/">Xiuwei Zhang</a>, and <a href="http://chaozhang.org/">Chao Zhang</a> for their guidance. All three study <a href="https://cse.gatech.edu/artificial-intelligence-and-machine-learning">machine learning</a> and <a href="https://cse.gatech.edu/computational-bioscience-and-biomedicine">computational bioscience</a>, two of <a href="https://cse.gatech.edu/research">CSE’s five core research areas</a>.&nbsp;</p><p>Luo also thanked <a href="https://faculty.cc.gatech.edu/~hpark/">Haesun Park</a> for her support and recommendation for the CAREER award. 
Park is a Regents’ Professor and the chair of the School of CSE.</p>]]></body>  <author>Bryant Wine</author>  <status>1</status>  <created>1765385842</created>  <gmt_created>2025-12-10 16:57:22</gmt_created>  <changed>1767965851</changed>  <gmt_changed>2026-01-09 13:37:31</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Yunan Luo is the recipient of an NSF Faculty Early Career Development (CAREER) award to use artificial intelligence to solve the protein annotation inequality problem.]]></teaser>  <type>news</type>  <sentence><![CDATA[Yunan Luo is the recipient of an NSF Faculty Early Career Development (CAREER) award to use artificial intelligence to solve the protein annotation inequality problem.]]></sentence>  <summary><![CDATA[<p>Proteins, including antibodies, hemoglobin, and insulin, power nearly every vital aspect of life. Breakthroughs in protein research are producing vaccines, resilient crops, bioenergy sources, and other innovative technologies.</p><p>Despite their importance, most of what scientists know about proteins only comes from a small sample size. This stands in the way of fully understanding how most proteins work and unlocking their full potential.</p><p>Georgia Tech’s <a href="https://faculty.cc.gatech.edu/~yunan/">Yunan Luo</a> believes artificial intelligence (AI) could fill this knowledge gap. The National Science Foundation agrees. 
Luo is the recipient of an NSF Faculty Early Career Development (<a href="https://www.nsf.gov/funding/opportunities/career-faculty-early-career-development-program">CAREER</a>) award.&nbsp;</p>]]></summary>  <dateline>2025-12-10T00:00:00-05:00</dateline>  <iso_dateline>2025-12-10T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-12-10 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Bryant Wine, Communications Officer<br><a href="mailto:bryant.wine@cc.gatech.edu">bryant.wine@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678817</item>          <item>678818</item>      </media>  <hg_media>          <item>          <nid>678817</nid>          <type>image</type>          <title><![CDATA[Yunan-Luo-NSF-CAREER_1.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Yunan-Luo-NSF-CAREER_1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/12/10/Yunan-Luo-NSF-CAREER_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/12/10/Yunan-Luo-NSF-CAREER_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/12/10/Yunan-Luo-NSF-CAREER_1.jpg?itok=La5LFMII]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Yunan Luo NSF CAREER Award]]></image_alt>                    <created>1765385865</created>          <gmt_created>2025-12-10 16:57:45</gmt_created>          <changed>1765385865</changed>          <gmt_changed>2025-12-10 16:57:45</gmt_changed>      </item>          <item>          <nid>678818</nid>          <type>image</type>          <title><![CDATA[Yunan-Luo-NSF-CAREER_2.jpg]]></title>          <body><![CDATA[]]></body>    
                  <image_name><![CDATA[Yunan-Luo-NSF-CAREER_2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/12/10/Yunan-Luo-NSF-CAREER_2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/12/10/Yunan-Luo-NSF-CAREER_2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/12/10/Yunan-Luo-NSF-CAREER_2.jpg?itok=ZVW74YH1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Yunan Luo NSF CAREER Award]]></image_alt>                    <created>1765385967</created>          <gmt_created>2025-12-10 16:59:27</gmt_created>          <changed>1765385967</changed>          <gmt_changed>2025-12-10 16:59:27</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.cc.gatech.edu/news/nsf-grant-funds-protein-research-drug-discovery-and-personalized-medicine]]></url>        <title><![CDATA[NSF Grant Funds Protein Research for Drug Discovery and Personalized Medicine]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50877"><![CDATA[School of Computational Science and Engineering]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term 
tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="166983"><![CDATA[School of Computational Science and Engineering]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="9167"><![CDATA[machine learning]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="362"><![CDATA[National Science Foundation]]></keyword>          <keyword tid="191934"><![CDATA[National Science Foundation (NSF)]]></keyword>          <keyword tid="170447"><![CDATA[Institute for Data Engineering and Science]]></keyword>          <keyword tid="176858"><![CDATA[machine learning center]]></keyword>          <keyword tid="173894"><![CDATA[ML@GT]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686984">  <title><![CDATA[Community and Collaboration Shape the Class of 
2025]]></title>  <uid>36319</uid>  <body><![CDATA[<p>Just as it takes a village to raise a child, it takes a community of faculty, mentors, research collaborators, and staff to raise a Georgia Tech graduate.</p><p>The Yellow Jacket community swarmed campus for the final time of the fall semester to celebrate Commencement ceremonies held Dec. 11 to 13. Graduates from the School of Computational Science and Engineering (CSE) were among the 7,177 new alumni “getting out” of Tech. &nbsp; &nbsp;&nbsp;</p><p>“We are immensely proud of School of CSE and CSE programs graduates in the Class of 2025,” said Haesun Park, Regents’ Professor and Chair of the School of CSE.</p><p>“Our collaborative approach to CSE education has prepared these graduates to attain roles in academia, national labs, industry, government, and beyond, where they will lead the next generation of interdisciplinary research.”</p><p>Along with administering its flagship CSE Ph.D. and M.S. CSE programs, the School of CSE offers doctoral degrees in computer science and machine learning. Ph.D. graduates who received their diplomas and doctoral hoods on Dec. 11 at McCamish Pavilion included:</p><ul><li><a href="https://www.linkedin.com/in/grantbruer">Grant Bruer</a> (Ph.D. CSE-CSE 2025), advised by School of CSE Professor and Associate Chair Edmond Chow</li><li><a href="https://www.jinchoi.xyz/">Dongjin Choi</a> (Ph.D. CSE-CSE 2025), advised by School of CSE Regents’ Professor and Chair Haesun Park</li><li><a href="https://ae.gatech.edu/event/2023/06/27/phd-proposal-hyungu-choi">Hyungu Choi</a> (Ph.D. CSE-AE 2025), advised by Daniel Guggenheim School of Aerospace Engineering Regents’ Professor Dimitri Mavris</li><li><strong>Maxfield Comstock</strong> (Ph.D. CSE-CSE 2025), advised by Elizabeth Cherry, College of Computing Associate Dean for Graduate Education and School of CSE Associate Professor</li><li><a href="https://dilab.gatech.edu/andrew-hornback/">Andrew Hornback</a> (Ph.D. 
CS-CSE 2025), co-advised by School of CSE Assistant Professor Yunan Luo and Wallace H. Coulter Department of Biomedical Engineering Professor May Wang</li><li><a href="https://grad.gatech.edu/events/phd-defense-ayush-jain">Ayush Jain</a> (Ph.D. CSE-MSE 2025), advised by School of Materials Science and Engineering Regents’ Entrepreneur and Professor Rampi Ramprasad</li><li><a href="https://www.linkedin.com/in/anurendk/">Anurendra Kumar</a> (Ph.D. CS-CSE 2025), co-advised by School of CSE J.Z. Liang Early Career Associate Professor Xiuwei Zhang and Wallace H. Coulter Department of Biomedical Engineering Professor Saurabh Sinha</li><li><a href="https://jxie1997.github.io/">Jiajia Xie</a> (Ph.D. CSE-BME 2025), advised by Wallace H. Coulter Department of Biomedical Engineering Associate Professor Cassie Mitchell</li><li><a href="https://night-chen.github.io/">Yuchen Zhuang</a> (Ph.D. ML-CSE 2025), advised by School of CSE Edenfield Early Career Associate Professor Chao Zhang</li><li><a href="https://peterzzq.github.io/">Ziqi Zhang</a> (Ph.D. CSE-CSE 2025), advised by School of CSE J.Z. Liang Early Career Associate Professor Xiuwei Zhang</li></ul><p>Seven CSE Ph.D. students completed M.S. degrees this fall and will continue their studies at Georgia Tech. They are:</p><ul><li><a href="https://www.linkedin.com/in/jesusarias9/">Jesus Arias</a> (M.S. CSE-CSE 2025), advised by School of CSE Assistant Professor Spencer Bryngelson</li><li><a href="https://www.linkedin.com/in/isabel-berry/">Isabel Berry</a> (M.S. CSE-CHEM 2025), advised by Regents’ Professor C. David Sherrill, who is jointly appointed with the School of Chemistry and Biochemistry and the School of CSE</li><li><a href="https://maxhawkins.info/">Max Hawkins</a> (M.S. CSE-CSE 2025), co-advised by School of CSE Professor Rich Vuduc and Assistant Professor Spencer Bryngelson</li><li><a href="https://www.linkedin.com/in/xiao-jing-738641a3/">Xiao Jing</a> (M.S. 
CSE-AE 2025), advised by Daniel Guggenheim School of Aerospace Engineering Regents’ Professor Dimitri Mavris</li><li><a href="https://haoyunli.wordpress.com/">Haoyun Li</a> (M.S. CSE-CSE 2025), advised by Professor Felix Herrmann, who is jointly appointed with the Schools of Earth and Atmospheric Sciences, Electrical and Computer Engineering, and CSE</li><li><a href="https://www.linkedin.com/in/yuan-qiu-a47404227/">Yuan Qiu</a> (M.S. CSE-CSE 2025), advised by School of CSE Assistant Professor Peng Chen</li><li><a href="https://www.linkedin.com/in/william-schertzer/">William Schertzer</a> (M.S. CSE-MSE 2025), advised by School of Materials Science and Engineering Regents’ Entrepreneur and Professor Rampi Ramprasad</li></ul><p>Georgia Tech’s CSE graduate program includes 12 schools and departments participating as home units. These home units represent the colleges of Computing, Engineering, and Sciences. This approach facilitates an immersive, interdisciplinary experience in which students study computational approaches within domain fields.</p><p>Georgia Tech jointly celebrated master’s graduates at a ceremony on Dec. 13 at Bobby Dodd Stadium. After the Institute celebration, graduates were recognized during ceremonies held by their respective colleges.</p><p>Mawutor Kofi Amanfu (M.S. CSE 2025)</p><p>Sunyoung An (M.S. CSE 2025)</p><p>Nischal Bandi (M.S. CSE 2025)</p><p>Elijah Bellamy (M.S. CSE 2025)</p><p>Meiwen Bi (M.S. CSE 2025)</p><p>Hao-Cheng Chang (M.S. CSE 2025)</p><p>Tianyu Chen (M.S. CSE 2025)</p><p>Yilong Chen (M.S. CSE 2025)</p><p>Zhiyu Chen (M.S. CSE 2025)</p><p>Seung Eun Choi (M.S. CSE 2025)</p><p>Vinodhini Comandur (M.S. CSE 2025)</p><p>Zhiyi Dai (M.S. CSE 2025)</p><p>Alejandro Danies-Lopez (M.S. CSE 2025)</p><p>Zixing Fan (M.S. CSE 2025)</p><p>Stefan Faulkner (M.S. CSE 2025)</p><p>Mihiri Fernando (M.S. CSE 2025)</p><p>Alexandra Freeman (M.S. CSE 2025)</p><p>Yuhan Fu (M.S. CSE 2025)</p><p>Jack Ganem (M.S. CSE 2025)</p><p>Omar Atef Garib (M.S. 
CSE 2025)</p><p>Martin Graffigna (M.S. CSE 2025)</p><p>Bochun Guo (M.S. CSE 2025)</p><p>Moyi Guo (M.S. CSE 2025)</p><p>Xinyu Guo (M.S. CSE 2025)</p><p>Yuqi Han (M.S. CSE 2025)</p><p>Tianyang Hu (M.S. CSE 2025)</p><p>Mingzheng Huang (M.S. CSE 2025)</p><p>Po-Han Huang (M.S. CSE 2025)</p><p>Wentao Jiang (M.S. CSE 2025)</p><p>Boxiao Jin (M.S. CSE 2025)</p><p>William-Michael Johnson (M.S. CSE 2025)</p><p>Garyoung Lee (M.S. CSE 2025)</p><p>Tzu Jung Lee (M.S. CSE 2025)</p><p>Congyan Li (M.S. CSE 2025)</p><p>Peiru Li (M.S. CSE 2025)</p><p>Yuhan Li (M.S. CSE 2025)</p><p>Zhiyun Liang (M.S. CSE 2025)</p><p>Yuexi Liao (M.S. CSE 2025)</p><p>Chenyu Liu (M.S. CSE 2025)</p><p>Honglin Liu (M.S. CSE 2025)</p><p>Shuojiang Liu (M.S. CSE 2025)</p><p>Xuanzhang Liu (M.S. CSE 2025)</p><p>Yue Lu (M.S. CSE 2025)</p><p>Fang Lunt (M.S. CSE 2025)</p><p>Jinrui Ma (M.S. CSE 2025)</p><p>Yu Miao (M.S. CSE 2025)</p><p>Hui-Chun Mo (M.S. CSE 2025)</p><p>Prajwal Kumar (M.S. CSE 2025)</p><p>Kavya Krishnan (M.S. CSE 2025)</p><p>Felicity Nielson (M.S. CSE 2025)</p><p>Jonathan Perng (M.S. CSE 2025)</p><p>Yinzhu Quan (M.S. CSE 2025)</p><p>Devanshi Shah (M.S. CSE 2025)</p><p>Yuxuan Shen (M.S. CSE 2025)</p><p>Steven Stewart (M.S. CSE 2025)</p><p>Linjun Su (M.S. CSE 2025)</p><p>Jingyun Sun (M.S. CSE 2025)</p><p>Abdul Rehman Tariq (M.S. CSE 2025)</p><p>Yu Chu Tsai (M.S. CSE 2025)</p><p>Xunzhi Wen (M.S. CSE 2025)</p><p>Jinghua Weng (M.S. CSE 2025)</p><p>Andi Xia (M.S. CSE 2025)</p><p>Zihao Xiao (M.S. CSE 2025)</p><p>Yunxiang Yan (M.S. CSE 2025)</p><p>Ziyuan Ye (M.S. CSE 2025)</p><p>Linyuan Yu (M.S. CSE 2025)</p><p>Bingqing Zhang (M.S. CSE 2025)</p><p>Tiankuo Zhang (M.S. CSE 2025)</p><p>Yu Zheng (M.S. CSE 2025)</p><p>Boye Zhou (M.S. CSE 2025)</p><p>Xinjie Zhu (M.S. CSE 2025)</p><p>Zilu Zhu (M.S. 
CSE 2025)</p>]]></body>  <author>Bryant Wine</author>  <status>1</status>  <created>1766069802</created>  <gmt_created>2025-12-18 14:56:42</gmt_created>  <changed>1766069855</changed>  <gmt_changed>2025-12-18 14:57:35</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The Yellow Jacket community swarmed campus for the final time of the fall semester to celebrate Commencement ceremonies held Dec. 11 to 13. Graduates from the School of Computational Science and Engineering (CSE) were among the 7,177 new alumni “getting o]]></teaser>  <type>news</type>  <sentence><![CDATA[The Yellow Jacket community swarmed campus for the final time of the fall semester to celebrate Commencement ceremonies held Dec. 11 to 13. Graduates from the School of Computational Science and Engineering (CSE) were among the 7,177 new alumni “getting o]]></sentence>  <summary><![CDATA[<p>Just as it takes a village to raise a child, it takes a community of faculty, mentors, research collaborators, and staff to raise a Georgia Tech graduate.</p><p>The Yellow Jacket community swarmed campus for the final time of the fall semester to celebrate Commencement ceremonies held Dec. 11 to 13. Graduates from the School of Computational Science and Engineering (CSE) were among the 7,177 new alumni “getting out” of Tech. 
&nbsp; &nbsp;&nbsp;</p>]]></summary>  <dateline>2025-12-18T00:00:00-05:00</dateline>  <iso_dateline>2025-12-18T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-12-18 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Bryant Wine, Communications Officer<br><a href="mailto:bryant.wine@cc.gatech.edu">bryant.wine@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678889</item>      </media>  <hg_media>          <item>          <nid>678889</nid>          <type>image</type>          <title><![CDATA[Fall-2025-Masters-Commencement.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Fall-2025-Masters-Commencement.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/12/18/Fall-2025-Masters-Commencement.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/12/18/Fall-2025-Masters-Commencement.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/12/18/Fall-2025-Masters-Commencement.jpg?itok=I1BlTgvW]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Fall 2025 College of Computing Masters Commencement]]></image_alt>                    <created>1766069812</created>          <gmt_created>2025-12-18 14:56:52</gmt_created>          <changed>1766069812</changed>          <gmt_changed>2025-12-18 14:56:52</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.cc.gatech.edu/news/community-and-collaboration-shape-class-2025]]></url>        <title><![CDATA[Community and Collaboration Shape the Class of 2025]]></title>      </link>      </related>  <files>      </files>  <groups>         
 <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50877"><![CDATA[School of Computational Science and Engineering]]></group>      </groups>  <categories>          <category tid="130"><![CDATA[Alumni]]></category>          <category tid="42901"><![CDATA[Community]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="129"><![CDATA[Institute and Campus]]></category>      </categories>  <news_terms>          <term tid="130"><![CDATA[Alumni]]></term>          <term tid="42901"><![CDATA[Community]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="129"><![CDATA[Institute and Campus]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="166983"><![CDATA[School of Computational Science and Engineering]]></keyword>      </keywords>  <core_research_areas>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686905">  <title><![CDATA[Georgia Tech Researchers Make Waves at the World’s Largest Neuroscience Conference]]></title>  <uid>35575</uid>  <body><![CDATA[<div><p>Imagine stepping into a space the size of multiple football fields — only instead of turf and goalposts, it’s filled with science. Every inch is alive with posters, equipment demos, and researchers sharing the latest breakthroughs.&nbsp;&nbsp;</p></div><div><p>Welcome to the Society for Neuroscience (SfN) Conference, one of the largest scientific gatherings in the world, drawing more than 30,000 attendees to San Diego in November. 
According to <a href="https://neuro.gatech.edu/user/1105" rel="noreferrer noopener" target="_blank">Annabelle Singer</a>, it is <em>the</em> place to be for neuroscientists. “If you want to know what is going on now in neuroscience, it is being talked about at SfN.”&nbsp;</p></div><div><p>Singer is a McCamish Foundation Early Career Professor in the Wallace H. <a href="https://bme.gatech.edu/" rel="noreferrer noopener" target="_blank">Coulter Department of Biomedical Engineering</a> (BME) at Georgia Tech and Emory University. A frequent SfN attendee, she describes the meeting as “Dragon Con for neuroscience, with thousands of talks and posters going on simultaneously.”&nbsp;</p></div><div><p>This year, Georgia Tech didn’t just show up — it made a statement with more than <a href="https://public.tableau.com/views/Neuroscience2025/main?:showVizHome=no" rel="noreferrer noopener" target="_blank">60 presentations</a>, a major outreach award, and a spotlight press conference.&nbsp;</p></div><div><p>“Seeing Georgia Tech and INNS represented so strongly at SfN is exciting,” says <a href="https://ece.gatech.edu/directory/christopher-john-rozell" rel="noreferrer noopener" target="_blank">Chris Rozell</a>, executive director of Tech’s <a href="https://neuro.gatech.edu/" rel="noreferrer noopener" target="_blank">Institute for Neuroscience, Neurotechnology, and Society</a> (INNS). “It reflects the incredible breadth of neuroscience and neurotechnology research happening across our campus and how our work is shaping conversations at the highest level.”&nbsp;</p></div><div><h3><strong>Inside ‘Neuroscience Dragon Con’</strong>&nbsp;</h3></div><div><p>Many conferences center around structured lectures, but at SfN, posters are the heart. You might find a senior researcher presenting groundbreaking findings right next to a first-time attendee sharing early results. This diversity is what makes the experience so valuable, says Singer. 
“Trainees get to talk directly with the scientist doing the work to get their questions answered, from wondering about future implications to clarifying technical details.”&nbsp;</p></div><div><p>The scale of SfN can feel overwhelming, but for many, that’s part of the excitement. “There are so many different posters from so many different fields. It’s a lot to absorb, but it’s all very interesting,” said Benjamin Magondu, a biomedical engineering Ph.D. student presenting for the first time. “I’ve definitely learned at least 47 things by just walking 10 feet.”&nbsp;</p></div><div><p>For students like Magondu, the experience is critical, says <a href="https://biosciences.gatech.edu/" rel="noreferrer noopener" target="_blank">Biological Sciences</a> Assistant Professor <a href="https://biosciences.gatech.edu/people/farzaneh-najafi" rel="noreferrer noopener" target="_blank">Farzaneh Najafi</a>. “SfN has such a big scope, all the way from molecular to cognitive and computational systems. Especially for those deciding which direction of neuroscience they want to go into, it’s invaluable.”&nbsp;</p></div><div><p>That breadth also fosters connections across disciplines. “Conferences are usually pretty niche,” noted Tina Franklin, a research scientist in BME. “You have your own field that you’re really good at, but it’s difficult to venture out and find new people who can help you figure out what comes next. This conference brings people from all different fields together with the common interest of neuroscience and brain research.”&nbsp;</p></div><div><h3><strong>Leading the Charge</strong>&nbsp;</h3></div><div><p>Georgia Tech’s impact went beyond the conference floor. 
<a href="https://research.gatech.edu/people/ming-fai-fong" rel="noreferrer noopener" target="_blank">Ming-fai Fong</a>, an assistant professor in BME, received the prestigious Next Generation Award, one of SfN’s <a href="https://www.sfn.org/publications/latest-news/2025/11/03/society-for-neuroscience-2025-education-and-outreach-awards" rel="noreferrer noopener" target="_blank">education and outreach awards</a>. The honor recognizes members who make outstanding contributions to public communication and education about neuroscience.&nbsp;&nbsp;</p></div><div><p>“I’m certainly very grateful to the Society for Neuroscience for recognizing these types of contributions,” says Fong, who was recognized for her work supporting blind and visually impaired youth in Atlanta. “Rewarding outreach efforts reinforces my core belief that scientists and engineers can make an immediate impact on communities we care about through outreach. It’s a great parallel avenue to making a positive impact through research.”&nbsp;</p></div><div><p>Building on this recognition, Georgia Tech was in the spotlight during one of SfN’s selective press conferences — a session on <a href="https://www.the-scientist.com/ai-tools-unravel-thoughts-actions-and-neuronal-makeup-73779" rel="noreferrer noopener" target="_blank">artificial intelligence in neuroscience</a> moderated by Rozell, who is also the Julian T. Hightower Chair in the <a href="https://ece.gatech.edu/" rel="noreferrer noopener" target="_blank">School of Electrical and Computer Engineering</a>.&nbsp;</p></div><div><p>During the SfN press event, <a href="https://med.emory.edu/directory/profile/?u=TKESAR" rel="noreferrer noopener" target="_blank">Trisha Kesar,</a> an associate professor in BME and adjunct faculty in the School of Biological Sciences, presented her research using AI to improve gait rehabilitation. 
Her work was among just 40 abstracts selected from more than 10,000 submissions for this honor, and one of five featured in the AI in neuroscience press conference. The project is a collaboration with <a href="https://bme.gatech.edu/bio/hyeokhyen-kwon" rel="noreferrer noopener" target="_blank">Hyeokhyen Kwon</a>, a Georgia Tech computer science alumnus and an assistant professor in BME.&nbsp;</p></div><div><p>“It’s exciting to see Georgia Tech and Atlanta emerging as hubs for neuroscience innovation,” said Kesar. “Being part of a press conference on AI in neuroscience shows how much our community is contributing to the future of brain research, and how collaboration across institutions can accelerate progress.”&nbsp;</p></div>]]></body>  <author>adavidson38</author>  <status>1</status>  <created>1765902318</created>  <gmt_created>2025-12-16 16:25:18</gmt_created>  <changed>1765917246</changed>  <gmt_changed>2025-12-16 20:34:06</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[With more than 60 presentations and recognition for neuroscience outreach and AI research, Georgia Tech demonstrated its growing impact at the 2025 Society for Neuroscience’s annual meeting.]]></teaser>  <type>news</type>  <sentence><![CDATA[With more than 60 presentations and recognition for neuroscience outreach and AI research, Georgia Tech demonstrated its growing impact at the 2025 Society for Neuroscience’s annual meeting.]]></sentence>  <summary><![CDATA[<p>With more than 60 presentations and recognition for neuroscience outreach and AI research, Georgia Tech demonstrated its growing impact at the 2025 Society for Neuroscience’s annual meeting.</p>]]></summary>  <dateline>2025-12-16T00:00:00-05:00</dateline>  <iso_dateline>2025-12-16T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-12-16 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[audra.davidson@research.gatech.edu]]></email>  
<location></location>  <contact><![CDATA[<p><strong>Writer and media contact:</strong><br><a href="mailto:audra.davidson@research.gatech.edu">Audra Davidson</a><br>Research Communications Manager<br>Institute for Neuroscience, Neurotechnology, and Society (INNS)</p><p><strong>Presenter Dashboard:</strong><br>Created by <a href="mailto:jpreston7@gatech.edu">Joshua Preston</a>, Communications Manager, College of Computing<br>Data collection by Audra Davidson, Hunter Ashcraft</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678854</item>          <item>678856</item>          <item>678855</item>          <item>678857</item>      </media>  <hg_media>          <item>          <nid>678854</nid>          <type>image</type>          <title><![CDATA[1763342998142_viaSfN.jpeg]]></title>          <body><![CDATA[<p>Affectionally called "DragonCon for neuroscience," the annual Society for Neuroscience meeting is one of the largest academic conferences in the world.</p>]]></body>                      <image_name><![CDATA[1763342998142_viaSfN.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/12/16/1763342998142_viaSfN.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/12/16/1763342998142_viaSfN.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/12/16/1763342998142_viaSfN.jpeg?itok=sv-n4A7F]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Affectionally called "DragonCon for neuroscience," the annual Society for Neuroscience meeting is one of the largest academic conferences in the world.]]></image_alt>                    <created>1765903757</created>          <gmt_created>2025-12-16 16:49:17</gmt_created>          <changed>1765903757</changed>          <gmt_changed>2025-12-16 
16:49:17</gmt_changed>      </item>          <item>          <nid>678856</nid>          <type>image</type>          <title><![CDATA[IMG_6535-2.png]]></title>          <body><![CDATA[<p>Benjamin Magondu, a graduate student in biomedical engineering, presented at SfN for the first time this year.</p>]]></body>                      <image_name><![CDATA[IMG_6535-2.png]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/12/16/IMG_6535-2.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/12/16/IMG_6535-2.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/12/16/IMG_6535-2.png?itok=gQ7LIvDV]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Benjamin Magondu, a graduate student in biomedical engineering, presented at SfN for the first time this year.]]></image_alt>                    <created>1765903975</created>          <gmt_created>2025-12-16 16:52:55</gmt_created>          <changed>1765903975</changed>          <gmt_changed>2025-12-16 16:52:55</gmt_changed>      </item>          <item>          <nid>678855</nid>          <type>image</type>          <title><![CDATA[IMG_6838.png]]></title>          <body><![CDATA[<p>With hundreds of presentations happening simultaneously, the poster floor can be overwhelming at SfN — but for many, that's part of the draw.</p>]]></body>                      <image_name><![CDATA[IMG_6838.png]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/12/16/IMG_6838.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/12/16/IMG_6838.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/12/16/IMG_6838.png?itok=twXTeCI_]]></image_740>            
<image_mime>image/png</image_mime>            <image_alt><![CDATA[With hundreds of presentations happening simultaneously, the poster floor can be overwhelming at SfN — but for many, that's part of the draw.]]></image_alt>                    <created>1765903880</created>          <gmt_created>2025-12-16 16:51:20</gmt_created>          <changed>1765903880</changed>          <gmt_changed>2025-12-16 16:51:20</gmt_changed>      </item>          <item>          <nid>678857</nid>          <type>image</type>          <title><![CDATA[IMG_6748-2.png]]></title>          <body><![CDATA[<p>Trisha Kesar answers a question during the SfN press conference on AI in neuroscience, moderated by Chris Rozell.</p>]]></body>                      <image_name><![CDATA[IMG_6748-2.png]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/12/16/IMG_6748-2.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/12/16/IMG_6748-2.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/12/16/IMG_6748-2.png?itok=GGKYaHzb]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Trisha Kesar answers a question during the SfN press conference on AI in neuroscience, moderated by Chris Rozell.]]></image_alt>                    <created>1765904071</created>          <gmt_created>2025-12-16 16:54:31</gmt_created>          <changed>1765904071</changed>          <gmt_changed>2025-12-16 16:54:31</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://neuro.gatech.edu/georgia-tech-uses-computing-and-engineering-methods-shift-neuroscience-paradigms]]></url>        <title><![CDATA[Georgia Tech Uses Computing and Engineering Methods to Shift Neuroscience Paradigms]]></title>      </link>          <link>        
<url><![CDATA[https://www.the-scientist.com/ai-tools-unravel-thoughts-actions-and-neuronal-makeup-73779]]></url>        <title><![CDATA[Inside the SfN Press Conference: AI Tools Unravel Thoughts, Actions, and Neuronal Makeup]]></title>      </link>          <link>        <url><![CDATA[https://neuro.gatech.edu/head-toe-georgia-tech-researchers-treat-entire-human-body-through-neuroscience-research]]></url>        <title><![CDATA[Head to Toe: Georgia Tech Researchers Treat the Entire Human Body Through Neuroscience Research]]></title>      </link>          <link>        <url><![CDATA[https://www.flickr.com/photos/202927865@N06/albums/72177720330951882/]]></url>        <title><![CDATA[Georgia Tech at SfN in Photos]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="66220"><![CDATA[Neuro]]></group>          <group id="1292"><![CDATA[Parker H. Petit Institute for Bioengineering and Bioscience (IBB)]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="1275"><![CDATA[School of Biological Sciences]]></group>          <group id="443951"><![CDATA[School of Psychology]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term 
tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="172970"><![CDATA[go-neuro]]></keyword>          <keyword tid="187423"><![CDATA[go-bio]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="193656"><![CDATA[Neuro Next Initiative]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686652">  <title><![CDATA[Record-Breaking Simulation Boosts Rocket Science and Supercomputing to New Limits]]></title>  <uid>36319</uid>  <body><![CDATA[<p>Spaceflight is becoming safer, more frequent, and more sustainable thanks to the largest computational fluid flow simulation ever run on Earth.</p><p>Inspired by SpaceX’s Super Heavy booster, a team led by Georgia Tech’s&nbsp;<a href="https://comp-physics.group/"><strong>Spencer Bryngelson</strong></a> and New York University’s <strong>Florian Schäfer</strong> modeled the turbulent interactions of a 33-engine rocket. Their experiment set new records, running the largest-ever fluid dynamics simulation by a factor of 20 and the fastest by over a factor of four.</p><p>The team ran its custom software on the world’s two fastest supercomputers, as well as the eighth fastest, to construct such a massive model.</p><p>Applications from the simulation reach beyond rocket science. The same computing methods can model fluid mechanics in aerospace, medicine, energy, and other fields. At the same time, the work advances understanding of the current limits and future potential of computing.&nbsp;</p><p>The team finished as runners-up for the 2025 Gordon Bell Prize for its impactful, multi-domain research.
Referred to as the Nobel Prize of supercomputing, the award was presented at the world’s top conference for high-performance computing (HPC) research.</p><p>“Fluid dynamics problems of this style, with shocks, turbulence, different interacting fluids, and so on, are a scientific mainstay that marshals our largest supercomputers,” said Bryngelson, an assistant professor with the School of Computational Science and Engineering (CSE).</p><p>“Larger and faster simulations that enable solutions to long-standing scientific problems, like the rocket propulsion problem, are always needed. With our work, perhaps we took a big dent out of that issue.”</p><p>The Super Heavy booster reflects the space industry’s move toward reusable multi-engine first-stage rockets that are easier to transport and more economical overall.&nbsp;</p><p>However, this shift creates research and testing challenges for new designs.</p><p>Each of Super Heavy’s 33 thrusters expels propellant at ten times the speed of sound. As individual engines reach extreme temperatures, pressures, and densities, their combined interactions with the airframe make such violent physics even more unpredictable.</p><p>Frequent physical experiments would be expensive and risky, so scientists rely on computer models to supplement the engineering process.&nbsp;</p><p>Bryngelson’s flagship&nbsp;<a href="https://mflowcode.github.io/">Multicomponent Flow Code (MFC)</a> software anchored the experiment. MFC is an open-source computer program that simulates fluid dynamic models. Bryngelson’s lab has been modifying MFC since 2022 to run on more powerful computers and solve larger problems.&nbsp;</p><p>In computing terms, this MFC-enhanced model simulated fluid flow resolution at 200 trillion grid points and one quadrillion degrees of freedom. 
These metrics exceeded previous record-setting benchmarks that tallied 10 trillion and 30 trillion grid points.</p><p>This means MFC simulations provide greater detail and capture smaller-scale features than previous approaches. The rocket simulation also ran four times faster and achieved 5.7 times the energy efficiency of comparable methods.&nbsp; &nbsp;</p><p>Integrating&nbsp;<a href="https://arxiv.org/abs/2505.07392">information geometric regularization (IGR)</a> into MFC played a key role in attaining these results. This new approach improved the simulation’s computational efficiency and overcame the challenge of shock dynamics.</p><p>In fluid mechanics, shock waves occur when objects move faster than the speed of sound. Along with hampering the performance of airframes and propulsion systems, shocks have historically been difficult to simulate.</p><p>Computational scientists have used empirical models based on artificial viscosity to account for shocks. Although these approaches mimic the physical effects of shock waves at the microscopic scale, they struggle to effectively capture the large-scale features of the flow.&nbsp;</p><p>Information geometry uses curved spaces to study concepts of statistics and information. IGR uses these tools to modify the underlying geometry in fluid dynamics equations. When traveling in the modified geometry, fluid in the model preserves the shocks in a more natural way.&nbsp;</p><p>“When regularizing shocks to much larger scales relevant in these numerical simulations, conventional methods smear out important fine-scale details,” said Schäfer, an assistant professor at NYU’s Courant Institute of Mathematical Sciences.</p><p>“IGR introduces ideas from abstract math to CFD that allow creating modified paths that approach the singularity without ever reaching it. 
In the resulting fluid flow, shocks never become too spiky in simulations, but the fine-scale details do not smear out either.”&nbsp;</p><p>Simulating a model this large required the Georgia Tech researchers to run MFC on El Capitan and Frontier, the world's two fastest supercomputers.&nbsp;</p><p>The systems are two of four exascale machines in existence. This means they can perform at least one quintillion (“1” followed by 18 zeros) calculations per second. If a person completed a simple math calculation every second, it would take that person about 30 billion years to reach one quintillion operations.</p><p>Frontier is housed at Oak Ridge National Laboratory and debuted as the world’s first exascale supercomputer in 2022. El Capitan surpassed Frontier when Lawrence Livermore National Laboratory launched it in 2024.</p><p>To prepare MFC for performance on these machines, Bryngelson’s lab followed a methodical approach spanning years of hardware acquisition and software engineering.&nbsp;</p><p>In 2022,&nbsp;<a href="https://www.cc.gatech.edu/news/new-hardware-brings-students-closer-exascale-computing">Bryngelson attained an AMD MI210 GPU accelerator</a>. Optimizing MFC on the component was a critical step in preparing the software for exascale machines.</p><p>AMD hardware underpins both El Capitan and Frontier. The MI300A GPU powers El Capitan while Frontier uses the MI250X GPU.&nbsp;</p><p>After configuring MFC on the MI210 GPU,&nbsp;<a href="https://www.cc.gatech.edu/news/group-optimizes-fluid-dynamics-simulator-worlds-fastest-supercomputer">Bryngelson’s lab ran the software on Frontier for the first time during a 2023 hackathon</a>. This confirmed the code was ready for full-scale deployment on exascale supercomputers based on AMD hardware.&nbsp;</p><p>In addition to El Capitan and Frontier, the simulation ran on Alps, the world’s eighth-fastest supercomputer, based at the Swiss National Supercomputing Centre.
It is the largest available system that features the NVIDIA GH200 Grace Hopper Superchip.</p><p>As with the AMD GPUs,&nbsp;<a href="https://www.cc.gatech.edu/news/researchers-blazing-new-trails-superchip-named-after-computing-pioneer">Bryngelson acquired four GH200s in 2024</a> and began configuring MFC for the latest hardware innovation powering next-generation supercomputers. Later that year, the Jülich Research Centre accepted Bryngelson’s group into an early access program to test JUPITER, a developing supercomputer based on the NVIDIA superchip.</p><p><a href="https://www.cc.gatech.edu/news/pancaked-water-droplets-help-launch-europes-fastest-supercomputer">The group earned a certificate for scaling efficiency and node performance</a> on the way toward validating that their code worked on the GH200. The early access project proved successful for JUPITER, which launched in 2025 as Europe’s fastest supercomputer and fourth fastest in the world.</p><p>“Getting the level of hands-on experience with world-leading supercomputers and computing resources at Georgia Tech through this project has been a fantastic opportunity for a grad student,” said CSE Ph.D. student <strong>Ben Wilfong</strong>.</p><p>“To leverage these machines, I learned more advanced programming techniques that I’m glad to have in my tool belt for future projects. I also enjoyed the opportunity to work closely with and learn from industry experts from NVIDIA, AMD, and HPE/Cray.”</p><p>El Capitan, Frontier, JUPITER, and Alps maintained their rankings at the 2025 International Conference for High Performance Computing, Networking, Storage and Analysis (<a href="https://sc25.supercomputing.org/">SC25</a>). Of note, the TOP500 announced at SC25 that JUPITER surpassed the exaflop threshold.&nbsp;</p><p>The SC Conference Series is one of two venues where the&nbsp;<a href="https://top500.org/">TOP500</a> announces updated supercomputer rankings every June and November.
The TOP500 ranks and details the 500 most powerful supercomputers in the world.&nbsp;</p><p>The SC Conference Series serves as the venue where the&nbsp;<a href="https://www.acm.org/media-center/2025/november/gordon-bell-climate-2025">Association for Computing Machinery (ACM) presents the Gordon Bell Prize</a>. The annual award recognizes achievement in HPC research and application. The Tech-led team was among eight finalists for this year’s award.</p><p>Along with Bryngelson, Georgia Tech members included Ph.D. students <strong>Anand Radhakrishnan</strong> and Wilfong, postdoctoral researcher <strong>Daniel Vickers</strong>, alumnus <strong>Henry Le Berre</strong> (CS 2025), and undergraduate student <strong>Tanush Prathi</strong>.</p><p>Schäfer’s partnership with the group stems from his previous role as an assistant professor at Georgia Tech from 2021 to 2025.&nbsp;</p><p>Collaborators on the project included <strong>Nikolaos Tselepidis</strong> and <strong>Benedikt Dorschner</strong> from NVIDIA, <strong>Reuben Budiardja</strong> from ORNL, <strong>Brian Cornille</strong> from AMD, and <strong>Stephen Abbot</strong> from HPE. All were co-authors of the paper and named finalists for the Gordon Bell Prize.&nbsp;</p><p>“I’m elated that we have been nominated for such a prestigious award. 
It wouldn't have been possible without the combined and diligent efforts of our team,” Radhakrishnan said.&nbsp;</p><p>“I’m looking forward to presenting our work at SC25 and connecting with other researchers and fellow finalists while showcasing seminal work in the field of computing.”</p>]]></body>  <author>Bryant Wine</author>  <status>1</status>  <created>1764605272</created>  <gmt_created>2025-12-01 16:07:52</gmt_created>  <changed>1765225799</changed>  <gmt_changed>2025-12-08 20:29:59</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Inspired by SpaceX’s Super Heavy booster, a team led by Georgia Tech’s Spencer Bryngelson and New York University’s Florian Schäfer modeled the turbulent interactions of a 33-engine rocket. Their experiment set new records, running the largest ever fluid ]]></teaser>  <type>news</type>  <sentence><![CDATA[Inspired by SpaceX’s Super Heavy booster, a team led by Georgia Tech’s Spencer Bryngelson and New York University’s Florian Schäfer modeled the turbulent interactions of a 33-engine rocket. Their experiment set new records, running the largest ever fluid ]]></sentence>  <summary><![CDATA[<p>Spaceflight is becoming safer, more frequent, and more sustainable thanks to the largest computational fluid flow simulation ever run on Earth.</p><p>Inspired by SpaceX’s Super Heavy booster, a team led by Georgia Tech’s&nbsp;<a href="https://comp-physics.group/">Spencer Bryngelson</a> and New York University’s <strong>Florian Schäfer</strong> modeled the turbulent interactions of a 33-engine rocket. Their experiment set new records, running the largest-ever fluid dynamics simulation by a factor of 20 and the fastest by a factor of over four.</p><p>To construct such a massive model, the custom software ran on the world’s two fastest supercomputers, as well as the eighth fastest.</p><p>The team finished as runners-up for the 2025 Gordon Bell Prize for its impactful, multi-domain research.
Referred to as the Nobel Prize of supercomputing, the award was presented at the world’s top conference for high-performance computing (HPC) research.</p>]]></summary>  <dateline>2025-12-01T00:00:00-05:00</dateline>  <iso_dateline>2025-12-01T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-12-01 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Bryant Wine, Communications Officer<br><a href="mailto:bryant.wine@cc.gatech.edu">bryant.wine@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678734</item>          <item>678735</item>          <item>678736</item>      </media>  <hg_media>          <item>          <nid>678734</nid>          <type>image</type>          <title><![CDATA[SpaceX-Super-Heavy2.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[SpaceX-Super-Heavy2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/12/01/SpaceX-Super-Heavy2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/12/01/SpaceX-Super-Heavy2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/12/01/SpaceX-Super-Heavy2.jpg?itok=rvXZMixz]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[2025 Gordon Bell Prize Rocket Simulation]]></image_alt>                    <created>1764605279</created>          <gmt_created>2025-12-01 16:07:59</gmt_created>          <changed>1764605279</changed>          <gmt_changed>2025-12-01 16:07:59</gmt_changed>      </item>          <item>          <nid>678735</nid>          <type>image</type>          <title><![CDATA[SHB-and-FS_SC25.jpg]]></title>          <body><![CDATA[]]></body>                      
<image_name><![CDATA[SHB-and-FS_SC25.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/12/01/SHB-and-FS_SC25.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/12/01/SHB-and-FS_SC25.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/12/01/SHB-and-FS_SC25.jpg?itok=vnIVzoYD]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Spencer Bryngelson and Florian Schäfer at SC25]]></image_alt>                    <created>1764605349</created>          <gmt_created>2025-12-01 16:09:09</gmt_created>          <changed>1764605349</changed>          <gmt_changed>2025-12-01 16:09:09</gmt_changed>      </item>          <item>          <nid>678736</nid>          <type>image</type>          <title><![CDATA[Frontier-Hackathon.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Frontier-Hackathon.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/12/01/Frontier-Hackathon.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/12/01/Frontier-Hackathon.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/12/01/Frontier-Hackathon.jpg?itok=6tsOhI_m]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Spencer Bryngelson Frontier Hackathon]]></image_alt>                    <created>1764605398</created>          <gmt_created>2025-12-01 16:09:58</gmt_created>          <changed>1764605398</changed>          <gmt_changed>2025-12-01 16:09:58</gmt_changed>      </item>      </hg_media>  <related>          <link>        
<url><![CDATA[https://www.cc.gatech.edu/news/record-breaking-simulation-boosts-rocket-science-and-supercomputing-new-limits]]></url>        <title><![CDATA[Record-Breaking Simulation Boosts Rocket Science and Supercomputing to New Limits]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="134"><![CDATA[Student and Faculty]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="166983"><![CDATA[School of Computational Science and Engineering]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="3427"><![CDATA[High performance computing]]></keyword>          <keyword tid="168929"><![CDATA[supercomputers]]></keyword>          <keyword tid="2082"><![CDATA[aerospace engineering]]></keyword>     
     <keyword tid="190596"><![CDATA[space research]]></keyword>          <keyword tid="167880"><![CDATA[SpaceX]]></keyword>      </keywords>  <core_research_areas>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="193657"><![CDATA[Space Research Initiative]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686720">  <title><![CDATA[What if Hospitals Could Automatically Protect Patients from Cyber Threats?]]></title>  <uid>36253</uid>  <body><![CDATA[<p>A software update was missed for the program running your local hospital’s X-ray machines. A hacker now controls all the machines and is demanding $500,000 in cryptocurrency be sent to an anonymous wallet; otherwise, they will shut down the entire radiology department.</p><p>This scenario becomes more likely for hospitals of all sizes as medical technology advances, adding more devices to constantly growing networks.</p><p>With the help of a contract award for up to $12 million from the Advanced Research Projects Agency for Health (ARPA-H) <a href="https://arpa-h.gov/explore-funding/programs/upgrade">UPGRADE</a> program, a team of researchers led by the School of Cybersecurity and Privacy at Georgia Tech will begin developing an advanced cybersecurity platform to help hospitals proactively identify and fix vulnerabilities in their software, devices, and networks.&nbsp;</p><p>“This is a new area of security research,” said Associate Professor <strong>Brendan Saltaformaggio</strong>. “We not only have to worry about the cybersecurity aspect, but the physical security as well. 
Our research must be very accurate to make sure patients are safe from cyberthreats.”&nbsp;</p><p>Starting next month, the team of researchers on the Hospital-Integrated Vulnerability Identification and Proactive Remediation (H-VIPER) project will begin developing a system they are calling the Whole-Hospital Simulation (WHS).</p><p>The system maps out the online network for hospitals of all sizes and enables IT teams to test their cyber capabilities before going live. The system can also identify threats, such as missed software updates, and alert the IT department.</p><p>“Hospitals have thousands of devices connected to their networks, including medical devices,” said Saltaformaggio. “A hospital like Children’s has a huge attack surface. A smaller hospital might have different challenges, but possible entry points are still there.”</p><p>The team has already interviewed IT teams at Children’s Healthcare of Atlanta and Hamilton Health Care System. Their findings have provided them with a better understanding of how to scale the WHS system to meet each hospital’s specific needs.</p><p>“Hospital IT processes are notoriously sensitive to disruption, because essentially any kind of downtime for rebooting a system or lack of availability can create chaos in the clinical environment,” said <strong>Stoddard Manikin</strong>, chief information security officer for Children’s Healthcare of Atlanta.</p><p>“Our goal is to create very smooth processes and workflow for our patient-facing staff and providers to deliver the best care possible. This research opportunity gives us a chance to develop new ways where we can look at these sensitive medical devices and things on the IT network in a healthcare environment and potentially remediate vulnerabilities without taking them out of service.”&nbsp;</p><p>Saltaformaggio and his colleagues found that, regardless of size, hospital security remains reactive rather than proactive. 
By leveraging their diverse expertise, the research team will ensure that the H-VIPER project addresses vulnerabilities at every layer of hospital technology, from the network to the hardware.&nbsp;</p><p>The <a href="https://scp.cc.gatech.edu/">School of Cybersecurity and Privacy</a> will lead this initiative, with faculty from the H-VIPER project also representing the <a href="https://www.cc.gatech.edu/">College of Computing</a>, the <a href="https://coe.gatech.edu/">College of Engineering</a>, the <a href="https://ece.gatech.edu/">School of Electrical and Computer Engineering</a>, the <a href="https://www.scs.gatech.edu/">School of Computer Science</a>, and the <a href="https://gtri.gatech.edu/">Georgia Tech Research Institute</a>, along with support from their Ph.D. students and postdoctoral researchers.&nbsp;</p><p>Around 30 Georgia Tech researchers will partner with <a href="https://www.emory.edu/home/index.html">Emory University</a>, <a href="https://www.choa.org">Children’s Healthcare of Atlanta</a>, <a href="https://vitruvianhealth.com/locations/hamilton-medical-center/">Hamilton Health Care System</a>, <a href="https://www.tufts.edu/">Tufts University</a>, <a href="https://www.iastate.edu/">Iowa State University</a>, and <a href="https://narfindustries.com/">Narf Industries</a>.&nbsp;</p><p>Georgia Tech faculty working on the project are:</p><ul><li>Associate Professor <strong>Brendan Saltaformaggio</strong></li><li>Regents’ Professor <strong>Wenke Lee</strong></li><li>Professor <strong>Taesoo Kim</strong></li><li>Professor <strong>Fabian Monrose</strong></li><li>Assistant Professor <strong>Frank Li</strong></li><li>Associate Professor <strong>Saman Zonouz</strong></li><li>Associate Professor<strong> Daniel Genkin</strong></li><li>Research Professor <strong>Sukarno Mertoguno</strong></li><li>Senior Research Scientist <strong>Trevor Lewis</strong> &nbsp;</li></ul>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1764776975</created>  
<gmt_created>2025-12-03 15:49:35</gmt_created>  <changed>1765213725</changed>  <gmt_changed>2025-12-08 17:08:45</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[With the help of a contract award for up to $12 million from ARPA-H, a team of researchers led by the School of Cybersecurity and Privacy at Georgia Tech will begin developing an advanced cybersecurity platform to protect hospitals. ]]></teaser>  <type>news</type>  <sentence><![CDATA[With the help of a contract award for up to $12 million from ARPA-H, a team of researchers led by the School of Cybersecurity and Privacy at Georgia Tech will begin developing an advanced cybersecurity platform to protect hospitals. ]]></sentence>  <summary><![CDATA[<p>With the help of a contract award for up to $12 million from the Advanced Research Projects Agency for Health (ARPA-H), a team of researchers led by the School of Cybersecurity and Privacy at Georgia Tech will begin developing an advanced cybersecurity platform to help hospitals proactively identify and fix vulnerabilities in their software, devices, and networks.&nbsp;</p>]]></summary>  <dateline>2025-12-03T00:00:00-05:00</dateline>  <iso_dateline>2025-12-03T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-12-03 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jpopham3@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Popham, Communications Officer II | School of Cybersecurity and Privacy</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678753</item>      </media>  <hg_media>          <item>          <nid>678753</nid>          <type>image</type>          <title><![CDATA[Cyfi-Lab-Brendan.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Cyfi-No-Dict-1.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/2025/12/03/Cyfi-No-Dict-1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/12/03/Cyfi-No-Dict-1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/12/03/Cyfi-No-Dict-1.jpg?itok=4G7fie_e]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A man points to a rack of computer monitors. Another man sits in front of a laptop with his back to the camera. ]]></image_alt>                    <created>1764777096</created>          <gmt_created>2025-12-03 15:51:36</gmt_created>          <changed>1764777096</changed>          <gmt_changed>2025-12-03 15:51:36</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="365"><![CDATA[Research]]></keyword>          <keyword tid="193109"><![CDATA[arpa-h]]></keyword>          <keyword tid="2634"><![CDATA[grant]]></keyword>          <keyword tid="127901"><![CDATA[Contract]]></keyword>          <keyword tid="1404"><![CDATA[Cybersecurity]]></keyword>          <keyword tid="344"><![CDATA[cyber]]></keyword>          <keyword tid="3532"><![CDATA[impact]]></keyword>          <keyword tid="4499"><![CDATA[hospitals]]></keyword>       
   <keyword tid="179869"><![CDATA[partners]]></keyword>          <keyword tid="340"><![CDATA[collaboration]]></keyword>          <keyword tid="1129"><![CDATA[healthcare]]></keyword>          <keyword tid="194701"><![CDATA[go-resarchnews]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="145171"><![CDATA[Cybersecurity]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71891"><![CDATA[Health and Medicine]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686657">  <title><![CDATA[IMS Launches Series on Interdisciplinary Innovation with AI Computing Panel ]]></title>  <uid>35272</uid>  <body><![CDATA[<p>The Institute for Matter and Systems (IMS) hosted the inaugural Boundaries and Breakthroughs<em>&nbsp;</em>panel on Nov. 11, setting the stage for a new era of interdisciplinary dialogue at Georgia Tech. The event, held in the Marcus Nanotechnology building, brought together experts in electrical engineering, computer architecture, and computer systems design to tackle one of today’s pressing challenges: artificial intelligence (AI) scalability and sustainable high-performance computing.</p><p>As one of Georgia Tech’s 11 interdisciplinary research institutes, IMS is designed to break down silos between traditional academic units. By operating core user facilities and fostering collaborative research, IMS creates a unique ecosystem where device-level innovation meets systems-level design. 
This event embodied that mission by connecting researchers who typically work on different ends of the stack.</p><p>“We’re looking for opportunities to bring people together to have discussions that are both informative and potentially create a little bit of friction in the best possible way around trending topics in science and engineering,” said Mike Filler, IMS deputy director, during opening remarks.</p><p>The panel was moderated by <a href="http://ece.gatech.edu/directory/divya-mahajan">Divya Mahajan</a>, assistant professor in the School of Electrical and Computer Engineering, and featured <a href="https://moin.cc.gatech.edu/">Moinuddin Qureshi</a>, professor of computer science; <a href="https://www.scs.gatech.edu/people/anand-padmanabha-iyer">Anand Iyer</a>, assistant professor of computer science; and <a href="https://matter-systems.gatech.edu/people/asif-khan">Asif Khan</a>, associate professor in electrical and computer engineering.&nbsp;</p><p>The discussion explored the dynamics between compute abundance and energy constraints. As AI models scale up, power consumption has become a societal issue, driving up energy demands and even influencing political conversations. The panelists agreed that the bottleneck isn’t compute — a computer’s ability to process and execute tasks — but data movement. Moving data uses 100 to 1,000 times more energy than computation, making memory systems the critical frontier.</p><p>The conversation highlighted how breakthroughs in compute must occur at every layer — from individual devices to full computer systems. At the device level, Khan mentioned emerging memory technologies and “beyond CMOS” approaches such as embedding compute within memory and exploring bio-inspired architectures.</p><p>At the computer architecture level, Qureshi advocated rethinking interfaces and creating designs optimized for the future of computing. 
AI needs regular patterns to work optimally, and current systems are not set up for that.</p><p>“If you want efficiency, design systems that make sense for AI,” Qureshi said. “Develop new interfaces, develop new modules, architectures, and organization that make for a specific pattern.”</p><p>At the systems level, Iyer stressed practical strategies like near-memory compute and energy-aware scheduling while acknowledging the need for co-design between hardware and software.</p><p>“Now in terms of brains or bio-inspired computing, my conjecture is that there is currently no hardware that is capable of doing it,” Khan said. He also noted that right now, there is no computer or algorithm that has the scale of computing comparable to human brain power.</p><p>The panelists didn’t shy away from provocative ideas — such as whether graphics processing units are the final solution for AI and whether matrix multiplication alone can lead to artificial general intelligence. While opinions varied, all agreed that organizations like IMS are key to bringing together diverse expertise to tackle these questions collaboratively.</p><p>The Boundaries and Breakthroughs series continues in <a href="https://matter-systems.gatech.edu/events/boundaries-breakthroughs-panel-series-bioelectronics-med-tech">January with a panel on bioelectronics and medical technologies</a>, reinforcing IMS’s commitment to fostering dialogue that spans the full spectrum of innovation.</p>]]></body>  <author>aneumeister3</author>  <status>1</status>  <created>1764608557</created>  <gmt_created>2025-12-01 17:02:37</gmt_created>  <changed>1764608619</changed>  <gmt_changed>2025-12-01 17:03:39</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The Boundaries and Breakthroughs panel explored how interdisciplinary collaboration can drive solutions for the future of artificial intelligence. 
]]></teaser>  <type>news</type>  <sentence><![CDATA[The Boundaries and Breakthroughs panel explored how interdisciplinary collaboration can drive solutions for the future of artificial intelligence. ]]></sentence>  <summary><![CDATA[<p>The Boundaries and Breakthroughs panel explored how interdisciplinary collaboration can drive solutions for the future of artificial intelligence.&nbsp;</p>]]></summary>  <dateline>2025-12-01T00:00:00-05:00</dateline>  <iso_dateline>2025-12-01T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-12-01 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[amelia.neumeister@research.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:amelia.neumeister@research.gatech.edu">Amelia Neumeister</a> | Research Communications Program Manager</p><p>The Institute for Matter and Systems</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678737</item>      </media>  <hg_media>          <item>          <nid>678737</nid>          <type>image</type>          <title><![CDATA[BB_web_story.png]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[BB_web_story.png]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/12/01/BB_web_story.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/12/01/BB_web_story.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/12/01/BB_web_story.png?itok=4XXZjfDV]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Panelists speaking at the Boundaries and Breakthroughs panel series]]></image_alt>                    <created>1764608566</created>          <gmt_created>2025-12-01 17:02:46</gmt_created>          
<changed>1764608566</changed>          <gmt_changed>2025-12-01 17:02:46</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="660369"><![CDATA[Matter and Systems]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="149"><![CDATA[Nanotechnology and Nanoscience]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="149"><![CDATA[Nanotechnology and Nanoscience]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="193652"><![CDATA[Matter and Systems]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686517">  <title><![CDATA[Ph.D. Student Making Digital Maps That Blind People Can Hear]]></title>  <uid>36530</uid>  <body><![CDATA[<p>“Map region. Graphic clickable. Blank.”</p><p>That’s usually the only information <a href="https://brandonkeithbiggs.com/"><strong>Brandon Biggs</strong></a> receives from digital maps.</p><p>Biggs is a human-centered computing Ph.D. student in Georgia Tech’s School of Interactive Computing. 
He is almost totally blind due to Leber’s Congenital Amaurosis (LCA), a rare degenerative eye disorder affecting about one in 40,000 people.</p><p>Based on his experience, Biggs argues that most digital maps aren’t accessible to people who are blind. Even worse, he said, the needs of the blind are usually overlooked.</p><p>“When I started research on maps, I had never viewed a weather, campus, or building map, so I didn’t realize the amount of information maps contain,” Biggs said. “How do you represent shapes, orientation, and layout through audio and translate that into a geographic map?”</p><p>To answer these questions, Biggs founded <a href="https://xrnavigation.io/"><strong>XRNavigation</strong></a>, a company focused on developing accessible digital tools. Its flagship product, Audiom, is a cross-sensory map that people can see and hear through text.</p><p>“Sighted people view about 300 maps per year, while blind people view fewer than one,” he said. “Blind people don’t view maps; it’s not part of their lives.</p><p>“I want to ensure that for blind users, digital maps are no longer just ‘blank.’&nbsp; They receive the information they need to know to navigate in this world and become more autonomous.”</p><p>Organizations that need to include accessible maps in their digital spaces can integrate Audiom into their website or app.&nbsp;</p><p>Georgia Tech recently became one such organization and used Audiom to introduce the first fully accessible digital campus map.</p><p>Professor <strong>Bruce Walker</strong> advises Biggs in Walker’s <a href="http://sonify.psych.gatech.edu/~walkerb/"><strong>Sonification Lab</strong></a>, which designs auditory displays for technologies.</p><p>“Brandon has the perfect and unique blend of technical skills, research savvy, innovativeness, lived experience, and never-stop attitude to tackle this problem while impacting and improving many lives,” Walker said.</p><h4><strong>Defining Accessibility</strong></h4><p>Biggs said most 
maps limit accessibility features to turn-by-turn directions, tables, or other kinds of alternative text that disregard spatial information. The ability to communicate spatial information distinguishes Audiom.</p><p>“According to Web Content Accessibility Guidelines (WCAG), all non-text content — like maps — must include a text alternative with an equivalent purpose,” Biggs said. “But what does ‘equivalent purpose’ mean for geographic maps?</p><p>“We argue that every single map, regardless of what it’s showing, communicates general spatialized information and relationships.”</p><p>Audiom also prioritizes the information that’s most important to blind users, including sidewalks and buildings.</p><p>“There’s a lot of information blind people just don’t get on maps but desperately need,” he said. “They couldn’t care less about the roads. They might need the road name, but they really need the sidewalks.</p><p>“If a blind person made a map, they might not even add the roads. And then they would add in the location of doorways, a critical detail that sighted people completely leave out.”</p><p>Biggs’s work is already gaining national recognition. XRNavigation was recently one of three companies selected by the Global Accessibility Awareness Day (GAAD) Foundation for a 2025 Gaady Award, which honors work being done to make digital technologies more accessible.</p><p>Past and present winners of <a href="https://gaad.foundation/what-we-do/gaadys"><strong>Gaady Awards </strong></a>range from tech startups to major brands like T-Mobile.</p><p>Biggs will accept the award during a banquet on Thursday in San Francisco.</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1763494008</created>  <gmt_created>2025-11-18 19:26:48</gmt_created>  <changed>1763494242</changed>  <gmt_changed>2025-11-18 19:30:42</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A Georgia Tech Ph.D. 
student who is nearly blind has developed Audiom, a cross-sensory digital map that translates spatial and geographic information into audio so that blind users can “hear” maps.]]></teaser>  <type>news</type>  <sentence><![CDATA[A Georgia Tech Ph.D. student who is nearly blind has developed Audiom, a cross-sensory digital map that translates spatial and geographic information into audio so that blind users can “hear” maps.]]></sentence>  <summary><![CDATA[<p>Brandon Biggs, a Georgia Tech Ph.D. student who is nearly blind, developed <strong>Audiom</strong>, a cross-sensory digital map that lets blind users navigate spatial information through audio. Biggs's tool, which Georgia Tech now uses for its campus map, emphasizes spatial relationships like sidewalks and buildings and gives organizations a way to integrate accessible, auditory maps into their own platforms.</p>]]></summary>  <dateline>2025-11-18T00:00:00-05:00</dateline>  <iso_dateline>2025-11-18T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-11-18 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678659</item>      </media>  <hg_media>          <item>          <nid>678659</nid>          <type>image</type>          <title><![CDATA[Brandon-Biggs_86A9112-copy_5.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Brandon-Biggs_86A9112-copy_5.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/11/18/Brandon-Biggs_86A9112-copy_5.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/11/18/Brandon-Biggs_86A9112-copy_5.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/11/18/Brandon-Biggs_86A9112-copy_5.jpg?itok=DVM0F57E]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Brandon Biggs]]></image_alt>                    <created>1763494016</created>          <gmt_created>2025-11-18 19:26:56</gmt_created>          <changed>1763494016</changed>          <gmt_changed>2025-11-18 19:26:56</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="42901"><![CDATA[Community]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="129"><![CDATA[Institute and Campus]]></category>      </categories>  <news_terms>          <term tid="42901"><![CDATA[Community]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="129"><![CDATA[Institute and Campus]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="360"><![CDATA[accessibility]]></keyword>          <keyword tid="172442"><![CDATA[Disabilites]]></keyword>          <keyword tid="47091"><![CDATA[maps]]></keyword>          <keyword tid="194036"><![CDATA[blindness]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node 
id="686467">  <title><![CDATA[Researchers Find Opportunities for 311 Chatbots to Foster Community Engagement]]></title>  <uid>36530</uid>  <body><![CDATA[<p>311 chatbots make it easier for people to report issues to their local government without long wait times on the phone. However, a new study finds that the technology might inhibit civic engagement.</p><p>311 systems allow residents to report potholes, broken fire hydrants, and other municipal issues. In recent years, the use of artificial intelligence (AI) to provide 311 services to community residents has boomed across city and state governments. This includes an artificial virtual assistant (AVA) developed by third-party vendors for <a href="https://www.atlantaga.gov/government/departments/customer-service-atl311/atl311-chatbot"><strong>the City of Atlanta</strong></a> in 2023.</p><p>Through survey data, researchers from Tech’s School of Interactive Computing found that many residents are generally positive about 311 chatbots. In addition to eliminating long wait times over the phone, they also offer residents quick answers to permit applications, waste collection, and other frequently asked questions.</p><p>However, the study, which was conducted in Atlanta, indicates that 311 chatbots could be causing residents to feel isolated from public officials and less aware of what’s happening in their community.</p><p><strong>Jieyu Zhou</strong>, a Ph.D. student in the School of IC, said it doesn’t have to be that way.</p><h4><strong>Uniting Communities</strong></h4><p>Zhou and her advisor, Assistant Professor <a href="https://chrismaclellan.com/"><strong>Christopher MacLellan</strong></a>, published a paper at the 2025 ACM Designing Interactive Systems (DIS) Conference that focuses on improving public service chatbot design and amplifying their civic impact. 
They collaborated with Professor <a href="https://www.carldisalvo.com/"><strong>Carl DiSalvo</strong></a>, Associate Professor <a href="http://lynndombrowski.com/"><strong>Lynn Dombrowski</strong></a>, and graduate students <strong>Rui Shen</strong> and <a href="https://yueyu1030.github.io/"><strong>Yue You</strong></a>.</p><p>Zhou said 311 chatbots have the potential to be agents that drive community organization and improve quality of life.</p><p>“Current chatbots risk isolating users in their own experience,” Zhou said. “In the 311 system, people tend to report their own individual issues but lose a sense of what is happening in their broader community.&nbsp;</p><p>“People are very positive about these tools, but I think there’s an opportunity as we envision what civic chatbots could be. It’s important for us to emphasize that social element — engaging people&nbsp;within the community and connecting them with government representatives, community organizers, and other community members.”</p><p>Zhou and MacLellan said 311 chatbots can leave users wondering if others in their communities share their concerns.</p><p>“If people are at a town hall meeting, they can get a sense of whether the problems they are experiencing are shared by others,” Zhou said. “We can’t do that with a chatbot. 
It’s like an isolated room, and we’re trying to open the doors and the windows.”</p><h4><strong>Adding a Human Touch</strong></h4><p>In their paper, the researchers note that one of the biggest criticisms of 311 chatbots is they can’t replace interpersonal interaction.</p><p>Unlike chatbots, people working in local government offices are likely to:</p><ul><li>Have direct knowledge of issues</li><li>Provide appropriate referrals</li><li>Empathize with the resident’s concerns</li></ul><p>MacLellan said residents are likely to grow frustrated with a chatbot when reporting issues that require this level of contextual knowledge.</p><p>One person in the researchers’ survey noted that the chatbot they used didn’t understand that their report was about a sidewalk issue, not a street issue.</p><p>“Explaining such a situation to a human representative is straightforward,” MacLellan said. “However, when the issue being raised does not fall within any of the categories the chatbot is built to address, it often misinterprets the query and offers information that isn’t helpful.”</p><p>The researchers offer some design suggestions that can help chatbots foster community engagement and improve community well-being:</p><ul><li>Escalation. Regarding the sidewalk report, the chatbot did not offer a way to escalate the query to a human who could resolve it. Zhou said that this is a feature that chatbots should have but often lack.</li><li>Transparency. Chatbots could provide details about recent and frequently reported community issues. They should inform users early in the call process about known problems to help avoid an overload of user complaints.</li><li>Education. Chatbots can keep users updated about what’s happening in their communities.</li><li>Collective action. 
Chatbots can help communities organize and gather ideas to address challenges and solve problems.</li></ul><p>“Government agencies may focus mainly on fixing individual issues,” Zhou said, “but recognizing community-level patterns can inspire collective creativity. For example, one participant suggested that if many people report a broken swing at a playground, it could spark an initiative to design a new playground together, going far beyond just fixing it.”</p><p>These, the researchers argue, are just a few examples of what 311 services were originally designed to achieve.</p><p>“Communities were already collaborating on identifying and reporting issues,” Zhou said. “These chatbots should reflect the original intentions and collaboration practices of the communities they serve.</p><p>“Our research suggests we can increase the positive impact of civic chatbots by including social aspects within the design of the system, connecting people, and building a community view.”</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1763152241</created>  <gmt_created>2025-11-14 20:30:41</gmt_created>  <changed>1763152550</changed>  <gmt_changed>2025-11-14 20:35:50</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[AI-powered 311 chatbots may unintentionally reduce residents' sense of connection within their community.]]></teaser>  <type>news</type>  <sentence><![CDATA[AI-powered 311 chatbots may unintentionally reduce residents' sense of connection within their community.]]></sentence>  <summary><![CDATA[<p>Researchers at the Georgia Institute of Technology found that while 311-style chatbots simplify the process of reporting municipal issues and reduce wait times, users can feel isolated from their community and less connected to broader civic awareness. 
They recommend redesigning these systems to include transparency about collective issues, provide pathways for human escalation, and support community-level action.</p>]]></summary>  <dateline>2025-11-14T00:00:00-05:00</dateline>  <iso_dateline>2025-11-14T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-11-14 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678639</item>      </media>  <hg_media>          <item>          <nid>678639</nid>          <type>image</type>          <title><![CDATA[Jieyu-Zhou_86A8161-Enhanced-NR.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Jieyu-Zhou_86A8161-Enhanced-NR.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/11/14/Jieyu-Zhou_86A8161-Enhanced-NR.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/11/14/Jieyu-Zhou_86A8161-Enhanced-NR.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/11/14/Jieyu-Zhou_86A8161-Enhanced-NR.jpg?itok=vlJ5wKyW]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Jieyu Zhou]]></image_alt>                    <created>1763152260</created>          <gmt_created>2025-11-14 20:31:00</gmt_created>          <changed>1763152260</changed>          <gmt_changed>2025-11-14 20:31:00</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>    
      <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="142"><![CDATA[City Planning, Transportation, and Urban Growth]]></category>          <category tid="42901"><![CDATA[Community]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="142"><![CDATA[City Planning, Transportation, and Urban Growth]]></term>          <term tid="42901"><![CDATA[Community]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      </news_terms>  <keywords>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="188776"><![CDATA[go-research]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="169137"><![CDATA[chatbot]]></keyword>          <keyword tid="189306"><![CDATA[public service technology]]></keyword>          <keyword tid="1134"><![CDATA[City of Atlanta]]></keyword>          <keyword tid="188933"><![CDATA[Atlanta community.]]></keyword>          <keyword tid="10614"><![CDATA[community organizing]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  
<userdata><![CDATA[]]></userdata></node><node id="686466">  <title><![CDATA[Professor Earns Test-of-Time Award at AI and Computer Gaming Conference]]></title>  <uid>36530</uid>  <body><![CDATA[<p>One of the top conferences for AI and computer games is recognizing a School of Interactive Computing professor with its first-ever test-of-time award.</p><p>At its event this week in Alberta, Canada, the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE) is honoring Professor Mark Riedl. The award also honors University of Utah Professor and Division of Games Chair Michael Young, Riedl’s Ph.D. advisor.</p><p>Riedl studied under Young at North Carolina State University.</p><p>Their 2005 paper, <em>From Linear Story Generation to Branching Story Graphs</em>, highlighted the challenges of using AI to create interactive gaming narratives in which user actions influence the story’s progression.&nbsp;</p><p>In 2005, computer game systems that supported linear, non-branching games were widely used. Riedl introduced an innovative mathematical formula for interactive stories ranging from choose-your-own-adventure novels to modern computer games.</p><p>“We didn’t use the term ‘generative AI’ back then, but I was working on AI for the generation of creative artifacts,” Riedl said. “This was before we had practical deep learning or large language models.</p><p>“One of the reasons this paper is still relevant 20 years later is that it didn’t just present a technology, it attempted to provide a framework for solving a grand challenge in AI.”</p><p>That challenge is still ongoing, Riedl said. 
Game designers continue to struggle with balancing story coherence against the amount of narrative control afforded to users.</p><p>“When users exercise a high degree of control within the environment, it is likely that their actions will change the state of the world in ways that may interfere with the causal dependencies between actions as intended within a storyline,” Riedl and Young wrote in the paper.</p><p>“Narrative mediation makes linear narratives interactive. The question is: Is the expressive power of narrative mediation at least as powerful as the story graph representation?”</p><p>AIIDE is being held this week at the University of Alberta in Edmonton, Alberta. Riedl will receive the award on Wednesday.</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1763151663</created>  <gmt_created>2025-11-14 20:21:03</gmt_created>  <changed>1763151872</changed>  <gmt_changed>2025-11-14 20:24:32</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Professor Mark Riedl received the first-ever test-of-time award from the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE).]]></teaser>  <type>news</type>  <sentence><![CDATA[Professor Mark Riedl received the first-ever test-of-time award from the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE).]]></sentence>  <summary><![CDATA[<p>Professor Mark Riedl was honored with the first-ever test-of-time award by the AIIDE conference. The award recognizes their influential 2005 paper <em>From Linear Story Generation to Branching Story Graphs</em>, which addressed the challenge of using AI to create interactive, non-linear narratives in computer games. 
The paper introduced a mathematical framework that remains relevant today.</p>]]></summary>  <dateline>2025-11-12T00:00:00-05:00</dateline>  <iso_dateline>2025-11-12T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-11-12 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678638</item>      </media>  <hg_media>          <item>          <nid>678638</nid>          <type>image</type>          <title><![CDATA[Summit-on-Responsible-Computing--AI--Society_86A8505.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Summit-on-Responsible-Computing--AI--Society_86A8505.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/11/14/Summit-on-Responsible-Computing--AI--Society_86A8505.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/11/14/Summit-on-Responsible-Computing--AI--Society_86A8505.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/11/14/Summit-on-Responsible-Computing--AI--Society_86A8505.jpg?itok=PI-Zoshr]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mark Riedl]]></image_alt>                    <created>1763151672</created>          <gmt_created>2025-11-14 20:21:12</gmt_created>          <changed>1763151672</changed>          <gmt_changed>2025-11-14 20:21:12</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category 
tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>      </news_terms>  <keywords>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="170453"><![CDATA[Test of Time Award]]></keyword>          <keyword tid="2356"><![CDATA[gaming]]></keyword>          <keyword tid="2450"><![CDATA[computer games]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686422">  <title><![CDATA[Ph.D. Student’s Framework Used to Bolster Nvidia’s Cosmos Predict-2 Model]]></title>  <uid>36530</uid>  <body><![CDATA[<p>A new deep learning architectural framework could boost the development and deployment efficiency of autonomous vehicles and humanoid robots. The framework will lower training costs and reduce the amount of real-world data needed for training.</p><p>World foundation models (WFMs) enable physical AI systems to learn and operate within&nbsp;synthetic worlds created by generative artificial intelligence (genAI). 
For example, these models use predictive capabilities to generate up to 30 seconds of video that accurately reflects the real world.</p><p>The new framework, developed by a Georgia Tech researcher, enhances the processing speed of the neural networks that simulate these real-world environments from text, images, or video inputs.</p><p>The neural networks that make up the architectures of large language models like ChatGPT and visual models like Sora process contextual information using the “attention mechanism.”</p><p>Attention refers to a model’s ability to focus on the most relevant parts of input.</p><p>The Neighborhood Attention Extension (NATTEN) allows models that require GPUs or high-performance computing systems to process information and generate outputs more efficiently.</p><p>Processing speeds can increase by up to 2.6 times, said <a href="https://alihassanijr.com/"><strong>Ali Hassani</strong></a>, a Ph.D. student in the School of Interactive Computing and the creator of NATTEN. Hassani is advised by Associate Professor <a href="https://www.humphreyshi.com/"><strong>Humphrey Shi</strong></a>.</p><p>Hassani is also a research scientist at Nvidia, where he introduced NATTEN to <a href="https://www.nvidia.com/en-us/ai/cosmos/"><strong>Cosmos</strong></a> — a family of WFMs the company uses to train robots, autonomous vehicles, and other physical AI applications.</p><p>“You can map just about anything from a prompt or an image or any combination of frames from an existing video to predict future videos,” Hassani said. “Instead of generating words with an LLM, you’re generating a world.</p><p>“Unlike LLMs that generate a single token at a time, these models are compute-heavy. They generate many images — often hundreds of frames at a time — so the models put a lot of work on the GPU. 
NATTEN lets us decrease some of that work and proportionately accelerate the model.”</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1763068438</created>  <gmt_created>2025-11-13 21:13:58</gmt_created>  <changed>1763068498</changed>  <gmt_changed>2025-11-13 21:14:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A new deep learning architectural framework, Neighborhood Attention Extension (NATTEN), is being used by Nvidia to increase the processing speed of its Cosmos Predict-2 model for training autonomous vehicles and humanoid robots.]]></teaser>  <type>news</type>  <sentence><![CDATA[A new deep learning architectural framework, Neighborhood Attention Extension (NATTEN), is being used by Nvidia to increase the processing speed of its Cosmos Predict-2 model for training autonomous vehicles and humanoid robots.]]></sentence>  <summary><![CDATA[<p>Georgia Tech Ph.D. student Ali Hassani developed the Neighborhood Attention Extension (NATTEN), a deep learning architectural framework that is being integrated into Nvidia's Cosmos Predict-2 world foundation model. 
NATTEN enhances the processing speed of neural networks that simulate real-world environments for physical AI systems, which are used to train autonomous vehicles and humanoid robots.&nbsp;</p>]]></summary>  <dateline>2025-11-03T00:00:00-05:00</dateline>  <iso_dateline>2025-11-03T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-11-03 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678621</item>      </media>  <hg_media>          <item>          <nid>678621</nid>          <type>image</type>          <title><![CDATA[2X6A3487.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[2X6A3487.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/11/13/2X6A3487.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/11/13/2X6A3487.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/11/13/2X6A3487.jpg?itok=TTWF4N4h]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Humphrey Shi and Ali Hassani]]></image_alt>                    <created>1763068473</created>          <gmt_created>2025-11-13 21:14:33</gmt_created>          <changed>1763068473</changed>          <gmt_changed>2025-11-13 21:14:33</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information 
Technology and Security]]></category>          <category tid="194609"><![CDATA[Industry]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="194609"><![CDATA[Industry]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="193860"><![CDATA[Artifical Intelligence]]></keyword>          <keyword tid="194701"><![CDATA[go-resarchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="14549"><![CDATA[nvidia]]></keyword>          <keyword tid="191138"><![CDATA[artificial neural networks]]></keyword>          <keyword tid="97281"><![CDATA[autonomous vehicles]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686408">  <title><![CDATA[Department Raises Thousands for Campus Food Pantry]]></title>  <uid>36253</uid>  <body><![CDATA[<div><div><p>The <a href="https://scp.cc.gatech.edu/">School of Cybersecurity and Privacy</a> (SCP) kicked off the season of giving early this year with a more than $2,000 food donation to <a href="https://star.studentlife.gatech.edu/klemis-kitchen">Klemis Kitchen</a>, Georgia Tech’s food bank.</p><p>The kitchen serves students in need with groceries or meals, and works to reduce food waste on campus.</p><p>"We are so grateful for this incredibly generous donation from the School of Cybersecurity and Privacy,” said <strong>Steven Fazenbaker</strong>, program director of Students’ Temporary Assistance and Resources (STAR).&nbsp;</p><p>“There are over 300 students 
with access to Klemis Kitchen, and this donation will go far in making sure these students have the food they need.”</p><p><strong>Mary Helen Hayes</strong>, SCP assistant director of financial operations, organized the food drive and spent October raising funds.</p><p>“Throughout the year, I look for ways to bring our SCP community together—faculty, staff, and students alike,” she said.</p><p>“When I learned that about 10% of Georgia Tech students experience food insecurity and 15% often prioritize working over academics and activities just to afford food, the Klemis Kitchen food drive became my focus.”</p><p>Hayes added she wanted everyone to contribute to the SCP fundraiser, so she offered to handle the shopping for anyone who wanted to give but didn’t have the time.</p><p>“Our team came together with incredible generosity and energy—organizing, purchasing, delivering, and coordinating every detail. Within just a few days, we raised over $1,000, which was then doubled through an anonymous matching gift, bringing our total to $2,110.”</p><p>The amount of food the School was able to purchase filled two cars and required the staff to make several trips to unload. According to Fazenbaker, department donations like this help keep the food bank stocked.</p><p>“Klemis Kitchen relies 100% on donations: leftovers from the dining halls, donations from community partners like grocery stores and churches, food drives sponsored by departments across campus, and monetary donations that allow us to fill gaps when food donations are low,” he said.</p><p>“The Georgia Tech community always comes through. This program only works because of Georgia Tech’s commitment to Progress and Service.”</p><p>Monetary donations to Klemis Kitchen can be made on the kitchen’s <a href="https://star.studentlife.gatech.edu/donate">website</a>. 
&nbsp;Departments can sign up to sponsor food drives <a href="https://star.studentlife.gatech.edu/klemis-kitchen">here</a>.&nbsp;</p></div></div>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1763050579</created>  <gmt_created>2025-11-13 16:16:19</gmt_created>  <changed>1763050826</changed>  <gmt_changed>2025-11-13 16:20:26</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The School of Cybersecurity and Privacy (SCP) kicked off the season of giving early this year with a more than $2,000 food donation to Klemis Kitchen, Georgia Tech’s food bank.]]></teaser>  <type>news</type>  <sentence><![CDATA[The School of Cybersecurity and Privacy (SCP) kicked off the season of giving early this year with a more than $2,000 food donation to Klemis Kitchen, Georgia Tech’s food bank.]]></sentence>  <summary><![CDATA[<p>The <a href="https://scp.cc.gatech.edu/">School of Cybersecurity and Privacy</a> (SCP) kicked off the season of giving early this year with a more than $2,000 food donation to <a href="https://star.studentlife.gatech.edu/klemis-kitchen">Klemis Kitchen</a>, Georgia Tech’s food bank.</p><p>The kitchen serves students in need with groceries or meals, and works to reduce food waste on campus</p>]]></summary>  <dateline>2025-11-13T00:00:00-05:00</dateline>  <iso_dateline>2025-11-13T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-11-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jpopham3@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Popham&nbsp;Communications Officer II | School of Cybersecurity and Privacy</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678613</item>      </media>  <hg_media>          <item>          <nid>678613</nid>          <type>image</type>          <title><![CDATA[Food-Drive-Banner.jpg]]></title>          <body><![CDATA[]]></body> 
                     <image_name><![CDATA[Food-Drive-Banner.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/11/13/Food-Drive-Banner.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/11/13/Food-Drive-Banner.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/11/13/Food-Drive-Banner.jpg?itok=0r9_EhVo]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[1.A photo of a group of people standing behind a table full of packaged food. The group is smiling and represents a diverse crowd of faculty and staff.]]></image_alt>                    <created>1763050591</created>          <gmt_created>2025-11-13 16:16:31</gmt_created>          <changed>1763050591</changed>          <gmt_changed>2025-11-13 16:16:31</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="42901"><![CDATA[Community]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="194836"><![CDATA[Sustainability]]></category>      </categories>  <news_terms>          <term tid="42901"><![CDATA[Community]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="194836"><![CDATA[Sustainability]]></term>      </news_terms>  <keywords>          <keyword tid="344"><![CDATA[cyber]]></keyword>          <keyword tid="1404"><![CDATA[Cybersecurity]]></keyword>          <keyword tid="167018"><![CDATA[staff]]></keyword>          <keyword tid="1506"><![CDATA[faculty]]></keyword>          <keyword 
tid="4728"><![CDATA[donor]]></keyword>          <keyword tid="266"><![CDATA[donation]]></keyword>          <keyword tid="172646"><![CDATA[food drive]]></keyword>          <keyword tid="90451"><![CDATA[donation drives]]></keyword>      </keywords>  <core_research_areas>          <term tid="145171"><![CDATA[Cybersecurity]]></term>          <term tid="39511"><![CDATA[Public Service, Leadership, and Policy]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686192">  <title><![CDATA[Built in I2P: The Student Inventions You’ll Want to See to Believe]]></title>  <uid>36436</uid>  <body><![CDATA[<p>Cricket powder-based protein brownies. A visualization system for fencing blades. A personalized AI application for analyzing blood work. All I2P Showcase prototypes. See what Georgia Tech students have been developing this semester at the <a href="https://www.eventbrite.com/e/i2p-showcase-fall-2025-tickets-1748117429289?aff=article">Fall 2025 Idea to Prototype (I2P) Showcase</a> on Tuesday, Dec. 2, at 5 p.m. in the Marcus Nanotechnology Building. This year, attendees will have even more&nbsp;original inventions to view, with over 60 teams&nbsp;displaying prototypes.&nbsp;</p><p>The event marks the culmination of the semester-long I2P course, where undergraduate students develop functional prototypes aimed at solving real-world problems. 
Prototypes this semester include a smart military drone, a gentler device for cervical cancer screening, a rotating espresso station, tools to keep AI safe, compact data centers, systems that simulate cyberattacks to help companies strengthen their defenses, and many more.&nbsp;</p><p>The showcase is free and open to students, faculty, staff, and members of the local community.&nbsp;</p><p>Winning teams will receive prizes and a “golden ticket” into CREATE-X’s Startup Launch, a summer accelerator that provides optional seed funding, accounting and legal service credits, mentorship, and more to help students turn their prototypes into viable startups.</p><p>This is a free event, and refreshments will be provided.&nbsp;<a href="https://www.eventbrite.com/e/i2p-showcase-fall-2025-tickets-1748117429289?aff=article">Register for the Fall 2025 I2P Showcase</a> today!</p>]]></body>  <author>bdurham31</author>  <status>1</status>  <created>1762288214</created>  <gmt_created>2025-11-04 20:30:14</gmt_created>  <changed>1762289146</changed>  <gmt_changed>2025-11-04 20:45:46</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech’s Fall 2025 I2P Showcase will feature over 60 student prototypes tackling real-world challenges.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech’s Fall 2025 I2P Showcase will feature over 60 student prototypes tackling real-world challenges.]]></sentence>  <summary><![CDATA[<p>More than 60 undergraduate teams will present functional prototypes at the Fall 2025 Idea to Prototype (I2P) Showcase at Georgia Tech, Tuesday, Dec. 2 at 5 p.m. in the Marcus Nanotechnology Building. See innovative student creations developed over the semester and designed to solve real-world problems. Winning teams earn prizes and a “golden ticket” into CREATE-X’s Startup Launch accelerator, which offers funding, in-kind services, mentorship, and more. 
This is a free event for the campus and local community.</p>]]></summary>  <dateline>2025-11-04T00:00:00-05:00</dateline>  <iso_dateline>2025-11-04T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-11-04 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[breanna.durham@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Breanna Durham</p><p>Marketing Strategist</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678542</item>      </media>  <hg_media>          <item>          <nid>678542</nid>          <type>image</type>          <title><![CDATA[Founders of Allez Go Adam Kulikowski and Jason Mo]]></title>          <body><![CDATA[<p>Founders of Allez Go: Adam Kulikowski and Jason Mo</p>]]></body>                      <image_name><![CDATA[54186413447_045f318b99_o.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/11/04/54186413447_045f318b99_o.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/11/04/54186413447_045f318b99_o.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/11/04/54186413447_045f318b99_o.jpg?itok=DP3h0kVk]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Founders of Allez Go: Adam Kulikowski and Jason Mo]]></image_alt>                    <created>1762288717</created>          <gmt_created>2025-11-04 20:38:37</gmt_created>          <changed>1762288817</changed>          <gmt_changed>2025-11-04 20:40:17</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.eventbrite.com/e/i2p-showcase-fall-2025-tickets-1748117429289?aff=article]]></url>        <title><![CDATA[Register for the 2025 Fall I2P Showcase]]></title>      </link>      </related>  
<files>      </files>  <groups>          <group id="583966"><![CDATA[CREATE-X]]></group>          <group id="655285"><![CDATA[GT Commercialization]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="139"><![CDATA[Business]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="42921"><![CDATA[Exhibitions]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="194685"><![CDATA[Manufacturing]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="148"><![CDATA[Music and Music Technology]]></category>          <category tid="149"><![CDATA[Nanotechnology and Nanoscience]]></category>          <category tid="133"><![CDATA[Special Events and Guest Speakers]]></category>          <category tid="134"><![CDATA[Student and Faculty]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="139"><![CDATA[Business]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="42921"><![CDATA[Exhibitions]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="194685"><![CDATA[Manufacturing]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="148"><![CDATA[Music and Music Technology]]></term>          <term tid="149"><![CDATA[Nanotechnology and Nanoscience]]></term>          <term tid="133"><![CDATA[Special Events and 
Guest Speakers]]></term>          <term tid="134"><![CDATA[Student and Faculty]]></term>      </news_terms>  <keywords>          <keyword tid="192255"><![CDATA[go-commercializationnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="193658"><![CDATA[Commercialization]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686132">  <title><![CDATA[New Research Will Move Us Closer to a Passwordless Society]]></title>  <uid>36253</uid>  <body><![CDATA[<div><div><p>Although they are currently essential to online security and privacy, the days of relying on password protection may be numbered, thanks to Assistant Professor <strong>Frank Li</strong> and his National Science Foundation (NSF) CAREER Award project.</p><p>While passwords have security limitations and can be challenging to use, emerging technologies such as Fast IDentity Online 2 (<a href="https://www.microsoft.com/en-us/security/business/security-101/what-is-fido2">FIDO2</a>) and other passkey authentication methods provide strong security and usability. For example, if you have ever used your smartphone’s facial recognition feature to log in to your bank account instead of typing out the password, you have used a FIDO2 passkey.</p><p>Users and online services, however, have been slow to adopt the new technology despite the benefits. Li’s NSF CAREER Award project addresses this challenge. Along with advancing the technology, Li will also advocate for its use.</p><p>“We are not assuming that this technology is coming,” said Li. “It is already here. The challenge is to get people to use this technology.”&nbsp;</p><p>This up-and-coming technology has been part of Li’s research for some time. 
His prior work provided a new security analysis of the FIDO2 authentication protocol, which includes passkeys.&nbsp;</p><p>Li’s CAREER project will investigate real-world uses of FIDO2/passkeys and security and usability issues that can arise. A goal of his research is to identify and resolve problems before they become widespread and more difficult to address.&nbsp;</p><p>“There’s still a lot to do when it comes to authentication research, and there’s even more to be done with passkeys,” he said.&nbsp;</p><p>“Online authentication is a core function needed for online security. Making any changes to it will have huge implications. For example, accounts that send spam and phishing attacks are often accounts with compromised passwords. A <a href="https://www.forbes.com/sites/tonybradley/2025/05/01/are-we-finally-entering-a-passwordless-era/">passwordless future</a> will reduce that threat.”</p><p>The final component of Li’s CAREER Award is an educational outreach program. The NSF wants researchers to inspire the next generation of scientists as a part of their projects. Li plans to reach out to Atlanta high schools and engage their computer science programs.</p><p><a href="https://www.nsf.gov/funding/opportunities/career-faculty-early-career-development-program">NSF CAREER Awards</a> are prestigious federal grants given to early-career academic faculty and are widely recognized as a career-defining moment. 
Li’s project will be conducted in the School of Cybersecurity and Privacy as well as the School of Electrical and Computer Engineering.&nbsp;</p></div></div>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1762180558</created>  <gmt_created>2025-11-03 14:35:58</gmt_created>  <changed>1762180882</changed>  <gmt_changed>2025-11-03 14:41:22</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Although they are currently essential to online security and privacy, the days of relying on password protection may be numbered, thanks to Assistant Professor Frank Li and his National Science Foundation (NSF) CAREER Award project.]]></teaser>  <type>news</type>  <sentence><![CDATA[Although they are currently essential to online security and privacy, the days of relying on password protection may be numbered, thanks to Assistant Professor Frank Li and his National Science Foundation (NSF) CAREER Award project.]]></sentence>  <summary><![CDATA[<p>Although they are currently essential to online security and privacy, the days of relying on password protection may be numbered, thanks to Assistant Professor <strong>Frank Li</strong> and his National Science Foundation (NSF) CAREER Award project.</p>]]></summary>  <dateline>2025-11-03T00:00:00-05:00</dateline>  <iso_dateline>2025-11-03T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-11-03 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jpopham3@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Popham&nbsp;Communications Officer II | School of Cybersecurity and Privacy</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678516</item>      </media>  <hg_media>          <item>          <nid>678516</nid>          <type>image</type>          <title><![CDATA[Frank-Li_86A0205-Enhanced-NR-copy.jpg]]></title>          
<body><![CDATA[<p>Assistant Professor Frank Li standing outside of the Coda Building in Tech Square. <em>Photos by Terence Rushin/College of Computing</em></p>]]></body>                      <image_name><![CDATA[Frank-Li_86A0205-Enhanced-NR-copy.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/11/03/Frank-Li_86A0205-Enhanced-NR-copy.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/11/03/Frank-Li_86A0205-Enhanced-NR-copy.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/11/03/Frank-Li_86A0205-Enhanced-NR-copy.jpg?itok=MSO0AEyB]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A man standing outside in a building breezeway. He is wearing glasses, a blue polo and is smiling.]]></image_alt>                    <created>1762180596</created>          <gmt_created>2025-11-03 14:36:36</gmt_created>          <changed>1762180596</changed>          <gmt_changed>2025-11-03 14:36:36</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="145171"><![CDATA[Cybersecurity]]></term>      </core_research_areas>  <news_room_topics>      
</news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node></nodes>