<nodes> <node id="686192">  <title><![CDATA[Built in I2P: The Student Inventions You’ll Want to See to Believe]]></title>  <uid>36436</uid>  <body><![CDATA[<p>Cricket powder-based protein brownies. A visualization system for fencing blades. A personalized AI application for analyzing blood work. All I2P Showcase prototypes. See what Georgia Tech students have been developing this semester at the <a href="https://www.eventbrite.com/e/i2p-showcase-fall-2025-tickets-1748117429289?aff=article">Fall 2025 Idea to Prototype (I2P) Showcase</a> on Tuesday, Dec. 2, at 5 p.m. in the Marcus Nanotechnology Building. This year, attendees will have even more&nbsp;original inventions to view, with over 60 teams&nbsp;displaying prototypes.&nbsp;</p><p>The event marks the culmination of the semester-long I2P course, where undergraduate students develop functional prototypes aimed at solving real-world problems. Prototypes this semester include a smart military drone, a gentler device for cervical cancer screening, a rotating espresso station, tools to keep AI safe, compact data centers, systems that simulate cyberattacks to help companies strengthen their defenses, and many more.&nbsp;</p><p>The showcase is free and open to students, faculty, staff, and members of the local community.&nbsp;</p><p>Winning teams will receive prizes and a “golden ticket” into CREATE-X’s Startup Launch, a summer accelerator that provides optional seed funding, accounting and legal service credits, mentorship, and more to help students turn their prototypes into viable startups.</p><p>Refreshments will be provided.&nbsp;<a href="https://www.eventbrite.com/e/i2p-showcase-fall-2025-tickets-1748117429289?aff=article">Register for the Fall 2025 I2P Showcase</a> today!</p>]]></body>  <author>bdurham31</author>  <status>1</status>  <created>1762288214</created>  <gmt_created>2025-11-04 20:30:14</gmt_created>  <changed>1762289146</changed>  <gmt_changed>2025-11-04 
20:45:46</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech’s Fall 2025 I2P Showcase will feature over 60 student prototypes tackling real-world challenges.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech’s Fall 2025 I2P Showcase will feature over 60 student prototypes tackling real-world challenges.]]></sentence>  <summary><![CDATA[<p>More than 60 undergraduate teams will present functional prototypes at the Fall 2025 Idea to Prototype (I2P) Showcase at Georgia Tech, Tuesday, Dec. 2 at 5 p.m. in the Marcus Nanotechnology Building. See innovative student creations developed over the semester and designed to solve real-world problems. Winning teams earn prizes and a “golden ticket” into CREATE-X’s Startup Launch accelerator, which offers funding, in-kind services, mentorship, and more. This is a free event for the campus and local community.</p>]]></summary>  <dateline>2025-11-04T00:00:00-05:00</dateline>  <iso_dateline>2025-11-04T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-11-04 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[breanna.durham@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Breanna Durham</p><p>Marketing Strategist</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678542</item>      </media>  <hg_media>          <item>          <nid>678542</nid>          <type>image</type>          <title><![CDATA[Founders of Allez Go Adam Kulikowski and Jason Mo]]></title>          <body><![CDATA[<p>Founders of Allez Go: Adam Kulikowski and Jason Mo</p>]]></body>                      <image_name><![CDATA[54186413447_045f318b99_o.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/11/04/54186413447_045f318b99_o.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/11/04/54186413447_045f318b99_o.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/11/04/54186413447_045f318b99_o.jpg?itok=DP3h0kVk]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Founders of Allez Go: Adam Kulikowski and Jason Mo]]></image_alt>                    <created>1762288717</created>          <gmt_created>2025-11-04 20:38:37</gmt_created>          <changed>1762288817</changed>          <gmt_changed>2025-11-04 20:40:17</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.eventbrite.com/e/i2p-showcase-fall-2025-tickets-1748117429289?aff=article]]></url>        <title><![CDATA[Register for the 2025 Fall I2P Showcase]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="583966"><![CDATA[CREATE-X]]></group>          <group id="655285"><![CDATA[GT Commercialization]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="139"><![CDATA[Business]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="42921"><![CDATA[Exhibitions]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="194685"><![CDATA[Manufacturing]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="148"><![CDATA[Music and Music Technology]]></category>          <category tid="149"><![CDATA[Nanotechnology and Nanoscience]]></category>          <category 
tid="133"><![CDATA[Special Events and Guest Speakers]]></category>          <category tid="134"><![CDATA[Student and Faculty]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="139"><![CDATA[Business]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="42921"><![CDATA[Exhibitions]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="194685"><![CDATA[Manufacturing]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="148"><![CDATA[Music and Music Technology]]></term>          <term tid="149"><![CDATA[Nanotechnology and Nanoscience]]></term>          <term tid="133"><![CDATA[Special Events and Guest Speakers]]></term>          <term tid="134"><![CDATA[Student and Faculty]]></term>      </news_terms>  <keywords>          <keyword tid="192255"><![CDATA[go-commercializationnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="193658"><![CDATA[Commercialization]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="683440">  <title><![CDATA[Sound Meets Code: Aleksandra Ma’s Music Tech Summer at MIT and Bose]]></title>  <uid>36761</uid>  <body><![CDATA[<p>Walk into any room Aleksandra Teng Ma’s been working in this summer, and you’ll probably hear a mix of experimental sounds, snippets of Amy Winehouse vocals, and the occasional Animal Crossing tune playing in the background. 
That’s just how her brain works—blending tech, artistry, and everyday play into something entirely her own.</p><p>Aleksandra is a master’s student in Music Technology at Georgia Tech, but “student” barely scratches the surface. This summer, she’s been everywhere—physically in Massachusetts and intellectually somewhere between a Pride performance and a human-AI jam session at MIT.</p><p>“I’m always with my microphone and MIDI keyboard,” she says, like it’s just second nature. “I love singing and coming up with tunes.”</p><p><strong>Live from MIT — It’s Human + AI Jamming</strong><br>Forget dusty textbooks and silent labs—Aleksandra’s research life is about real-time musical interactions between humans and AI. As a visiting researcher at MIT this summer, she’s digging into what it looks like when musicians "jam" with intelligent systems. Think futuristic band practice, but with algorithms joining in.</p><p>“It’s giving me a lot of exposure to co-design methodologies,” she explains, “and letting me observe how musicians respond to each other—and to AI.”</p><p>It’s not just code and theory, either. The insights come alive when she brings them to the stage. This summer, Aleksandra’s band performed at The Music Porch in Reading, MA for Pride Month. Their cover of <em>Pink Pony Club</em> turned into a moment she won’t forget.</p><p>“It was so fun seeing people—especially teenagers—singing and dancing together,” she says. “That’s one of those moments where I just thought, yep, this is why I picked music tech.”</p><p><strong>From Winehouse Covers to Ableton Experiments</strong><br>Despite her research chops, Aleksandra hasn’t lost touch with the joy of just making music. She sings and plays keyboard in a band, covers Amy Winehouse songs, and occasionally writes music just for fun. (Her dream studio partner? 
You guessed it: Amy herself.)</p><p>She’s also been expanding her technical toolkit this summer, diving deeper into sound design with Ableton and Serum.</p><p>“Still learning,” she says, “but I’m using them for sound design in songs—and loving it.”</p><p>And then there are the unexpected “whoa” moments. Like when she built a vocal patch for the Pixies’ <em>Where Is My Mind?</em> to use live during a performance.</p><p>“It was haunting,” she says. “And it worked so well live.”</p><p><strong>Dream Tech and Georgia Tech</strong><br>Ask Aleksandra what she’d invent if she could mash up two instruments, and she already has an idea:</p><p>“Automatic vocal effects through a microphone with a built-in amplifier,” she says, laughing. “Honestly, someone probably already made this, but I want it anyway.”</p><p>That kind of thinking is exactly what her time at Georgia Tech has sparked. Before the program, she saw music mostly through the lens of conventional instruments. Now? She’s all about how software and hardware can expand what music even is.</p><p><strong>Her Summer, in Sound</strong><br>If Aleksandra’s summer had a vibe, it’d be:</p><ul><li>A creek bubbling in the background</li><li>A long, ghostly reverb trail on a siren vocal</li><li>And the ever-cozy tones of Animal Crossing</li></ul><p>Not exactly your typical lab soundtrack—but that’s the beauty of it.</p><p>This fall, she’s heading back to Georgia Tech after a gap year at Bose, ready to jump into research on multimodal music source separation (AKA teaching machines to pick apart and understand layers in music the way humans do).</p><p>And yes, she’ll still be singing.</p><p><strong>Hits with Aleksandra</strong></p><ul><li>Current summer jams: <em>Rosebud</em> by Oklou &amp; the new Lorde album</li><li>What people don’t “get” about her work: “How music signals work on a granular level”</li></ul><p>Aleksandra Ma doesn’t just study music tech—she lives it. 
Whether she’s tweaking reverb patches, performing under porch lights, or teaching AI how to groove, she’s showing what it really means to be a 21st-century musician.</p>]]></body>  <author>malonso35</author>  <status>1</status>  <created>1753992286</created>  <gmt_created>2025-07-31 20:04:46</gmt_created>  <changed>1753992408</changed>  <gmt_changed>2025-07-31 20:06:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech Music Technology student Aleksandra Ma spent the summer researching human-AI jamming, performing live, and building new sounds.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech Music Technology student Aleksandra Ma spent the summer researching human-AI jamming, performing live, and building new sounds.]]></sentence>  <summary><![CDATA[<p>From human-AI jam sessions at MIT to live performances for Pride Month, for Georgia Tech's Music Technology student Aleksandra Ma, summer bridged music research, technology, and creative expression.</p>]]></summary>  <dateline>2025-07-31T00:00:00-04:00</dateline>  <iso_dateline>2025-07-31T00:00:00-04:00</iso_dateline>  <gmt_dateline>2025-07-31 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[Melissa.Alonso@design.gatech.edu]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>      </media>  <hg_media>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="42941"><![CDATA[Art Research]]></category>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="194568"><![CDATA[Arts and Performance]]></category>          <category tid="143"><![CDATA[Digital Media and Entertainment]]></category>          <category 
tid="42891"><![CDATA[Georgia Tech Arts]]></category>          <category tid="148"><![CDATA[Music and Music Technology]]></category>          <category tid="42931"><![CDATA[Performances]]></category>          <category tid="42951"><![CDATA[Student Art]]></category>      </categories>  <news_terms>          <term tid="42941"><![CDATA[Art Research]]></term>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="194568"><![CDATA[Arts and Performance]]></term>          <term tid="143"><![CDATA[Digital Media and Entertainment]]></term>          <term tid="42891"><![CDATA[Georgia Tech Arts]]></term>          <term tid="148"><![CDATA[Music and Music Technology]]></term>          <term tid="42931"><![CDATA[Performances]]></term>          <term tid="42951"><![CDATA[Student Art]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="1309"><![CDATA[music technology]]></keyword>          <keyword tid="1621"><![CDATA[georgia tech music technology]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="681961">  <title><![CDATA[Thesis on Human-Centered AI Earns Honors from International Computing Organization]]></title>  <uid>36319</uid>  <body><![CDATA[<p>A Georgia Tech alum’s dissertation introduced ways to make artificial intelligence (AI) more accessible, interpretable, and accountable. Although it’s been a year since his doctoral defense,&nbsp;<a href="https://zijie.wang/"><strong>Zijie (Jay) Wang</strong></a>’s (Ph.D. 
ML-CSE 2024) work continues to resonate with researchers.</p><p>Wang is a recipient of the&nbsp;<a href="https://medium.com/sigchi/announcing-the-2025-acm-sigchi-awards-17c1feaf865f"><strong>2025 Outstanding Dissertation Award from the Association for Computing Machinery Special Interest Group on Computer-Human Interaction (ACM SIGCHI)</strong></a>. The award recognizes Wang for his lifelong work on democratizing human-centered AI.</p><p>“Throughout my Ph.D. and industry internships, I observed a gap in existing research: there is a strong need for practical tools for applying human-centered approaches when designing AI systems,” said Wang, now a safety researcher at OpenAI.</p><p>“My work not only helps people understand AI and guide its behavior but also provides user-friendly tools that fit into existing workflows.”</p><p>[Related: <a href="https://sites.gatech.edu/research/chi-2025/">Georgia Tech College of Computing Swarms to Yokohama, Japan, for CHI 2025</a>]</p><p>Wang’s dissertation presented techniques in visual explanation and interactive guidance to align AI models with user knowledge and values. The work culminated from years of research, fellowship support, and internships.</p><p>Wang’s most influential projects formed the core of his dissertation. These included:</p><ul><li><a href="https://poloclub.github.io/cnn-explainer/"><strong>CNN Explainer</strong></a>: an open-source tool developed for deep-learning beginners. Since its release in July 2020, more than 436,000 global visitors have used the tool.</li><li><a href="https://poloclub.github.io/diffusiondb/"><strong>DiffusionDB</strong></a>: a first-of-its-kind large-scale dataset that lays a foundation to help people better understand generative AI. 
This work could lead to new research in detecting deepfakes and designing human-AI interaction tools to help people more easily use these models.</li><li><a href="https://interpret.ml/gam-changer/"><strong>GAM Changer</strong></a>: an interface that empowers users in healthcare, finance, or other domains to edit ML models to include knowledge and values specific to their domain, which improves reliability.</li><li><a href="https://www.jennwv.com/papers/gamcoach.pdf"><strong>GAM Coach</strong></a>: an interactive ML tool that could help rejected loan applicants by automatically showing them what is needed to gain approval.</li><li><a href="https://www.cc.gatech.edu/news/new-tool-teaches-responsible-ai-practices-when-using-large-language-models"><strong>Farsight</strong></a>: a tool that alerts developers when they write prompts for large language models that could be harmful or misused.</li></ul><p>“I feel extremely honored and lucky to receive this award, and I am deeply grateful to many who have supported me along the way, including Polo, mentors, collaborators, and friends,” said Wang, who was advised by School of Computational Science and Engineering (CSE) Professor&nbsp;<a href="https://poloclub.github.io/polochau/"><strong>Polo Chau</strong></a>.</p><p>“This recognition also inspired me to continue striving to design and develop easy-to-use tools that help everyone to easily interact with AI systems.”</p><p>Like Wang, Georgia Tech alumnus&nbsp;<a href="https://fredhohman.com/">Fred Hohman</a> (Ph.D. CSE 2020) was advised by Chau.&nbsp;<a href="https://www.cc.gatech.edu/news/alumnus-building-legacy-through-dissertation-and-mentorship">Hohman won the ACM SIGCHI Outstanding Dissertation Award in 2022</a>.</p><p><a href="https://poloclub.github.io/">Chau’s group</a> synthesizes machine learning (ML) and visualization techniques into scalable, interactive, and trustworthy tools. 
These tools increase understanding and interaction with large-scale data and ML models.&nbsp;</p><p>Chau is the associate director of corporate relations for the Machine Learning Center at Georgia Tech. The School of CSE was Wang’s home unit while he studied in the ML program under Chau.</p><p>Wang is one of five recipients of this year’s award to be presented at the 2025 Conference on Human Factors in Computing Systems (<a href="https://chi2025.acm.org/">CHI 2025</a>). The conference takes place April 25 – May 1 in Yokohama, Japan.&nbsp;</p><p>SIGCHI is the world’s largest association of human-computer interaction professionals and practitioners. The group sponsors or co-sponsors 26 conferences, including CHI.</p><p>Wang’s outstanding dissertation award is the latest recognition of a career decorated with achievement.</p><p>Months after graduating from Georgia Tech,&nbsp;<a href="https://www.cc.gatech.edu/news/research-ai-safety-lands-recent-graduate-forbes-30-under-30">Forbes named Wang to its 30 Under 30 in Science for 2025</a> for his dissertation. Wang was one of 15 Yellow Jackets included in nine different 30 Under 30 lists and the only Georgia Tech-affiliated individual on the 30 Under 30 in Science list.</p><p>While a Georgia Tech student, Wang earned recognition from big names in business and technology. He received the&nbsp;<a href="https://www.cc.gatech.edu/news/student-named-apple-scholar-connecting-people-machine-learning">Apple Scholars in AI/ML Ph.D. Fellowship in 2023</a> and was in the&nbsp;<a href="https://www.cc.gatech.edu/news/georgia-tech-machine-learning-students-earn-jp-morgan-ai-phd-fellowships">2022 cohort of the J.P. Morgan AI Ph.D. Fellowships Program</a>.</p><p>Along with the CHI award, Wang’s dissertation earned him awards this year at banquets across campus. 
The&nbsp;<a href="https://bpb-us-e1.wpmucdn.com/sites.gatech.edu/dist/0/283/files/2025/03/2025-Sigma-Xi-Research-Award-Winners.pdf">Georgia Tech chapter of Sigma Xi presented Wang with the Best Ph.D. Thesis Award</a>. He also received the College of Computing’s Outstanding Dissertation Award.</p><p>“Georgia Tech attracts many great minds, and I’m glad that some, like Jay, chose to join our group,” Chau said. “It has been a joy to work alongside them and witness the many wonderful things they have accomplished, and with many more to come in their careers.”</p>]]></body>  <author>Bryant Wine</author>  <status>1</status>  <created>1745331886</created>  <gmt_created>2025-04-22 14:24:46</gmt_created>  <changed>1745332147</changed>  <gmt_changed>2025-04-22 14:29:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[ Zijie (Jay) Wang (Ph.D. ML-CSE 2024) is a recipient of the 2025 Outstanding Dissertation Award from the Association for Computing Machinery Special Interest Group on Computer-Human Interaction (ACM SIGCHI).]]></teaser>  <type>news</type>  <sentence><![CDATA[ Zijie (Jay) Wang (Ph.D. ML-CSE 2024) is a recipient of the 2025 Outstanding Dissertation Award from the Association for Computing Machinery Special Interest Group on Computer-Human Interaction (ACM SIGCHI).]]></sentence>  <summary><![CDATA[<p>A Georgia Tech alum’s dissertation introduced ways to make artificial intelligence (AI) more accessible, interpretable, and accountable. Although it’s been a year since his doctoral defense,&nbsp;<a href="https://zijie.wang/"><strong>Zijie (Jay) Wang</strong></a>’s (Ph.D. ML-CSE 2024) work continues to resonate with researchers.</p><p>Wang is a recipient of the&nbsp;<a href="https://medium.com/sigchi/announcing-the-2025-acm-sigchi-awards-17c1feaf865f"><strong>2025 Outstanding Dissertation Award from the Association for Computing Machinery Special Interest Group on Computer-Human Interaction (ACM SIGCHI)</strong></a>. 
The award recognizes Wang for his lifelong work on democratizing human-centered AI.</p>]]></summary>  <dateline>2025-04-17T00:00:00-04:00</dateline>  <iso_dateline>2025-04-17T00:00:00-04:00</iso_dateline>  <gmt_dateline>2025-04-17 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Bryant Wine, Communications Officer<br><a href="mailto:bryant.wine@cc.gatech.edu">bryant.wine@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>676903</item>          <item>673947</item>      </media>  <hg_media>          <item>          <nid>676903</nid>          <type>image</type>          <title><![CDATA[Jay-Wang-SIGCHI-Dissertation-Award.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Jay-Wang-SIGCHI-Dissertation-Award.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/04/22/Jay-Wang-SIGCHI-Dissertation-Award.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/04/22/Jay-Wang-SIGCHI-Dissertation-Award.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/04/22/Jay-Wang-SIGCHI-Dissertation-Award.jpg?itok=BwjW7CxH]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Zijie (Jay) Wang CHI 2025]]></image_alt>                    <created>1745331896</created>          <gmt_created>2025-04-22 14:24:56</gmt_created>          <changed>1745331896</changed>          <gmt_changed>2025-04-22 14:24:56</gmt_changed>      </item>          <item>          <nid>673947</nid>          <type>image</type>          <title><![CDATA[Farsight CHI.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Farsight 
CHI.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/05/05/Farsight%20CHI.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/05/05/Farsight%20CHI.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/05/05/Farsight%2520CHI.jpg?itok=hWo1VxQt]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[CHI 2024 Farsight]]></image_alt>                    <created>1714954253</created>          <gmt_created>2024-05-06 00:10:53</gmt_created>          <changed>1714954253</changed>          <gmt_changed>2024-05-06 00:10:53</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.cc.gatech.edu/news/thesis-human-centered-ai-earns-honors-international-computing-organization]]></url>        <title><![CDATA[Thesis on Human-Centered AI Earns Honors from International Computing Organization]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50877"><![CDATA[School of Computational Science and Engineering]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="155"><![CDATA[Congressional Testimony]]></category>          <category tid="143"><![CDATA[Digital Media and Entertainment]]></category>          <category tid="131"><![CDATA[Economic Development and Policy]]></category>          <category tid="42911"><![CDATA[Education]]></category>          <category tid="144"><![CDATA[Energy]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category 
tid="42921"><![CDATA[Exhibitions]]></category>          <category tid="42891"><![CDATA[Georgia Tech Arts]]></category>          <category tid="179356"><![CDATA[Industrial Design]]></category>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="132"><![CDATA[Institute Leadership]]></category>          <category tid="194248"><![CDATA[International Education]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="148"><![CDATA[Music and Music Technology]]></category>          <category tid="149"><![CDATA[Nanotechnology and Nanoscience]]></category>          <category tid="42931"><![CDATA[Performances]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="151"><![CDATA[Policy, Social Sciences, and Liberal Arts]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>          <category tid="133"><![CDATA[Special Events and Guest Speakers]]></category>          <category tid="193157"><![CDATA[Student Honors and Achievements]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="155"><![CDATA[Congressional Testimony]]></term>          <term tid="143"><![CDATA[Digital Media and Entertainment]]></term>          <term tid="131"><![CDATA[Economic Development and Policy]]></term>          <term tid="42911"><![CDATA[Education]]></term>          <term tid="144"><![CDATA[Energy]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="42921"><![CDATA[Exhibitions]]></term>          <term 
tid="42891"><![CDATA[Georgia Tech Arts]]></term>          <term tid="179356"><![CDATA[Industrial Design]]></term>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="132"><![CDATA[Institute Leadership]]></term>          <term tid="194248"><![CDATA[International Education]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="148"><![CDATA[Music and Music Technology]]></term>          <term tid="149"><![CDATA[Nanotechnology and Nanoscience]]></term>          <term tid="42931"><![CDATA[Performances]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="151"><![CDATA[Policy, Social Sciences, and Liberal Arts]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>          <term tid="133"><![CDATA[Special Events and Guest Speakers]]></term>          <term tid="193157"><![CDATA[Student Honors and Achievements]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="166983"><![CDATA[School of Computational Science and Engineering]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term 
tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="679981">  <title><![CDATA[Deep Startups with S.K. Sharma: Transforming Music With AI and Data Science]]></title>  <uid>36436</uid>  <body><![CDATA[<p>CREATE-X is set to host its next Deep Startups panel event on Thursday, Jan. 30, at 7 p.m. in the Marcus Nanotechnology Building Rooms 1116–1118. The event will feature S.K. Sharma, former chief analytics and AI officer at Universal Music Group and an expert in AI, data science, and strategic analytics. During Deep Startups, Sharma will dive into startup development within the context of the music industry. Seating is limited. Students can <a href="https://gatech.campuslabs.com/engage/event/10832322">register for Deep Startups on Engage</a>. Faculty, staff, and the general public can <a href="https://www.eventbrite.com/e/deep-startups-sk-sharma-tickets-1205832149419?aff=dailydigest">register for Deep Startups on Eventbrite</a>.&nbsp;</p><p>Deep Startups is a series that brings together knowledgeable entrepreneurs and Startup Launch alumni from various business sectors to discuss their experiences forming companies that address significant, contemporary challenges. Attendees spend an informative evening discovering the intersection of technology and entrepreneurship.</p><p>From 2016 until recently, S.K. Sharma led a global team of Ph.D. data scientists, engineers, and strategists at Universal Music Group (UMG) to develop innovative and scalable solutions that drive real-time market insights and audience engagement. His leadership has been instrumental in creating differentiated intellectual property and market-leading capabilities in AI, machine learning, and prescriptive analytics, earning him multiple patents in marketing analytics.</p><p>Sharma's academic background includes a Ph.D. 
in chemical physics and physical chemistry from Caltech. His research has been published in numerous peer-reviewed journals, and he has held concurrent roles in academia and industry, including senior research scientist at Caltech's Beckman Institute. His corporate career includes significant positions such as vice president at Lehman Brothers, executive director at UBS, and vice president and partner at Mitchell Madison Group, where he advised global private equity funds and venture capital managers.</p><p>In addition to his role at UMG, Sharma is an entrepreneur in residence at UC San Diego's Office of Innovation and Commercialization, where he supports pioneering advancements in science and engineering. He is also an investor at Provisio Medical, a company revolutionizing endovascular procedures with its Sonic Lumen Tomography technology.</p><p>Sharma's contributions to the field of AI and analytics have been widely recognized. He was awarded <em>Billboard</em> magazine's 40 Under 40 and has been a commencement speaker at UC San Diego's Jacobs School of Engineering. His work in developing AI-driven marketing technologies has set new standards in the industry, ensuring compliance with global privacy regulations while driving significant improvements in marketing efficiency.</p><p>Attendees of Deep Startups will hear practical knowledge and actionable advice on entrepreneurship from Sharma. Each CREATE-X event is an opportunity to network, build ideas, and prepare for the <a href="https://create-x.gatech.edu/launch/startup-launch"><strong>Startup Launch</strong></a> program, which provides $5,000 in optional seed funding, $150,000 in in-kind services, mentorship, entrepreneurial workshops, networking events, and resources to help build and scale startups. Students, faculty, researchers, and alumni interested in developing their own startups are encouraged to apply. 
The deadline to&nbsp;<a href="https://create-x.gatech.edu/launch/startup-launch"><strong>apply for Startup Launch</strong></a>&nbsp;is March 17, 2025. Spots are limited. <a href="https://create-x.gatech.edu/launch/startup-launch"><strong>Apply now</strong></a>&nbsp;for a higher chance of acceptance and early feedback. If you have any questions about getting started, email us at create-x@groups.gatech.edu.</p>]]></body>  <author>bdurham31</author>  <status>1</status>  <created>1737991460</created>  <gmt_created>2025-01-27 15:24:20</gmt_created>  <changed>1737992637</changed>  <gmt_changed>2025-01-27 15:43:57</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[CREATE-X will host a Deep Startups fireside chat featuring S.K. Sharma, former chief analytics and AI officer at Universal Music Group, on Jan. 30, focusing on startup development in the music industry.]]></teaser>  <type>news</type>  <sentence><![CDATA[CREATE-X will host a Deep Startups fireside chat featuring S.K. Sharma, former chief analytics and AI officer at Universal Music Group, on Jan. 30, focusing on startup development in the music industry.]]></sentence>  <summary><![CDATA[<p>CREATE-X will host a Deep Startups fireside chat featuring S.K. Sharma, former chief analytics and AI officer at Universal Music Group, on Thursday, Jan. 30, at 7 p.m. in the Marcus Nanotechnology Building Rooms 1116–1118. During Deep Startups, Sharma will dive into startup development within the context of the music industry.
Sharma is a serial entrepreneur with four $100M+ exits for companies he either co-founded or where he served as an operational partner.</p>]]></summary>  <dateline>2025-01-27T00:00:00-05:00</dateline>  <iso_dateline>2025-01-27T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-01-27 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[breanna.durham@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Breanna Durham</p><p>Marketing Strategist</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>676143</item>      </media>  <hg_media>          <item>          <nid>676143</nid>          <type>image</type>          <title><![CDATA[Deep Startups: S.K. Sharma]]></title>          <body><![CDATA[<p><strong>Pictured S.K. Sharma Deep Startups Poster, with headshot and the following: S.K. Sharma, Former Chief Analytics and AI Officer at Universal Music Group,  Deep Startups, Jan. 30, 7p.m. Marcus Nano 1116-1118, Join CREATE-X for a discussion on developing startups with AI, data science, and strategic analytics, from a music business lens.</strong></p>]]></body>                      <image_name><![CDATA[Updated Deep Startups Jan. 
2025 Eventbrite (2160 x 1080 px) (1).png]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/01/27/Updated%20Deep%20Startups%20Jan.%202025%20Eventbrite%20%282160%20x%201080%20px%29%20%281%29.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/01/27/Updated%20Deep%20Startups%20Jan.%202025%20Eventbrite%20%282160%20x%201080%20px%29%20%281%29.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/01/27/Updated%2520Deep%2520Startups%2520Jan.%25202025%2520Eventbrite%2520%25282160%2520x%25201080%2520px%2529%2520%25281%2529.png?itok=4lguKN46]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Poster featuring S.K. Sharma, former Chief Analytics and AI Officer at Universal Music Group, promoting the Deep Startups event on January 30 at 7 p.m. in Marcus Nano Rooms 1116-1118. The event, hosted by CREATE-X, will discuss developing startups using AI, data science, and strategic analytics within the music industry]]></image_alt>                    <created>1737992458</created>          <gmt_created>2025-01-27 15:40:58</gmt_created>          <changed>1737992584</changed>          <gmt_changed>2025-01-27 15:43:04</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://gatech.campuslabs.com/engage/event/10832322]]></url>        <title><![CDATA[Deep Startups: S.K. Startups Student Registration]]></title>      </link>          <link>        <url><![CDATA[https://www.eventbrite.com/e/deep-startups-sk-sharma-tickets-1205832149419?aff=dailydigest]]></url>        <title><![CDATA[Deep Startups: S.K. 
Sharma Public Registration]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="583966"><![CDATA[CREATE-X]]></group>          <group id="655285"><![CDATA[GT Commercialization]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="139"><![CDATA[Business]]></category>          <category tid="42911"><![CDATA[Education]]></category>          <category tid="148"><![CDATA[Music and Music Technology]]></category>          <category tid="133"><![CDATA[Special Events and Guest Speakers]]></category>      </categories>  <news_terms>          <term tid="139"><![CDATA[Business]]></term>          <term tid="42911"><![CDATA[Education]]></term>          <term tid="148"><![CDATA[Music and Music Technology]]></term>          <term tid="133"><![CDATA[Special Events and Guest Speakers]]></term>      </news_terms>  <keywords>          <keyword tid="2835"><![CDATA[ai]]></keyword>          <keyword tid="92811"><![CDATA[data science]]></keyword>          <keyword tid="194259"><![CDATA[startup development]]></keyword>          <keyword tid="59661"><![CDATA[music industry]]></keyword>          <keyword tid="3472"><![CDATA[entrepreneurship]]></keyword>          <keyword tid="137161"><![CDATA[CREATE-X]]></keyword>          <keyword tid="194260"><![CDATA[S.K.
Sharma]]></keyword>          <keyword tid="194261"><![CDATA[Universal Music Group]]></keyword>          <keyword tid="1144"><![CDATA[networking]]></keyword>          <keyword tid="341"><![CDATA[innovation]]></keyword>          <keyword tid="623"><![CDATA[Technology]]></keyword>          <keyword tid="194262"><![CDATA[event registration]]></keyword>          <keyword tid="14601"><![CDATA[mentorship]]></keyword>          <keyword tid="167944"><![CDATA[seed funding]]></keyword>          <keyword tid="194228"><![CDATA[entrepreneurial workshops]]></keyword>          <keyword tid="2161"><![CDATA[founders]]></keyword>      </keywords>  <core_research_areas>          <term tid="193658"><![CDATA[Commercialization]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="675196">  <title><![CDATA[Middle Schoolers’ Feedback Informs New Approach to AI-based Museum Exhibits]]></title>  <uid>36530</uid>  <body><![CDATA[<p>Researchers at Georgia Tech are creating accessible museum exhibits that explain artificial intelligence (AI) to middle school students, including the LuminAI interactive AI-based dance partner developed by Regents' Professor Brian Magerko.</p><p>Ph.D. students Yasmine Belghith and Atefeh Mahdavi co-led a study in a museum setting that observed how middle schoolers interact with the popular AI chatbot ChatGPT.&nbsp;</p><p>“It’s important for museums, especially science museums, to start incorporating these kinds of exhibits about AI and about using AI so the general population can have that avenue to interact with it and transfer that knowledge to everyday tools,” Belghith said.</p><p>Belghith and Mahdavi conducted their study with nine focus groups of 24 students at Chicago’s <a href="https://www.msichicago.org/"><strong>Museum of Science and Industry</strong></a>. 
The team used the findings to inform their design of AI exhibits that the museum could display as early as 2025.&nbsp;</p><p>Belghith is a Ph.D. student in human-centered computing. Her advisor is Assistant Professor Jessica Roberts in the School of Interactive Computing. Magerko advises Mahdavi, a Ph.D. student in digital media in the School of Literature, Media, and Communication.</p><p>Belghith and Mahdavi presented a paper about their study in May at the Association for Computing Machinery (ACM) 2024 Conference on Human Factors in Computing Systems (CHI) in Honolulu, Hawaii.</p><p>Their work is part of a National Science Foundation (NSF) grant dedicated to fostering AI literacy among middle schoolers in informal environments.</p><h4><strong>Expanding Accessibility</strong></h4><p>While there are existing efforts to reach students in the classroom, the researchers believe AI education is most accessible in informal learning environments like museums.</p><p>“There’s a need today for everybody to have some sort of AI literacy,” Belghith said. “Many middle schoolers will not be taking computer science courses or pursuing computer science careers, so there needs to be interventions to teach them what they should know about AI.”</p><p>The researchers found that most of the middle schoolers interacted with ChatGPT to either test its knowledge by prompting it to answer questions or socialize with it by having human-like conversations.&nbsp;</p><p>Others fit the mold of “content explorers.” They did not engage with the AI aspect of ChatGPT and focused more on the content it produced.</p><p>Mahdavi said regardless of their approach, students would get “tunnel vision” in their interactions instead of exploring more of the AI’s capabilities.</p><p>“If they go in a certain direction, they will continue to explore that,” Mahdavi said. 
“One thing we can learn from this is to nudge kids and show them there are other things you can do with AI tools or get them to think about it another way.”</p><p>The researchers also paid attention to what was missing in the students’ responses, which Mahdavi said was just as important as what they did talk about.</p><p>“None of them mentioned anything about ethics or what could be problematic about AI,” she said. “That told us there’s something they aren’t thinking about but should be. We take that into account as we think about future exhibits.”</p><h4><strong>Making an Impact</strong></h4><p>The researchers visited the Museum of Science and Industry June 1-2 to conduct the first trial run of three AI-based exhibits they’ve created. One of them is LuminAI, which was developed in <a href="https://expressivemachinery.gatech.edu/"><strong>Magerko’s Expressive Machinery Lab</strong></a>.</p><p>LuminAI is an interactive art installation that allows people to engage in collaborative movement with an AI dance partner. Georgia Tech and Kennesaw State recently held the <a href="https://www.kennesaw.edu/arts/news/posts/lumin_ai_performance_collaboration.php"><strong>first performance</strong></a> of AI avatars dancing with human partners in front of a live audience.</p><p>Duri Long, a former Georgia Tech Ph.D. student who is now an assistant professor at Northwestern University, designed the second exhibit. KnowledgeNet is an interactive tabletop exhibit in which visitors build semantic networks by adding different characteristics to characters that interact together.</p><p>The third exhibit, Data Bites, prompts users to build datasets of pizzas and sandwiches. Their selections train a machine-learning classifier in real time.</p><p>Belghith said the exhibits fostered conversations about AI between parents and children.</p><p>“The exhibit prototypes successfully engaged children in creative activities,” she said. 
“Many parents had to pull their kids away to continue their museum tour because the kids wanted more time to try different creations or dance moves.”</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1719255805</created>  <gmt_created>2024-06-24 19:03:25</gmt_created>  <changed>1721225131</changed>  <gmt_changed>2024-07-17 14:05:31</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Partnering with Chicago's Museum of Science and Industry, researchers at Georgia Tech are creating accessible museum exhibits that explain artificial intelligence (AI) to middle school students.]]></teaser>  <type>news</type>  <sentence><![CDATA[Partnering with Chicago's Museum of Science and Industry, researchers at Georgia Tech are creating accessible museum exhibits that explain artificial intelligence (AI) to middle school students.]]></sentence>  <summary><![CDATA[<p>Researchers at Georgia Tech are creating accessible museum exhibits that explain artificial intelligence (AI) to middle school students, including the LuminAI interactive AI-based dance partner developed by Regents' Professor Brian Magerko.</p><p>Ph.D. students Yasmine Belghith and Atefeh Mahdavi co-led a study in a museum setting that observed how middle schoolers interact with the popular AI chatbot ChatGPT.&nbsp;</p><p>Belghith and Mahdavi conducted their study with nine focus groups of 24 students at Chicago’s <a href="https://www.msichicago.org/"><strong>Museum of Science and Industry</strong></a>.
The team used the findings to inform their design of AI exhibits that the museum could display as early as 2025.&nbsp;</p>]]></summary>  <dateline>2024-06-21T00:00:00-04:00</dateline>  <iso_dateline>2024-06-21T00:00:00-04:00</iso_dateline>  <gmt_dateline>2024-06-21 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Nathan Deen</p><p>Communications Officer I</p><p>School of Interactive Computing</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>674234</item>      </media>  <hg_media>          <item>          <nid>674234</nid>          <type>image</type>          <title><![CDATA[RS5939_COTA_240502_AIDance_MY_0368.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[RS5939_COTA_240502_AIDance_MY_0368.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/06/24/RS5939_COTA_240502_AIDance_MY_0368.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/06/24/RS5939_COTA_240502_AIDance_MY_0368.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/06/24/RS5939_COTA_240502_AIDance_MY_0368.jpg?itok=2UhdHxf2]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[LuminAI performance]]></image_alt>                    <created>1719255844</created>          <gmt_created>2024-06-24 19:04:04</gmt_created>          <changed>1719255844</changed>          <gmt_changed>2024-06-24 19:04:04</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>          <group 
id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="42901"><![CDATA[Community]]></category>          <category tid="42911"><![CDATA[Education]]></category>          <category tid="42921"><![CDATA[Exhibitions]]></category>          <category tid="42891"><![CDATA[Georgia Tech Arts]]></category>          <category tid="148"><![CDATA[Music and Music Technology]]></category>      </categories>  <news_terms>          <term tid="42901"><![CDATA[Community]]></term>          <term tid="42911"><![CDATA[Education]]></term>          <term tid="42921"><![CDATA[Exhibitions]]></term>          <term tid="42891"><![CDATA[Georgia Tech Arts]]></term>          <term tid="148"><![CDATA[Music and Music Technology]]></term>      </news_terms>  <keywords>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="4299"><![CDATA[middle school]]></keyword>          <keyword tid="193070"><![CDATA[AI education]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="675081">  <title><![CDATA[Mandarin Remix: Georgia Tech Researcher Finds Chinese Rap is Defying Linguistic Tradition]]></title>  <uid>35766</uid>  <body><![CDATA[<div><p><a href="https://sites.gatech.edu/liu/" rel="noreferrer noopener" target="_blank">Jin Liu,</a> an associate professor of Chinese in the School of Modern Languages, created an algorithm to analyze tone use in Chinese rap songs. 
She also worked with students in the College of Computing to integrate linguistics and cultural studies with computer science and statistics.&nbsp;</p></div><div><h2>Why it matters&nbsp;</h2></div><div><p>"Technology provides empirical and quantitative evidence or data to verify human intuitive perceptions of art," Liu explained. "This is an interdisciplinary project that extends beyond the range of any single researcher's knowledge and expertise."&nbsp;</p></div><div><p>Her co-authors on the paper included Amanda She (CS 2023), Haosong Ma (MS CS 2022), Jiahong Yuan, a professor at the University of Science and Technology of China, and Hongyuan Dong, an associate professor of Chinese language and linguistics at George Washington University.&nbsp;</p></div><div><p>"My field is linguistics and cultural studies, and we needed to collaborate with people in computer science to develop an algorithm and write a program to compute the Tonal Congruence Index," Liu said. "People in computer science need our expertise in linguistics and computational phonetics to develop different rules and criteria to distinguish the tones and learn the related pitch software."&nbsp;</p></div><div><h2>More About the Study&nbsp;</h2></div><div><p>Unlike English, Mandarin Chinese is a tonal language, meaning that the same word pronounced with different pitches can have different meanings.&nbsp;</p></div><div><p>In Liu's study, she found that while Chinese rappers used standard tones in the earlier rap styles, artists following more recent trends — such as trap music and mumble rap — are more likely to change or ignore Chinese tones to better fit the hip-hop style in their work.&nbsp;</p></div><div><p>Liu and her co-authors also report that Chinese rappers now use more English words in their work to achieve musical flow.&nbsp;</p></div><div><p>"As hip-hop music continues to diversify, the correlation between tone and rap gradually decreases," Liu said.&nbsp;</p></div><div><h2>What's
Next&nbsp;</h2></div><div><p>The researcher's next step is to develop an open-source tool to further automate the audio-processing pipeline.&nbsp;</p></div><div><p>Liu's work also has educational applications. For example, she recently presented "AI and Tones in Chinese Songs" at a symposium on education in the age of artificial intelligence.&nbsp;</p></div><div><p>"Language instructors should teach songs with a high score on the Tonal Congruence Index because tonal shapes and contrasts are better accommodated in them," she said.&nbsp;</p></div><div><p><em>This research is partially supported by the Ivan Allen College of Liberal Arts Small Grants for Research funding at Georgia Tech. </em><a href="https://www.tandfonline.com/doi/full/10.1080/09298215.2024.2329075" rel="noreferrer noopener" target="_blank"><em>"Linguistic Tone in Chinese Rap: An Interdisciplinary Approach"</em></a><em> was published in the Journal of New Music Research in March 2024. It is available at: </em><a href="https://www.tandfonline.com/doi/full/10.1080/09298215.2024.2329075" rel="noreferrer noopener" target="_blank"><em>https://www.tandfonline.com/doi/full/10.1080/09298215.2024.2329075</em></a>&nbsp;</p></div>]]></body>  <author>dminardi3</author>  <status>1</status>  <created>1718123559</created>  <gmt_created>2024-06-11 16:32:39</gmt_created>  <changed>1718373361</changed>  <gmt_changed>2024-06-14 13:56:01</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Chinese rap is straying further from standard Mandarin and using more English, according to Associate Professor Jin Liu in the School of Modern Languages. She worked with students in the College of Computing to create tools to uncover the trends.]]></teaser>  <type>news</type>  <sentence><![CDATA[Chinese rap is straying further from standard Mandarin and using more English, according to Associate Professor Jin Liu in the School of Modern Languages.
She worked with students in the College of Computing to create tools to uncover the trends.]]></sentence>  <summary><![CDATA[<p>Chinese rap is straying further from standard Mandarin and using more English, according to Associate Professor Jin Liu in the School of Modern Languages. She worked with students in the College of Computing to create computational tools to uncover the trends.&nbsp;</p>]]></summary>  <dateline>2024-06-11T00:00:00-04:00</dateline>  <iso_dateline>2024-06-11T00:00:00-04:00</iso_dateline>  <gmt_dateline>2024-06-11 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[dminardi3@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:dminardi3@gatech.edu">Di Minardi</a><br>Ivan Allen College of Liberal Arts</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>674169</item>      </media>  <hg_media>          <item>          <nid>674169</nid>          <type>image</type>          <title><![CDATA[ChineseRap.png]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[1600 x 900.png]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/06/11/1600%20x%20900.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/06/11/1600%20x%20900.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/06/11/1600%2520x%2520900.png?itok=Lk1mq1gB]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Image of a performer from behind, singing on stage in front of a crowd.]]></image_alt>                    <created>1718123707</created>          <gmt_created>2024-06-11 16:35:07</gmt_created>          <changed>1718123707</changed>          <gmt_changed>2024-06-11 16:35:07</gmt_changed>      
</item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1281"><![CDATA[Ivan Allen College of Liberal Arts]]></group>          <group id="1284"><![CDATA[School of Modern Languages]]></group>      </groups>  <categories>          <category tid="148"><![CDATA[Music and Music Technology]]></category>      </categories>  <news_terms>          <term tid="148"><![CDATA[Music and Music Technology]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node></nodes>