<node id="681671">
  <nid>681671</nid>
  <type>news</type>
  <uid>
    <user id="36734"><![CDATA[36734]]></user>
  </uid>
  <created>1744137049</created>
  <changed>1745336273</changed>
  <title><![CDATA[Faculty, Students Pilot AI Crisis Simulation]]></title>
  <body><![CDATA[<div><div><p>Researchers from Georgia Tech and the Georgia Tech Research Institute (<a href="https://gtri.gatech.edu">GTRI</a>) recently piloted an in-depth crisis simulation exploring the national security implications of advanced artificial intelligence. Designed by the <a href="https://www.aisi.dev/" rel="noreferrer noopener" target="_blank">AI Safety Initiative</a> in collaboration with <a href="https://gtmun.gatech.edu/" rel="noreferrer noopener" target="_blank">Model UN at Georgia Tech</a>, the immersive half-day workshop challenged faculty and students to respond to a series of escalating threats — including a potential biological attack, cyberattacks, and rising global tensions.</p></div><div><p>Participants represented major governments, corporations, and organizations — including OpenAI and Google DeepMind — and were inundated with simulated press releases and intelligence reports describing the rapid evolution of AI technologies. Their task: to debate and coordinate policy responses in real time.</p></div><div><p>In one scenario, a preliminary World Health Organization report revealed AI-enabled pathogens spreading across Central Asia. The player representing China quickly moved to close borders and reimpose pandemic-era lockdowns, a move that caused global confusion and economic instability.</p></div><div><p>“There’s just no way I could have predicted that response,” said Parv Mahajan, the director of the simulation. “But that kind of extreme response tells us so much about how unprepared countries might react.”</p></div><div><p>Divjot Kaur, who constructed the simulated documents participants received throughout the workshop, agreed. “This valuable information can shed light on the research and work we must put in,” she said.</p></div><div><p>Some players took advantage of the chaos. The simulation concluded with a discussion about how profit motives might distort information access and accelerate a potential AI arms race.</p></div><div><p>What stood out most to participants was the range of ideas that emerged during the crisis. “It was great to see the perspectives of diverse disciplines on the future of AI,” said Amaar Alidina, an undergraduate researcher. “Debate provided meaningful insight on topics we wouldn’t even have thought of,” Kaur added.</p></div><div><p>Looking ahead, the AI Safety Initiative hopes to expand the simulation through collaborations with labs and departments across campus.</p></div><div><p>“The future of our work will depend, in some way or another, on AI,” said Mahajan. “And the best way to understand the future is to try and experience it.”</p></div></div>]]></body>
  <field_subtitle>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_subtitle>
  <field_dateline>
    <item>
      <value>2025-04-07T00:00:00-04:00</value>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_dateline>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Researchers explore national security risks posed by advanced AI through a high-stakes strategic exercise.]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>In a simulation from Georgia Tech and GTRI, participants navigated escalating global crises — including AI-enabled biothreats and cyberattacks — to assess how different actors might respond to emerging AI risks.</p>]]></value>
    </item>
  </field_summary>
  <field_media>
          <item>
        <nid>
          <node id="676793">
            <nid>676793</nid>
            <type>image</type>
            <title><![CDATA[DSC04327.jpg]]></title>
            <body><![CDATA[]]></body>
                          <field_image>
                <item>
                  <fid>260634</fid>
                  <filename><![CDATA[DSC04327.jpg]]></filename>
                  <filepath><![CDATA[/sites/default/files/2025/04/08/DSC04327_0.jpg]]></filepath>
                  <file_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/04/08/DSC04327_0.jpg]]></file_full_path>
                  <filemime>image/jpeg</filemime>
                  <image_740><![CDATA[]]></image_740>
                  <image_alt><![CDATA[Man with OpenAI placard listens carefully to speech.]]></image_alt>
                </item>
              </field_image>
            
                      </node>
        </nid>
      </item>
          <item>
        <nid>
          <node id="676794">
            <nid>676794</nid>
            <type>image</type>
            <title><![CDATA[DSC04279.jpg]]></title>
            <body><![CDATA[]]></body>
                          <field_image>
                <item>
                  <fid>260635</fid>
                  <filename><![CDATA[DSC04279.jpg]]></filename>
                  <filepath><![CDATA[/sites/default/files/2025/04/08/DSC04279_0.jpg]]></filepath>
                  <file_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/04/08/DSC04279_0.jpg]]></file_full_path>
                  <filemime>image/jpeg</filemime>
                  <image_740><![CDATA[]]></image_740>
                  <image_alt><![CDATA[Man with "Other Researchers and the Press" placard studies documents.]]></image_alt>
                </item>
              </field_image>
            
                      </node>
        </nid>
      </item>
      </field_media>
  <field_contact_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_contact_email>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_contact>
    <item>
      <value><![CDATA[<p dir="ltr">AI Safety Initiative<br><a href="mailto:board@aisi.dev">board@aisi.dev</a></p><p dir="ltr">Georgia Tech Model UN<br><a href="mailto:gatechmun@gmail.com">gatechmun@gmail.com</a></p>]]></value>
    </item>
  </field_contact>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related> </links_related>
  <files> </files>
  <og_groups>
          <item>660394</item>
          <item>1214</item>
      </og_groups>
  <og_groups_both>
          <item>
        <![CDATA[Biotechnology, Health, Bioengineering, Genetics]]>
      </item>
          <item>
        <![CDATA[Computer Science/Information Technology and Security]]>
      </item>
          <item>
        <![CDATA[Economic Development and Policy]]>
      </item>
          <item>
        <![CDATA[Military Technology]]>
      </item>
          <item>
        <![CDATA[Policy, Social Sciences, and Liberal Arts]]>
      </item>
          <item>
        <![CDATA[Research]]>
      </item>
          <item>
        <![CDATA[Student and Faculty]]>
      </item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>138</tid>
        <value><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></value>
      </item>
          <item>
        <tid>153</tid>
        <value><![CDATA[Computer Science/Information Technology and Security]]></value>
      </item>
          <item>
        <tid>131</tid>
        <value><![CDATA[Economic Development and Policy]]></value>
      </item>
          <item>
        <tid>147</tid>
        <value><![CDATA[Military Technology]]></value>
      </item>
          <item>
        <tid>151</tid>
        <value><![CDATA[Policy, Social Sciences, and Liberal Arts]]></value>
      </item>
          <item>
        <tid>135</tid>
        <value><![CDATA[Research]]></value>
      </item>
          <item>
        <tid>134</tid>
        <value><![CDATA[Student and Faculty]]></value>
      </item>
      </field_categories>
  <core_research_areas>
          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>
          <term tid="145171"><![CDATA[Cybersecurity]]></term>
          <term tid="193653"><![CDATA[Georgia Tech Research Institute]]></term>
          <term tid="39481"><![CDATA[National Security]]></term>
          <term tid="39501"><![CDATA[People and Technology]]></term>
      </core_research_areas>
  <field_news_room_topics>
          <item>
        <tid>71871</tid>
        <value><![CDATA[Campus and Community]]></value>
      </item>
      </field_news_room_topics>
  <og_groups_both>
          <item><![CDATA[AI Safety Initiative (AISI)]]></item>
          <item><![CDATA[News Room]]></item>
      </og_groups_both>
  <field_keywords>
          <item>
        <tid>194465</tid>
        <value><![CDATA[AI Safety]]></value>
      </item>
          <item>
        <tid>2835</tid>
        <value><![CDATA[ai]]></value>
      </item>
          <item>
        <tid>187812</tid>
        <value><![CDATA[artificial intelligence (AI)]]></value>
      </item>
          <item>
        <tid>184285</tid>
        <value><![CDATA[Georgia Tech Ivan Allen College of Liberal Arts; school of public policy]]></value>
      </item>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
