<node id="683686">
  <nid>683686</nid>
  <type>news</type>
  <uid>
    <user id="27863"><![CDATA[27863]]></user>
  </uid>
  <created>1754681755</created>
  <changed>1755008137</changed>
  <title><![CDATA[Research Combining Humans, Robots, and Unicycles Receives NSF Award]]></title>
  <body><![CDATA[<p>Research into tailored assistive and rehabilitative devices has advanced in recent years, but the goal remains out of reach because data on how humans learn complex balance tasks are sparse. To address this gap, an interdisciplinary team of faculty from Florida State University and Georgia Tech has been awarded approximately $798,000 by the NSF to launch a study of human motor learning and of human-robot interaction dynamics during the learning process.</p><p>Led by PI <a href="https://rthmlab.wixsite.com/taylorgambon">Taylor Higgins</a>, Assistant Professor, FAMU-FSU Department of Mechanical Engineering, with Co-PIs <a href="https://www.shreyaskousik.com/">Shreyas Kousik</a>, Assistant Professor, Georgia Tech, George W. Woodruff School of Mechanical Engineering, and <a href="https://annescollege.fsu.edu/faculty-staff/dr-brady-decouto">Brady DeCouto</a>, Assistant Professor, FSU Anne Spencer Daves College of Education, Health, and Human Sciences, the research will track participants as they learn to ride a unicycle in order to better understand human motor learning in tasks that require balance and complex movement through space. Although it may sound odd, unicycling is well suited to the study: most people cannot already ride one, and doing so demands balance, so the data will capture the learning process from novice to skilled across the participant pool.</p><p>Using data acquired from human participants, the team will develop a "robotic assistive unicycle" to train the next pool of novice riders and gauge whether, and how quickly, human motor learning outcomes improve with robotic assistance. Participants who engage with the robotic unicycle will also provide valuable insight for developing effective human-robot collaboration strategies.</p><p>The fact that getting on a unicycle takes a bit of bravery may be daunting for participants, but it is an asset for the research team: the project will also explore the interconnection between anxiety and human motor learning to identify possible alleviation strategies, increasing the likelihood of positive outcomes for future patients and consumers of these devices.</p><p>Author<br>-Christa M. Ernst</p><p>This article refers to NSF Award # 2449160</p>]]></body>
  <field_subtitle>
    <item>
      <value><![CDATA[Trio from Florida State University and Georgia Tech aims to develop better assistive and rehabilitative technologies and strategies using a novel approach.]]></value>
    </item>
  </field_subtitle>
  <field_dateline>
    <item>
      <value>2025-08-08T00:00:00-04:00</value>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_dateline>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Novel research to improve tailored assistive and rehabilitative devices wins NSF grant]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>An interdisciplinary team of faculty from Florida State University and Georgia Tech has been awarded approximately $798,000 by the NSF to launch a study of human motor learning and of human-robot interaction dynamics during the learning process.</p>]]></value>
    </item>
  </field_summary>
  <field_media>
          <item>
        <nid>
          <node id="677632">
            <nid>677632</nid>
            <type>image</type>
            <title><![CDATA[Kousik-NSF-Award-News-Graphic.png]]></title>
            <body><![CDATA[]]></body>
                          <field_image>
                <item>
                  <fid>261548</fid>
                  <filename><![CDATA[Kousik-NSF-Award-News-Graphic.png]]></filename>
                  <filepath><![CDATA[/sites/default/files/2025/08/08/Kousik-NSF-Award-News-Graphic.png]]></filepath>
                  <file_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/08/08/Kousik-NSF-Award-News-Graphic.png]]></file_full_path>
                  <filemime>image/png</filemime>
                  <image_740><![CDATA[]]></image_740>
                  <image_alt><![CDATA[Graphic of a person using an assistive device thinking about how a robot could help them learn to ride a unicycle]]></image_alt>
                </item>
              </field_image>
            
                      </node>
        </nid>
      </item>
      </field_media>
  <field_contact_email>
    <item>
      <email><![CDATA[christa.ernst@research.gatech.edu]]></email>
    </item>
  </field_contact_email>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_contact>
    <item>
      <value><![CDATA[<div><strong>Christa M. Ernst</strong></div><div>Research Communications Program Manager</div><div>Klaus Advanced Computing Building 1120E | 266 Ferst Drive | Atlanta GA | 30332</div><div><strong>Topic Expertise: Robotics | Data Sciences | Semiconductor Design &amp; Fab</strong></div><div>christa.ernst@research.gatech.edu</div>]]></value>
    </item>
  </field_contact>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <!--  TO DO: correct to not conflate categories and news room topics  -->
  <links_related> </links_related>
  <files> </files>
  <og_groups>
          <item>545781</item>
          <item>142761</item>
          <item>1188</item>
      </og_groups>
  <og_groups_both>
          <item>
        <![CDATA[Artificial Intelligence]]>
      </item>
          <item>
        <![CDATA[Biotechnology, Health, Bioengineering, Genetics]]>
      </item>
          <item>
        <![CDATA[Engineering]]>
      </item>
          <item>
        <![CDATA[Research]]>
      </item>
          <item>
        <![CDATA[Robotics]]>
      </item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>194606</tid>
        <value><![CDATA[Artificial Intelligence]]></value>
      </item>
          <item>
        <tid>138</tid>
        <value><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></value>
      </item>
          <item>
        <tid>145</tid>
        <value><![CDATA[Engineering]]></value>
      </item>
          <item>
        <tid>135</tid>
        <value><![CDATA[Research]]></value>
      </item>
          <item>
        <tid>152</tid>
        <value><![CDATA[Robotics]]></value>
      </item>
      </field_categories>
  <core_research_areas>
          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>
          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>
          <term tid="193656"><![CDATA[Neuro Next Initiative]]></term>
          <term tid="39521"><![CDATA[Robotics]]></term>
      </core_research_areas>
  <field_news_room_topics>
      </field_news_room_topics>
  <og_groups_both>
          <item><![CDATA[Institute for Data Engineering and Science]]></item>
          <item><![CDATA[IRIM]]></item>
          <item><![CDATA[Research Horizons]]></item>
      </og_groups_both>
  <field_keywords>
          <item>
        <tid>78841</tid>
        <value><![CDATA[human-robot interaction]]></value>
      </item>
          <item>
        <tid>5525</tid>
        <value><![CDATA[assistive technologies]]></value>
      </item>
          <item>
        <tid>187915</tid>
        <value><![CDATA[go-researchnews]]></value>
      </item>
          <item>
        <tid>187582</tid>
        <value><![CDATA[go-ibb]]></value>
      </item>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
