<nodes> <node id="688391">  <title><![CDATA[Robot Pollinator Could Produce More, Better Crops for Indoor Farms]]></title>  <uid>36530</uid>  <body><![CDATA[<p>A new robot could solve one of the biggest challenges facing indoor farmers: manual pollination.</p><p>Indoor farms, also known as vertical farms, are popular among agricultural researchers and are expanding across the agricultural industry. Some benefits they have over outdoor farms include:</p><ul><li>Year-round production of food crops</li><li>Lower water and land requirements</li><li>No need for pesticides</li><li>Reduced carbon emissions from shipping</li><li>Reduced food waste</li></ul><p>Additionally,&nbsp;<a href="https://www.agritecture.com/blog/2021/7/20/5-ways-vertical-farming-is-improving-nutrition"><strong>some studies</strong></a> indicate that indoor farms produce more nutritious food for urban communities.&nbsp;</p><p>However, these farms are often inaccessible to birds, bees, and other natural pollinators, leaving the pollination process to humans. The tedious process must be completed by hand for each flower to ensure the indoor crop flourishes.</p><p><a href="https://research.gatech.edu/people/ai-ping-hu"><strong>Ai-Ping Hu</strong></a>, a principal research engineer at the Georgia Tech Research Institute (GTRI), has spent years exploring methods to efficiently pollinate flowering plants and food crops in indoor farms.</p><p>Hu,&nbsp;<a href="https://research.gatech.edu/people/shreyas-kousik"><strong>Assistant Professor Shreyas Kousik of the George W. Woodruff School of Mechanical Engineering</strong></a>, and a rotating group of student interns have developed a robot prototype that may be up to the task.</p><p>The robot can efficiently pollinate plants that have both male and female reproductive parts. These plants only require pollen to be transferred from one part to the other rather than externally from another flower.</p><p>Natural pollinators perform this task outdoors, but Hu said indoor farmers often use a paintbrush or electric toothbrush to ensure these flowers are pollinated.&nbsp;</p><h4><strong>Knowing the Pose</strong></h4><p>An early challenge the research team addressed was teaching the robot to identify the “pose” of each flower. Pose refers to a flower’s orientation, shape, and symmetry. Knowing these details ensures precise delivery of the pollen to maximize reproductive success.&nbsp;</p><p>“It’s crucial to know exactly which way the flowers are facing,” Hu said.</p><p>“You want to approach the flower from the front because that’s where all the biological structures are. Knowing the pose tells you where the stem is. Our device grasps the stem and shakes it to dislodge the pollen.</p><p>“Every flower is going to have its own pose, and you need to know what that is within at least 10 degrees.”</p><h4><strong>Computer Vision Breakthrough</strong></h4><p><strong>Harsh Muriki</strong>, a robotics master’s student in Georgia Tech’s School of Interactive Computing, used computer vision to solve the pose problem while interning with Hu at GTRI.</p><p>Muriki attached a camera to a FarmBot to capture images of strawberry plants from dozens of angles in a small garden in front of Georgia Tech’s Food Processing Technology Building. The&nbsp;<a href="https://farm.bot/?srsltid=AfmBOoqh1Z8vSs3WflZisgw5DsOUSo8shD4VtY0Y8_VmVpVyt0Iwalxo"><strong>FarmBot</strong></a> is an XYZ-axis robot that waters and sprays pesticides on outdoor gardens, though it is not capable of pollination.</p><p>“We reconstruct the images of the flower into a 3D model and use a technique that converts the 3D model into multiple 2D images with depth information,” Muriki said. “This enables us to send them to object detectors.”</p><p>Muriki said he used a real-time object detection system called YOLO (You Only Look Once) to classify objects. YOLO is known for identifying and classifying objects in a single pass.</p>
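<p>As a rough illustration of that pipeline (render a reconstructed 3D model into many 2D views with depth, then run a single-pass detector on each view), the sketch below pairs a projected point cloud with an off-the-shelf YOLO model. It is a hedged sketch of the general idea, not the team’s actual code: the file names, camera parameters, and detector weights are all placeholders.</p><pre><code># Illustrative sketch of the 3D-to-2D detection step -- not the team's
# code. Assumes a reconstructed point cloud of the plant and the
# off-the-shelf Ultralytics YOLO package; all paths are placeholders.
import numpy as np
from ultralytics import YOLO

def render_depth_view(points, angle_deg, size=640, f=600.0):
    """Rotate a point cloud about the vertical axis and project it with
    a pinhole camera, keeping per-pixel depth (a crude 2D+depth view)."""
    t = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(t), 0.0, np.sin(t)],
                    [0.0, 1.0, 0.0],
                    [-np.sin(t), 0.0, np.cos(t)]])
    p = points @ rot.T
    p = p[p[:, 2] > 0.1]                        # keep points in front of camera
    u = (f * p[:, 0] / p[:, 2] + size / 2).astype(int)
    v = (f * p[:, 1] / p[:, 2] + size / 2).astype(int)
    ok = (u >= 0) & (u < size) & (v >= 0) & (v < size)
    depth = np.full((size, size), np.inf)
    np.minimum.at(depth, (v[ok], u[ok]), p[ok, 2])   # simple z-buffer
    return depth

cloud = np.load("strawberry_cloud.npy")     # placeholder reconstruction
detector = YOLO("flower_detector.pt")       # placeholder fine-tuned weights
for angle in range(0, 360, 30):             # render dozens of viewpoints
    depth = render_depth_view(cloud, angle)
    # Encode depth as an 8-bit, 3-channel image the detector can consume.
    img = np.uint8(255 * (1.0 - np.clip(depth / 2.0, 0.0, 1.0)))
    detections = detector(np.dstack([img, img, img]))
    # Each detection's viewpoint plus its box constrains the flower pose.
</code></pre>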
<p><strong>Ved Sengupta</strong>, a computer engineering major who interned with Muriki, fine-tuned the algorithms that converted 3D images into 2D.</p><p>“This was a crucial part of making robot pollination possible,” Sengupta said. “There is a big gap between 3D and 2D image processing.</p><p>“There’s not a lot of data on the internet for 3D object detection, but there’s a ton for 2D. We were able to get great results from the converted images, and I think any sector of technology can take advantage of that.”</p><p>Sengupta, Muriki, and Hu co-authored a paper about their work that was accepted to the 2025 International Conference on Robotics and Automation (ICRA) in Atlanta.</p><h4><strong>Measuring Success</strong></h4><p>The pollination robot, built in Kousik’s Safe Robotics Lab, is now in the prototype phase.&nbsp;</p><p>Hu said the robot can do more than pollinate. It can also analyze each flower to determine how well it was pollinated and whether the chances for reproduction are high.</p><p>“It has an additional capability of microscopic inspection,” Hu said. “It’s the first device we know of that provides visual feedback on how well a flower was pollinated.”</p><p>For more information about the robot, visit the&nbsp;<a href="https://saferoboticslab.me.gatech.edu/research/towards-robotic-pollination/"><strong>Safe Robotics Lab project page</strong></a>.</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1771527492</created>  <gmt_created>2026-02-19 18:58:12</gmt_created>  <changed>1774011241</changed>  <gmt_changed>2026-03-20 12:54:01</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A research team spanning GTRI, the College of Engineering, and the College of Computing has developed a robot capable of pollinating flowers in indoor farms.]]></teaser>  <type>news</type>  <sentence><![CDATA[A research team spanning GTRI, the College of Engineering, and the College of Computing has developed a robot capable of pollinating flowers in indoor farms.]]></sentence>  <summary><![CDATA[<p>Manual pollination is one of the biggest challenges for indoor farmers. These farms are often inaccessible to birds, bees, and other natural pollinators, leaving the pollination process to humans. The tedious process must be completed by hand for each flower to ensure the indoor crop flourishes.</p><p>A Georgia Tech research team led by Ai-Ping Hu and Shreyas Kousik is working to solve that. A robot they've developed can efficiently pollinate plants that have both male and female reproductive parts.
These plants only require pollen to be transferred from one part to the other rather than externally from another flower.</p>]]></summary>  <dateline>2026-02-19T00:00:00-05:00</dateline>  <iso_dateline>2026-02-19T00:00:00-05:00</iso_dateline>  <gmt_dateline>2026-02-19 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:ndeen6@gatech.edu">Nathan Deen</a><br>College of Computing<br>Georgia Tech</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679370</item>      </media>  <hg_media>          <item>          <nid>679370</nid>          <type>image</type>          <title><![CDATA[Harsh-Muriki_86A0006.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Harsh-Muriki_86A0006.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/02/19/Harsh-Muriki_86A0006.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/02/19/Harsh-Muriki_86A0006.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/02/19/Harsh-Muriki_86A0006.jpg?itok=WJg8YQi9]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Harsh Muriki]]></image_alt>                    <created>1771527500</created>          <gmt_created>2026-02-19 18:58:20</gmt_created>          <changed>1771527500</changed>          <gmt_changed>2026-02-19 18:58:20</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="187991"><![CDATA[go-robotics]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="11506"><![CDATA[computer vision]]></keyword>          <keyword tid="180840"><![CDATA[computer vision systems]]></keyword>          <keyword tid="669"><![CDATA[agriculture]]></keyword>          <keyword tid="194392"><![CDATA[AI in Agriculture]]></keyword>          <keyword tid="170254"><![CDATA[urban gardening]]></keyword>          <keyword tid="94111"><![CDATA[farming]]></keyword>          <keyword tid="14913"><![CDATA[urban farming]]></keyword>          <keyword tid="23911"><![CDATA[bees]]></keyword>          <keyword tid="6660"><![CDATA[flowers]]></keyword>       
   <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="193653"><![CDATA[Georgia Tech Research Institute]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71911"><![CDATA[Earth and Environment]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="688893">  <title><![CDATA[Sheepdogs Reveal a Better Way to Guide Robot Swarms]]></title>  <uid>27271</uid>  <body><![CDATA[<p>Sheepdogs, bred to control large groups of sheep in open fields, have demonstrated their skills in competitions dating back to the 1870s.</p><p>In these contests, a handler directs a trained dog with whistle signals to guide a small group of sheep across a field and sometimes split the flock cleanly into two groups. But sheep do not always cooperate.</p><p>Researchers at the Georgia Institute of Technology studied how handler–dog teams manage these unpredictable flocks in sheepdog trials and found principles that extend beyond livestock herding.</p><p>In a <a href="https://www.science.org/doi/10.1126/sciadv.adx6791"><strong>study</strong></a> published in <em>Science Advances&nbsp;</em>as the cover feature, the researchers applied those insights to computer simulations showing how similar strategies could improve the control of robot swarms, autonomous vehicles, AI agents, and other networked systems where many machines must coordinate their actions despite uncertain conditions.</p><p><strong>Group Movement Dynamics</strong></p><p>“Birds, bugs, fish, sheep, and many other organisms move in groups because it benefits individuals, including protection from predators,” said <a href="https://bhamla.gatech.edu/"><strong>Saad Bhamla</strong></a>, an associate professor in Georgia Tech’s School of Chemical and Biomolecular Engineering. “The puzzle is that the ‘group’ is not a single organism. It is built from many individuals, each making local, imperfect decisions.”</p><p>When a predator threatens a herd of sheep, individuals near the edge often move toward the center to reduce their own risk, Bhamla explained. “This is ‘selfish herd’ behavior,” he said. “Shepherds exploit that instinct using trained dogs.”</p><p>From examining hours of contest footage, the researchers found that controlling small groups of sheep can be harder than managing large ones. A larger group, with more sheep protected in the center, may behave more coherently than a small group as the animals constantly shift between two instincts: “follow the group” and “flee the dog.”</p><p>“That switching behavior makes the group unpredictable,” said Tuhin Chakrabortty, a former postdoctoral researcher in the Bhamla Lab who co-led the study.</p><p>Looking closely at how dogs and their handlers guide small groups, the researchers found that unpredictability in the flock’s behavior does not always make control harder. “Under the right conditions, that ‘noisy’ behavior might actually be a benefit,” Bhamla said.</p><p><strong>Successful Sheep Herding</strong></p><p>Sheepdog handlers categorize sheep by how strongly they respond to a dog’s threatening pressure. 
Some very responsive sheep might panic under too much pressure, while others might ignore mild pressure and require stronger positioning by the dog.</p><p>The researchers observed that successful control often followed a two-step pattern. First, the dog subtly influenced the sheep’s orientation while the animals were mostly standing still. Once the flock was aligned in the desired direction, the dog increased pressure to trigger movement. The timing of those actions was critical, because alignment within a small group could disappear quickly as individuals switched between instincts.</p><p>“In our simulations, increasing pressure makes the flock reach the desired orientation faster, but how long the flock stays aligned is set mainly by noise,” Chakrabortty said. “In essence, dogs can steer the direction, but they can’t hold that decision indefinitely, so timing matters.”</p><p><strong>Developing Computer Models</strong></p><p>To understand the broader implications of that behavior, the team developed computer models that captured how sheep respond both to the dog and to one another. The models allowed the researchers to test different strategies for guiding groups whose members make independent decisions under uncertainty.</p><p>They then applied those ideas to simulations of robotic swarms. Engineers often design such systems so that each robot blends signals from all nearby robots before deciding how to move. While that approach works well when signals are clear, it can break down when information is noisy or conflicting, Bhamla explained.</p><p>To explain why a switching strategy can work under noisy conditions, the researchers used an analogy of a smoke-filled room where only one person can see the exit, and no one knows who that person is. If everyone polls everyone else and averages the guesses, the one correct signal can get diluted by many noisy ones.</p><p>“That’s the counterintuitive part. When only one person has the right information, averaging can wash out the signal. But if you follow one person at a time, and keep switching who that is, the right information can spread through the crowd,” Bhamla said.</p><p>Building on that idea, the researchers tested a strategy inspired by the switching behavior they observed in sheep. In the simulations, each robot paid attention to just one source at a time (either a guiding signal or a neighboring robot) and switched that source from one step to the next.</p><p>Under noisy conditions, this switching strategy required less effort to keep the group moving along a desired path than either averaging-based strategies or fixed leader-follower strategies.</p><p>The researchers call their approach the Indecisive Swarm Algorithm. The name reflects a counterintuitive insight: allowing influence to shift among individuals over time can make groups easier to guide when conditions are uncertain.</p>
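<p>A toy simulation makes the contrast concrete: under averaging, every agent blends all available signals at each step; under switching, each agent follows exactly one source at a time and re-draws that source every step. The sketch below illustrates only the switching idea and is not the authors’ published model; the agent count, gains, and noise levels are invented.</p><pre><code># Toy illustration of the switching ("indecisive") idea -- not the
# authors' implementation; all parameters here are invented.
import numpy as np

rng = np.random.default_rng(0)
N, STEPS = 20, 400
GOAL = 0.0                          # desired heading (radians)

def wrap(a):
    """Wrap angles into (-pi, pi]."""
    return np.angle(np.exp(1j * a))

def run(switching):
    heading = rng.uniform(-np.pi, np.pi, N)
    for _ in range(STEPS):
        if switching:
            # Indecisive strategy: each agent copies ONE source per step,
            # either the guide signal (src == -1) or a random neighbor.
            src = rng.integers(-1, N, size=N)
            target = np.where(src < 0, GOAL, heading[src])
        else:
            # Averaging strategy: blend the guide with the mean heading.
            target = 0.5 * GOAL + 0.5 * wrap(heading).mean()
        heading = wrap(heading + 0.2 * wrap(target - heading)
                       + rng.normal(0.0, 0.1, N))    # noisy individuals
    return np.abs(wrap(heading - GOAL)).mean()       # mean error vs. goal

print("averaging error:", run(switching=False))
print("switching error:", run(switching=True))
</code></pre>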
<p>“Our findings suggest that the same dynamics that make small animal groups unpredictable may also offer new ways to control complex engineered systems,” Bhamla said.</p><p>CITATION: Tuhin Chakrabortty and Saad Bhamla, “<a href="https://www.science.org/doi/10.1126/sciadv.adx6791"><strong>Controlling noisy herds: Temporal network restructuring improves control of indecisive collectives</strong></a>,” <em>Science Advances</em>, 2026</p><p><em>This research was funded in part by Schmidt Sciences as part of a </em><a href="https://news.gatech.edu/news/2025/09/16/saad-bhamla-named-2025-schmidt-polymath"><em>Schmidt Polymath</em></a><em> grant to Saad Bhamla.</em></p>]]></body>  <author>Brad Dixon</author>  <status>1</status>  <created>1773259186</created>  <gmt_created>2026-03-11 19:59:46</gmt_created>  <changed>1773330805</changed>  <gmt_changed>2026-03-12 15:53:25</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers studying sheepdog trials found new principles for guiding unpredictable groups and used them to develop computer models that could improve coordination in robot swarms, autonomous vehicles, and other networked systems.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers studying sheepdog trials found new principles for guiding unpredictable groups and used them to develop computer models that could improve coordination in robot swarms, autonomous vehicles, and other networked systems.]]></sentence>  <summary><![CDATA[<p>Georgia Tech researchers studying sheepdog trials found new principles for guiding unpredictable groups and used them to develop computer models that could improve coordination in robot swarms, autonomous vehicles, and other networked systems.</p>]]></summary>  <dateline>2026-03-11T00:00:00-04:00</dateline>  <iso_dateline>2026-03-11T00:00:00-04:00</iso_dateline>  <gmt_dateline>2026-03-11 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[braddixon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Brad Dixon, <a href="mailto: braddixon@gatech.edu">braddixon@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>679589</item>          <item>679590</item>          <item>679591</item>          <item>679584</item>          <item>679588</item>      </media>  <hg_media>          <item>          <nid>679589</nid>          <type>video</type>          <title><![CDATA[SMART Dogs herding sheep on a farm, looks like flock of bird pattern]]></title>          <body><![CDATA[<p>SMART Dogs herding sheep on a farm, looks like flock of bird pattern</p>]]></body>                      <youtube_id><![CDATA[_CjwqIX6C2I]]></youtube_id>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>            <vimeo_id><![CDATA[]]></vimeo_id>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>            <video_url><![CDATA[https://youtu.be/_CjwqIX6C2I?si=bfsxIT77-iAJCm-2]]></video_url>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>                    <created>1773260200</created>          <gmt_created>2026-03-11
20:16:40</gmt_created>          <changed>1773260200</changed>          <gmt_changed>2026-03-11 20:16:40</gmt_changed>      </item>          <item>          <nid>679590</nid>          <type>video</type>          <title><![CDATA[A dog herding sheep in a sheepdog trial]]></title>          <body><![CDATA[<p><em>A dog herding sheep in a sheepdog trial</em></p>]]></body>                      <youtube_id><![CDATA[cnPOXfUC8rc]]></youtube_id>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>            <vimeo_id><![CDATA[]]></vimeo_id>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>            <video_url><![CDATA[https://youtu.be/cnPOXfUC8rc?si=41jH8u3UQ_qjgqWn]]></video_url>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>                    <created>1773260676</created>          <gmt_created>2026-03-11 20:24:36</gmt_created>          <changed>1773260676</changed>          <gmt_changed>2026-03-11 20:24:36</gmt_changed>      </item>          <item>          <nid>679591</nid>          <type>video</type>          <title><![CDATA[Controlling 'Noisy' Sheep Herds]]></title>          <body><![CDATA[<p>Controlling 'noisy' sheep herds</p>]]></body>                      <youtube_id><![CDATA[EMHmDPpe8HE]]></youtube_id>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>            <vimeo_id><![CDATA[]]></vimeo_id>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>            <video_url><![CDATA[https://youtu.be/EMHmDPpe8HE?si=_5DFsk_BafsIK78R]]></video_url>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>                    <created>1773260974</created>          <gmt_created>2026-03-11 20:29:34</gmt_created>          <changed>1773260974</changed>          <gmt_changed>2026-03-11 20:29:34</gmt_changed>      </item>          <item>          <nid>679584</nid>          <type>image</type>          <title><![CDATA[Sheepdog herding sheep]]></title>          <body><![CDATA[<p>Sheepdog herding in a sheepdog trial competition</p>]]></body>                      <image_name><![CDATA[sheepdog1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/11/sheepdog1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/11/sheepdog1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/11/sheepdog1.jpg?itok=kTQiLGXI]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sheepdog herding sheep]]></image_alt>                    <created>1773259589</created>          <gmt_created>2026-03-11 20:06:29</gmt_created>          <changed>1773261394</changed>          <gmt_changed>2026-03-11 20:36:34</gmt_changed>      </item>          <item>          <nid>679588</nid>          <type>image</type>          <title><![CDATA[Sheepdog herding resistant sheep]]></title>          <body><![CDATA[<p>Sheepdogs first align the flock’s direction, then apply pressure to trigger movement before the sheep lose alignment.</p>]]></body>                      <image_name><![CDATA[sheepdog2-copy.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2026/03/11/sheepdog2-copy.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2026/03/11/sheepdog2-copy.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2026/03/11/sheepdog2-copy.jpg?itok=5CXyEB8U]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sheepdog herding sheep]]></image_alt>                    <created>1773259967</created>          <gmt_created>2026-03-11 20:12:47</gmt_created>          <changed>1773261607</changed>          <gmt_changed>2026-03-11 20:40:07</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="1240"><![CDATA[School of Chemical and Biomolecular Engineering]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="194958"><![CDATA[Sheepdogs]]></keyword>          <keyword tid="194959"><![CDATA[Herding]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="686422">  <title><![CDATA[Ph.D. Student’s Framework Used to Bolster Nvidia’s Cosmos Predict-2 Model]]></title>  <uid>36530</uid>  <body><![CDATA[<p>A new deep learning architectural framework could boost the development and deployment efficiency of autonomous vehicles and humanoid robots. The framework will lower training costs and reduce the amount of real-world data needed for training.</p><p>World foundation models (WFMs) enable physical AI systems to learn and operate within&nbsp;synthetic worlds created by generative artificial intelligence (genAI). For example, these models use predictive capabilities to generate up to 30 seconds of video that accurately reflects the real world.</p><p>The new framework, developed by a Georgia Tech researcher, enhances the processing speed of the neural networks that simulate these real-world environments from text, images, or video inputs.</p><p>The neural networks that make up the architectures of large language models like ChatGPT and visual models like Sora process contextual information using the “attention mechanism.”</p><p>Attention refers to a model’s ability to focus on the most relevant parts of input.</p><p>The Neighborhood Attention Extension (NATTEN) restricts attention to a small neighborhood around each position, allowing models that require GPUs or high-performance computing systems to process information and generate outputs more efficiently.</p><p>Processing speeds can increase by up to 2.6 times, said <a href="https://alihassanijr.com/"><strong>Ali Hassani</strong></a>, a Ph.D. student in the School of Interactive Computing and the creator of NATTEN. Hassani is advised by Associate Professor <a href="https://www.humphreyshi.com/"><strong>Humphrey Shi</strong></a>.</p>
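<p>The concept behind neighborhood attention can be sketched in a few lines: each query position attends only to a small window of nearby keys instead of the whole input, so the per-query cost grows with the window size rather than the full input. The code below is a conceptual illustration only, not the NATTEN library’s API; the window size and tensor shapes are arbitrary.</p><pre><code># Minimal sketch of the neighborhood-attention idea: each pixel's query
# attends only to a k x k window of nearby keys, not the whole image.
# Conceptual illustration only -- this is NOT the NATTEN library's API.
import numpy as np

def neighborhood_attention(q, k, v, window=3):
    """q, k, v: (H, W, D) feature maps; returns an (H, W, D) output."""
    H, W, D = q.shape
    r = window // 2
    out = np.zeros_like(q)
    for i in range(H):
        for j in range(W):
            # Clamp the attention window at the feature-map borders.
            i0, i1 = max(0, i - r), min(H, i + r + 1)
            j0, j1 = max(0, j - r), min(W, j + r + 1)
            keys = k[i0:i1, j0:j1].reshape(-1, D)   # local keys only
            vals = v[i0:i1, j0:j1].reshape(-1, D)
            logits = keys @ q[i, j] / np.sqrt(D)    # scaled dot product
            w = np.exp(logits - logits.max())
            w /= w.sum()
            out[i, j] = w @ vals
    return out

x = np.random.default_rng(1).normal(size=(8, 8, 16))
y = neighborhood_attention(x, x, x, window=3)
# Per-query cost scales with window**2 instead of H*W, which is where
# the savings come from on video-scale inputs.
print(y.shape)   # (8, 8, 16)
</code></pre>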
<p>Hassani is also a research scientist at Nvidia, where he introduced NATTEN to <a href="https://www.nvidia.com/en-us/ai/cosmos/"><strong>Cosmos</strong></a> — a family of WFMs the company uses to train robots, autonomous vehicles, and other physical AI applications.</p><p>“You can map just about anything from a prompt or an image or any combination of frames from an existing video to predict future videos,” Hassani said. “Instead of generating words with an LLM, you’re generating a world.</p><p>“Unlike LLMs that generate a single token at a time, these models are compute-heavy. They generate many images — often hundreds of frames at a time — so the models put a lot of work on the GPU. NATTEN lets us decrease some of that work and proportionately accelerate the model.”</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1763068438</created>  <gmt_created>2025-11-13 21:13:58</gmt_created>  <changed>1763068498</changed>  <gmt_changed>2025-11-13 21:14:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A new deep learning architectural framework, Neighborhood Attention Extension (NATTEN), is being used by Nvidia to increase the processing speed of its Cosmos Predict-2 model for training autonomous vehicles and humanoid robots.]]></teaser>  <type>news</type>  <sentence><![CDATA[A new deep learning architectural framework, Neighborhood Attention Extension (NATTEN), is being used by Nvidia to increase the processing speed of its Cosmos Predict-2 model for training autonomous vehicles and humanoid robots.]]></sentence>  <summary><![CDATA[<p>Georgia Tech Ph.D. student Ali Hassani developed the Neighborhood Attention Extension (NATTEN), a deep learning architectural framework that is being integrated into Nvidia's Cosmos Predict-2 world foundation model.
NATTEN enhances the processing speed of neural networks that simulate real-world environments for physical AI systems, which are used to train autonomous vehicles and humanoid robots.&nbsp;</p>]]></summary>  <dateline>2025-11-03T00:00:00-05:00</dateline>  <iso_dateline>2025-11-03T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-11-03 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678621</item>      </media>  <hg_media>          <item>          <nid>678621</nid>          <type>image</type>          <title><![CDATA[2X6A3487.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[2X6A3487.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/11/13/2X6A3487.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/11/13/2X6A3487.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/11/13/2X6A3487.jpg?itok=TTWF4N4h]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Humprhey Shi and Ali Hassani]]></image_alt>                    <created>1763068473</created>          <gmt_created>2025-11-13 21:14:33</gmt_created>          <changed>1763068473</changed>          <gmt_changed>2025-11-13 21:14:33</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="194609"><![CDATA[Industry]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="194609"><![CDATA[Industry]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="193860"><![CDATA[Artifical Intelligence]]></keyword>          <keyword tid="194701"><![CDATA[go-resarchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="14549"><![CDATA[nvidia]]></keyword>          <keyword tid="191138"><![CDATA[artificial neural networks]]></keyword>          <keyword tid="97281"><![CDATA[autonomous vehicles]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="684058">  <title><![CDATA[Tiny Fans on the Feet of Water Bugs Could Lead to Energy Efficient, Mini Robots]]></title>  <uid>27560</uid>  <body><![CDATA[<div><div><div><div><div><p>A new study explains how tiny water bugs use fan-like propellers to zip across streams at speeds up to 120 body lengths per second. 
The researchers then created a similar fan structure and used it to propel and maneuver an insect-sized robot.</p><p>The discovery offers new possibilities for designing small machines that could operate during floods or other challenging situations.</p></div></div></div></div></div><div><div><div><div><div><p>Instead of relying on their muscles, the insects, about the size of a grain of rice, use the water’s surface tension and elastic forces to morph the ribbon-shaped fans on the ends of their legs, slicing the water surface to change direction.&nbsp;<br><br>Once they understood the mechanism, the team built a self-deployable, one-milligram fan and installed it into an insect-sized robot capable of accelerating, braking, and maneuvering right and left.</p><p>The study is featured on the cover of the journal <em>Science</em>.&nbsp;<br><br><a href="https://coe.gatech.edu/news/2025/08/tiny-fans-feet-water-bugs-could-lead-energy-efficient-mini-robots">Read the entire story and see the robot in action on the College of Engineering website.&nbsp;</a></p></div></div></div></div></div>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1755807115</created>  <gmt_created>2025-08-21 20:11:55</gmt_created>  <changed>1761333189</changed>  <gmt_changed>2025-10-24 19:13:09</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A new study explains how tiny water bugs use fan-like propellers to zip across streams at speeds up to 120 body lengths per second]]></teaser>  <type>news</type>  <sentence><![CDATA[A new study explains how tiny water bugs use fan-like propellers to zip across streams at speeds up to 120 body lengths per second]]></sentence>  <summary><![CDATA[<p>A new study explains how tiny water bugs use fan-like propellers to zip across streams at speeds up to 120 body lengths per second. The researchers then created a similar fan structure and used it to propel and maneuver an insect-sized robot.</p><p>The discovery offers new possibilities for designing small machines that could operate during floods or other challenging situations.</p>]]></summary>  <dateline>2025-08-21T00:00:00-04:00</dateline>  <iso_dateline>2025-08-21T00:00:00-04:00</iso_dateline>  <gmt_dateline>2025-08-21 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Researchers built an insect-sized robot that uses the water’s surface tension and collapsible propellers, an approach that could improve fast-moving machines that operate in rivers or flooded areas.
]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br>College of Engineering<br>maderer@gatech.edu</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>677766</item>      </media>  <hg_media>          <item>          <nid>677766</nid>          <type>image</type>          <title><![CDATA[water-bug-hero.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[water-bug-hero.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/08/21/water-bug-hero.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/08/21/water-bug-hero.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/08/21/water-bug-hero.jpg?itok=ngJx7mnm]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[a water bug standing on water]]></image_alt>                    <created>1755807401</created>          <gmt_created>2025-08-21 20:16:41</gmt_created>          <changed>1755807401</changed>          <gmt_changed>2025-08-21 20:16:41</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>          <group id="1292"><![CDATA[Parker H. Petit Institute for Bioengineering and Bioscience (IBB)]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="194701"><![CDATA[go-resarchnews]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="187423"><![CDATA[go-bio]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="685798">  <title><![CDATA[This Eighth Grader Is Shaping the Future of Wearable Robotics]]></title>  <uid>27255</uid>  <body><![CDATA[<p>Case Neel, 13, is a busy kid who loves coding and robotics, captains his school’s quiz bowl team, and lives with his family on a farm northwest of Atlanta.</p><p>He also has cerebral palsy — and for the past four years, he has played a key role in improving one of the most exciting medical devices at Georgia Tech.</p><p>“My role here is as a participant in exoskeleton research studies,” Case explained. 
“When I come in, researchers hook me up to sensors that monitor my gait when I’m walking in the device, and then they get a whole lot of data based off that.”</p><p><a href="https://research.gatech.edu/node/44098"><strong>Read more »</strong></a></p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1760728621</created>  <gmt_created>2025-10-17 19:17:01</gmt_created>  <changed>1760728796</changed>  <gmt_changed>2025-10-17 19:19:56</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Like many people with cerebral palsy, Case walks with impaired knee movement. Georgia Tech’s pediatric knee exoskeleton is designed to help children and adolescents walk with increased stability and mobility. Case’s data enables the researchers to analyze]]></teaser>  <type>news</type>  <sentence><![CDATA[Like many people with cerebral palsy, Case walks with impaired knee movement. Georgia Tech’s pediatric knee exoskeleton is designed to help children and adolescents walk with increased stability and mobility. Case’s data enables the researchers to analyze]]></sentence>  <summary><![CDATA[<p>How a middle schooler with cerebral palsy became a vital contributor to Georgia Tech’s cutting-edge robotic exoskeleton research — offering data, feedback, and a passion for innovation.</p>]]></summary>  <dateline>2025-10-13T00:00:00-04:00</dateline>  <iso_dateline>2025-10-13T00:00:00-04:00</iso_dateline>  <gmt_dateline>2025-10-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[How a middle schooler with cerebral palsy became a vital contributor to Georgia Tech’s cutting-edge robotic exoskeleton research — offering data, feedback, and a passion for innovation.]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678385</item>      </media>  <hg_media>          <item>          <nid>678385</nid>          <type>image</type>          <title><![CDATA[26-R10410-P29-006_EDITED.jpg]]></title>          <body><![CDATA[<p>Kinsey Herrin, principal research scientist in the George W. 
Woodruff School of Mechanical Engineering, leads exoskeleton and prosthetic studies and fosters meaningful connections with the participant community.</p>]]></body>                      <image_name><![CDATA[26-R10410-P29-006_EDITED.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/10/17/26-R10410-P29-006_EDITED.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/10/17/26-R10410-P29-006_EDITED.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/10/17/26-R10410-P29-006_EDITED.jpg?itok=oG4qdRcH]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Person wearing a floral-patterned shirt interacting with a group of people indoors; one individual is dressed in a bright yellow button-up shirt.]]></image_alt>                    <created>1760728650</created>          <gmt_created>2025-10-17 19:17:30</gmt_created>          <changed>1760728650</changed>          <gmt_changed>2025-10-17 19:17:30</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="66220"><![CDATA[Neuro]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="193656"><![CDATA[Neuro Next Initiative]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="685070">  <title><![CDATA[The Robotic Breakthrough That Could Help Stroke Survivors Reclaim Their Stride]]></title>  <uid>36410</uid>  <body><![CDATA[<p>Crossing a room shouldn’t feel like a marathon. But for many stroke survivors, even the smallest number of steps carries enormous weight. Each movement becomes a reminder of lost coordination, muscle weakness, and physical vulnerability.</p><p>A team of Georgia Tech researchers wanted to ease that struggle, and robotic exoskeletons offered a promising path. Their findings point to a simple but powerful shift: exoskeletons that adapt to people, rather than forcing people to adapt to the machine. Using artificial intelligence (AI) to learn the rhythm of patients’ strides in real time, the team showed how these devices can reduce strain and increase efficiency. They also demonstrated how the technology can help restore confidence for stroke survivors.&nbsp;<br><br><strong>The Robot Finds the Rhythm</strong></p><p>A robotic exoskeleton is a wearable device that helps people move with mechanical support. Traditional exoskeletons require endless manual adjustments — turning knobs, calibrating settings, and tweaking controls.&nbsp;</p><p>“It can be frustrating, even nearly impossible, to get it right for each person,” said <a href="https://www.me.gatech.edu/faculty/young">Aaron Young</a>, associate professor in the <a href="https://www.me.gatech.edu/">George W. Woodruff School of Mechanical Engineering.</a> “With AI, the exoskeleton figures out the mapping itself. 
It learns the timing of someone’s gait through a neural network, without an engineer needing to hand-tune everything.”</p><p>The software monitors each step, instantly updates, and fine-tunes the support it provides. Over time, the exoskeleton aligns its movements with the unique gait of the person wearing it. In this study, the research team used a hip exoskeleton, which provides torque at the hip joint — in other words, adding power to help stroke survivors walk or move their legs more easily.</p><p><strong>Taking Smarter Steps</strong></p><p>Walking after a stroke can be tough and unpredictable. A patient’s stride can change from one day to the next, and even from one step to the next. Most exoskeletons aren’t built for that kind of variation. They are designed around the steady, even gait of healthy young adults, which can leave stroke survivors feeling more unsteady than supported.</p><p>Young’s breakthrough, detailed in <a href="https://ieeexplore.ieee.org/abstract/document/11112638"><em>IEEE Transactions on Robotics</em></a>, is a neural network — a type of AI that learns patterns much like the human brain does. Sensors at the hip pick up how someone is moving, and the network translates those signals into just the right boost of power to support each step. It quickly figures out a person’s unique walking pattern. But lead clinician Kinsey Herrin said the AI’s learning doesn’t stop there. It keeps adjusting as the patient walks, so the exoskeleton can stay in sync even during stride shifts.</p><p>“The speed really surprised us,” Young said. “In just one to two minutes of walking, the system had already learned a person’s gait pattern with high accuracy. That’s a big deal, to adapt that quickly and then keep adapting as they move.”</p><p>Tests showed the system was far more accurate than a standard exoskeleton. It reduced errors in tracking stroke patients’ walking patterns by 70%.</p><p>Young emphasized that this research is about more than metrics. “When you see someone able to walk farther without becoming exhausted, that’s when you realize this isn’t just about robotics — it’s about giving people back a measure of independence,” he said.</p><p><strong>Adapting Anywhere</strong></p><p>Every exoskeleton comes with its own set of sensors, so the data they collect can look completely different from one device to the next. A neural network trained on one machine often stumbles when it’s moved to another. To get around that, Young’s team designed software that works like a universal adapter plug — no matter what device it’s connected to, it converts the signals into a form the AI can use. After just 10 strides of calibration, the system cut error rates by more than 75%.</p><p>“The goal is that someone could strap on a device, and, within a minute, it feels like it was built just for them,” Young said.</p><p><strong>A Step Toward the Future</strong></p><p>While the study centered on stroke survivors, the implications are far broader. The same adaptive approach could support older adults coping with age-related muscle weakness, people with conditions like Parkinson’s or osteoarthritis, or even children with neurological disabilities.&nbsp;</p><p>Young and his team are now running clinical trials to measure how well the AI-powered exoskeleton supports people in a wide range of everyday activities.</p><p>“There’s no such thing as an ‘average’ user,” Young said. “The real challenge is designing technology that can adapt to the full spectrum of human mobility.”</p>
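<p>The control idea (hip sensor signals feed a small neural network that estimates where the wearer is in the gait cycle, and the estimated phase sets the assistive torque) can be sketched roughly as follows. This is a hedged illustration, not the team’s published model; the architecture, input channels, and torque profile are placeholders.</p><pre><code># Rough sketch of phase-based hip assistance: a small network maps a
# window of hip sensor readings to gait phase, and phase sets torque.
# Illustrative only -- architecture and numbers are NOT from the paper.
import torch
import torch.nn as nn

class GaitPhaseNet(nn.Module):
    def __init__(self, n_channels=4, window=50):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_channels * window, 64), nn.ReLU(),
            nn.Linear(64, 2),          # predict (sin, cos) of gait phase
        )

    def forward(self, x):              # x: (batch, channels, window)
        sc = self.net(x)
        # atan2 of (sin, cos) gives a phase angle that wraps cleanly,
        # avoiding the 100% -> 0% discontinuity of a raw percentage.
        return torch.atan2(sc[:, 0], sc[:, 1])

def assist_torque(phase, peak=10.0):
    """Map estimated phase to hip torque (N*m); placeholder profile."""
    return peak * torch.sin(phase)

model = GaitPhaseNet()
sensors = torch.randn(1, 4, 50)        # fake window of hip sensor data
phase = model(sensors)
print(assist_torque(phase))
</code></pre>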
<p>If Georgia Tech’s exoskeleton can rise to that challenge, the promise goes well beyond the lab. It could mean a world where technology doesn’t just help people walk — it learns to walk with them.</p><p>Inseung Kang, who holds a B.S., M.S., and Ph.D. from Georgia Tech, is the paper’s lead author and now an assistant professor of mechanical engineering at Carnegie Mellon University. He explained that the real promise is in what comes next.&nbsp;<br><br>“We’ve developed a system that can adjust to a person’s walking style in just minutes. But the potential is even greater. Imagine an exoskeleton that keeps learning with you over your lifetime, adjusting as your body and mobility change. Think of it as a robot companion that understands how you walk and gives you the right assistance every step of the way.”</p><p><em>Aaron Young is affiliated with Georgia Tech’s&nbsp;</em><a href="https://research.gatech.edu/robotics"><em>Institute for Robotics and Intelligent Machines</em></a>.</p><p><em>This research was primarily funded by a grant (DP2HD111709-01)&nbsp;from the National Institutes of Health New Innovator Award Program.</em></p>]]></body>  <author>mazriel3</author>  <status>1</status>  <created>1758209214</created>  <gmt_created>2025-09-18 15:26:54</gmt_created>  <changed>1758726539</changed>  <gmt_changed>2025-09-24 15:08:59</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech's AI-fueled exoskeleton adapts to every step, helping patients relearn to walk with less effort and more confidence.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech's AI-fueled exoskeleton adapts to every step, helping patients relearn to walk with less effort and more confidence.]]></sentence>  <summary><![CDATA[<p>Georgia Tech researchers have developed an AI-powered hip exoskeleton that adapts in real time to a stroke survivor’s changing gait, reducing errors by 70% and helping patients walk with greater ease and confidence. Unlike traditional devices that require constant manual tuning, the system learns each person’s unique stride within minutes and continues adjusting as they move. The breakthrough could extend beyond stroke recovery, offering personalized mobility support for people of all ages and conditions.</p>]]></summary>  <dateline>2025-09-18T00:00:00-04:00</dateline>  <iso_dateline>2025-09-18T00:00:00-04:00</iso_dateline>  <gmt_dateline>2025-09-18 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[mazriel3@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Michelle Azriel, Sr.
Writer - Editor</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678071</item>      </media>  <hg_media>          <item>          <nid>678071</nid>          <type>video</type>          <title><![CDATA[The Robotic Breakthrough That Could Help Stroke Survivors Reclaim Their Stride]]></title>          <body><![CDATA[<p>Georgia Tech's AI-fueled exoskeleton adapts to every step, helping patients relearn to walk with less effort and more confidence. Traditional robotic exoskeleton models require extensive manual calibration, but Aaron Young, associate professor in the George W. Woodruff School of Mechanical Engineering, and his team developed AI-driven software that automatically adapts to each user’s gait. By using a neural network, the system continuously monitors and adjusts support with each step, gradually syncing with the wearer’s unique movement. In this study, the team used a hip exoskeleton that delivers torque at the hip joint to help stroke survivors walk more easily.</p>]]></body>                      <youtube_id><![CDATA[RPHz2mU9sBA]]></youtube_id>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>            <vimeo_id><![CDATA[]]></vimeo_id>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>            <video_url><![CDATA[https://youtu.be/RPHz2mU9sBA]]></video_url>            <video_width><![CDATA[]]></video_width>            <video_height><![CDATA[]]></video_height>                    <created>1758208325</created>          <gmt_created>2025-09-18 15:12:05</gmt_created>          <changed>1758208325</changed>          <gmt_changed>2025-09-18 15:12:05</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="66220"><![CDATA[Neuro]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="194701"><![CDATA[go-resarchnews]]></keyword>          <keyword tid="13169"><![CDATA[autonomous robots]]></keyword>          <keyword tid="98751"><![CDATA[College of Engineering; George W.
Woodruff School of Mechanical Engineering]]></keyword>          <keyword tid="172970"><![CDATA[go-neuro]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="685002">  <title><![CDATA[Two IC Faculty Receive NSF CAREER for Robotics and AR/VR Initiatives]]></title>  <uid>36530</uid>  <body><![CDATA[<p>Practice may not make perfect for robots, but new machine learning models from Georgia Tech are allowing them to improve their skillsets to more effectively assist humans in the real world.&nbsp;</p><p><a href="https://faculty.cc.gatech.edu/~danfei/"><strong>Danfei Xu</strong></a>, an assistant professor in <a href="https://ic.gatech.edu/"><strong>Georgia Tech’s School of Interactive Computing</strong></a>, is introducing new models that provide robots with “on-the-job” training.</p><p>The National Science Foundation (NSF) awarded Xu its CAREER Award, which is given to early-career faculty. The award will enable Xu to expand his research and refine his models, which could accelerate the process of robot deployment and relieve manufacturers of the burden of achieving perfection.</p><p>“The main problem we’re trying to tackle is how to allow robots to learn on the job,” Xu said. “How should it self-improve based on the performance or the new requirements or new user preferences in each home or working environment? You cannot expect a robot manufacturer to program all of that.</p><p>“The challenging thing about robotics is that the robot must get feedback from the physical environment. It must try to solve a problem to understand the limits of its abilities so it can decide how to improve its own performance.”</p><p>As with humans, Xu views practice as the most effective way for a robot to improve a skill. His models train the robot to identify the point at which it failed in its task performance.</p><p>“It identifies that skill and sets up an environment where it can practice,” he said. “If it needs to improve opening a drawer, it will navigate itself to the drawer and practice opening it.”</p><p>The models allow the robot to split tasks into smaller parts and evaluate its own skill level using reward functions. Cooking dinner, for example, can be divided into steps like turning on the stove and opening the fridge, which are necessary to achieve the overall goal.</p><p>“Planning is a complex problem because you must predict what’s going to happen in the physical world,” Xu said. “We use machine learning techniques that our group has developed over the past two years, using generative models to generate positive futures. They’re very good at modeling long-horizon phenomena.</p><p>“The robot knows when it’s failed because there’s a value that tells it how well it performed the task and whether it received its reward. While we don’t know how to tell the robot why it failed, we have ways for it to improve its skills based on that measurement.”&nbsp;</p>
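<p>A toy version of that loop (decompose a task into sub-skills, score each with a reward signal, and schedule practice on the weakest one) might look like the sketch below. It is illustrative only, not Xu’s models; the skill names, rewards, and threshold are invented.</p><pre><code># Toy "on-the-job practice" loop: score sub-skills by observed reward
# and rehearse the weakest one. Illustrative only -- not Xu's models;
# skill names, rewards, and the threshold are invented for the sketch.
import random

SUBSKILLS = ["navigate_to_drawer", "grasp_handle", "open_drawer"]

def attempt(skill, proficiency):
    """Simulate one attempt; reward 1.0 on success, 0.0 on failure."""
    return 1.0 if random.random() < proficiency[skill] else 0.0

def practice_loop(rounds=200, threshold=0.9):
    proficiency = {s: 0.5 for s in SUBSKILLS}   # hidden true skill level
    estimate = {s: 0.0 for s in SUBSKILLS}      # robot's running estimate
    for _ in range(rounds):
        # Practice whichever sub-skill currently scores worst.
        weakest = min(estimate, key=estimate.get)
        reward = attempt(weakest, proficiency)
        estimate[weakest] += 0.1 * (reward - estimate[weakest])   # EMA
        proficiency[weakest] = min(1.0, proficiency[weakest] + 0.002)
        if min(estimate.values()) >= threshold:
            break                               # all sub-skills adequate
    return estimate

print(practice_loop())
</code></pre>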
<p>One of the biggest barriers that keeps many robots from being made available for public use is the pressure on manufacturers to make the robot as close to perfect as possible at deployment. Xu said it’s more practical to accept that robots will have learning gaps that need to be filled and to implement more efficient real-world learning models.</p><p>“We work under the pressure of getting everything correct before deployment,” he said. “We need to meet the basic safety requirements, but in terms of competence, it is difficult to get that perfect at deployment. This takes some of the pressure off because it will be able to self-adapt.”</p><h4><strong>Virtual Workspace for Data Workers</strong></h4><p><a href="https://ivi.cc.gatech.edu/people.html"><strong>Yalong Yang</strong></a>, another assistant professor in the School of IC, also received the NSF CAREER Award for a research proposal that will design augmented and virtual reality (AR/VR) workspaces for data workers.&nbsp;</p><p>“In 10 years, I envision everyone will use AR/VR in their office, and it will replace their laptop or their monitor,” Yang said.</p><p>Yang said he is also working with Google on the project and using Google Gemini to bring conventional applications to immersive space, with data tools being the most complicated systems to redesign for immersive environments.</p><p>The immersive workspace and interface will also enable teams of data workers to collaborate and share their data in real time.</p><p>“I want to support the end-to-end process,” Yang said. “We have visualization tools for data, but it’s not enough. Data science is a pipeline — from collecting data to processing, visualizing, modeling, and then communicating. If you only support one, people will need to switch to other platforms for the other steps.”</p><p>Yang also noted that prior research has shown that VR can enhance cognitive abilities, such as memory and attention, and support multitasking. The results of his project could help maximize worker efficiency without causing strain.</p><p>“We all have a cognitive limit in our working memory. Using AR/VR can increase those limits and process more information. We can expand people’s spatial ability to help them build a better mental model of the data presented to them.”</p><p>Yang was also recently named a <a href="https://www.cc.gatech.edu/news/tiktok-photoshop-generative-ai-could-bring-millions-apps-3d-reality"><strong>2025 Google Research Scholar</strong></a> as he seeks to build a new artificial intelligence (AI) tool that converts mobile apps into 3D immersive environments.</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1758133463</created>  <gmt_created>2025-09-17 18:24:23</gmt_created>  <changed>1758133731</changed>  <gmt_changed>2025-09-17 18:28:51</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Two Georgia Tech professors, Danfei Xu and Yalong Yang, have received the prestigious NSF CAREER award for their research in robotics, which focuses on teaching robots to self-improve, and in augmented and virtual reality (AR/VR), which aims to create imm]]></teaser>  <type>news</type>  <sentence><![CDATA[Two Georgia Tech professors, Danfei Xu and Yalong Yang, have received the prestigious NSF CAREER award for their research in robotics, which focuses on teaching robots to self-improve, and in augmented and virtual reality (AR/VR), which aims to create imm]]></sentence>  <summary><![CDATA[<p>Two assistant professors in Georgia Tech’s School of Interactive Computing — Danfei Xu and Yalong Yang — have each won NSF CAREER Awards for their respective research in robotics and AR/VR initiatives.
Xu’s work will develop machine learning models that let robots learn “on the job,” adapting from feedback and failure in real-world environments rather than being perfectly preprogrammed. Yang’s project aims to build immersive AR/VR workspaces to support data workers across the full data pipeline, including a collaboration with Google to bring conventional apps into immersive environments.</p>]]></summary>  <dateline>2025-09-17T00:00:00-04:00</dateline>  <iso_dateline>2025-09-17T00:00:00-04:00</iso_dateline>  <gmt_dateline>2025-09-17 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>678055</item>      </media>  <hg_media>          <item>          <nid>678055</nid>          <type>image</type>          <title><![CDATA[ICRA-2025_86A9079-Enhanced-NR.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ICRA-2025_86A9079-Enhanced-NR.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/09/17/ICRA-2025_86A9079-Enhanced-NR.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/09/17/ICRA-2025_86A9079-Enhanced-NR.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/09/17/ICRA-2025_86A9079-Enhanced-NR.jpg?itok=Wz_zxhQx]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Danfei Xu]]></image_alt>                    <created>1758133475</created>          <gmt_created>2025-09-17 18:24:35</gmt_created>          <changed>1758133475</changed>          <gmt_changed>2025-09-17 18:24:35</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="191934"><![CDATA[National Science Foundation (NSF)]]></keyword>          <keyword tid="7842"><![CDATA[NSF CAREER Award]]></keyword>          <keyword tid="188776"><![CDATA[go-research]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="145251"><![CDATA[virtual reality]]></keyword>          <keyword tid="1597"><![CDATA[Augmented Reality]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="684700">  <title><![CDATA[Georgia Tech Team Designing Robot Guide Dog to Assist the Visually Impaired]]></title>  <uid>32045</uid>  <body><![CDATA[<p>People who are visually impaired and cannot afford or 
care for service animals might have a practical alternative in a robotic guide dog being developed at Georgia Tech.</p><p>Before launching its prototype, a research team within Georgia Tech’s School of Interactive Computing, led by Professor <strong>Bruce Walker</strong> and Assistant Professor <strong>Sehoon Ha</strong>, is working to improve its methods and designs based on research within blind and visually impaired (BVI) communities.</p><p>“There’s been research on the technical aspects and functionality of robotic guide dogs, but not a lot of emphasis on the aesthetics or form factors,” said <strong>Avery</strong> <strong>Gong</strong>, a recent master’s graduate who worked in Walker’s lab. “We wanted to fill this gap.”</p><p>Training a guide dog can cost up to $50,000, and while there are nonprofit organizations that can cover these costs for potential owners, there is still a gap between the number of available guide dogs and the BVI individuals who need them. Not all BVI individuals are able to feed and care for a dog. The dog also has fewer than 10 working years before it needs to be replaced.</p><p>Gong co-authored a paper on the design implications of the robotic guide dog that was presented at the 2025 International Conference on Robotics and Automation (ICRA) in Atlanta in May.</p><p>The consensus among the study’s participants indicates they prefer a robotic guide dog that:</p><ul><li>resembles a real dog and appears approachable</li><li>has a clear identifier of being a guide dog, such as a vest</li><li>has built-in GPS and Bluetooth connectivity</li><li>has control options such as voice command</li><li>has soft textures without feeling furry</li><li>has long battery life and self-charging capability</li></ul><p>“A lot of people said they didn’t want the dog to look too cute or appealing because it would draw too much attention,” said <strong>Aviv Cohav</strong>, another lead author of the paper and recent master’s graduate.</p><p>“Many people have issues with taking their guide dog to places, whether it’s little kids wanting to play with the dog or people not liking dogs or people being scared of them, and that reflects on the owners themselves. We wanted to look at what would be a good balance between having a functional robot that wouldn’t scare people away or be a distraction.”</p><p>The researchers also had to consider the perspectives of sighted individuals and how society at large might view a robotic guide dog.</p><p>An example of this is the amount of noise the dog makes while walking. The owner needs to hear the dog is active, but the clanky sound many off-the-shelf robots make could create disturbances in indoor spaces that amplify sounds. To offset the noise, the team developed algorithms that allow the robot to move more quietly.</p><p>Walker and his lab have examined similar scenarios that must take public perception into account.</p><p>“We like to think of Georgia Tech as going the extra mile,” Walker said. “Let’s not just make a robot, but a robot that’s going to fit into society.</p><p>“To have impact, the technologies we produce must be produced with society in mind. This is a holistic design that considers the users and all the people with whom the users interact.”</p><p><strong>Taery Kim</strong>, a computer science Ph.D. student, began working on the concept of a robotic guide dog when she came to Georgia Tech in 2022. She and Ha, her advisor, have authored papers on building the robot’s navigation and safety components.&nbsp;</p><p>“When I started, I thought it would be as simple as giving the guide dog a command to take me to Starbucks or the grocery store, and it would just take me,” Kim said. “But the user must give waypoint directions — ‘go left here,’ ‘turn right,’ ‘go forward,’ ‘stop.’ Detailed commands must be delivered to the dog.”</p><p>While a real dog has naturally enhanced senses of hearing and smell that can’t be replicated, technology can provide interconnected safety features during an emergency. The researchers envision a camera system equipped with a 360-degree field of view, computer vision algorithms that detect obstacles or hazards, and voice recognition that detects calls for help. An SOS function could automatically call 911 at the owner’s request or if the owner is unresponsive.</p><p>Kim said the robot should also have explainability features to enhance communication with the owner. For example, if the robot suddenly stops or ignores an owner’s commands, it should tell the owner that it’s detecting a hazard in their path.</p>
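<p>The waypoint interface and hazard feedback Kim describes can be pictured with a small toy example. The command set comes from her description above; everything else (function names, message strings) is hypothetical:</p><pre><code>VALID_COMMANDS = {"go left", "turn right", "go forward", "stop"}

def handle_command(command, hazard_detected):
    """Follow a waypoint command, or explain why the robot won't move."""
    if command not in VALID_COMMANDS:
        return "Unrecognized command."
    if hazard_detected:
        # Explainability: say why the command is being ignored,
        # rather than silently stopping.
        return "Stopping: I am detecting a hazard in your path."
    return f"Executing: {command}"

print(handle_command("go forward", hazard_detected=True))</code></pre>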
<p>Manufacturing a robot at scale would initially be expensive, but the researchers believe the cost would eventually be offset because of its longevity. BVI individuals may only need to purchase one during their lifetime.</p><p>To introduce a prototype, the multidisciplinary research team recognizes that it needs to enlist experts from other fields to adequately address the various implications and research gaps inherent in the project.</p><p>Walker said the teams welcome additional partners who are keen to tackle challenges ranging from design and engineering to battery life to human-robot interaction.</p><p>Team member <strong>J. Taery Kim</strong> was supported by the National Science Foundation's Graduate Research Fellowship Program (NSF GRFP) under Grant No. DGE-2039655.</p>]]></body>  <author>Ben Snedeker</author>  <status>1</status>  <created>1757509079</created>  <gmt_created>2025-09-10 12:57:59</gmt_created>  <changed>1758127447</changed>  <gmt_changed>2025-09-17 16:44:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers rely on feedback from blind and visually impaired (BVI) communities to create service animal prototype.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers rely on feedback from blind and visually impaired (BVI) communities to create service animal prototype.]]></sentence>  <summary><![CDATA[<p>Georgia Tech researchers from the School of Interactive Computing are using survey information from individuals who are blind or visually impaired (BVI) to develop a robotic service dog.</p>]]></summary>  <dateline>2025-09-10T00:00:00-04:00</dateline>  <iso_dateline>2025-09-10T00:00:00-04:00</iso_dateline>  <gmt_dateline>2025-09-10 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Nathan Deen, Communications Officer<br>School of Interactive Computing</p><p>nathan.deen@cc.gatech.edu</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>677956</item>          <item>677957</item>      </media>  <hg_media>          <item>          <nid>677956</nid>          <type>image</type>          <title><![CDATA[Georgia Tech researchers test their prototype of a robotic guide dog.
Photo by Terence Rushin/College of Computing.]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Robotic-Seeing-Eye-Dog_86A0019-Enhanced-NR.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/09/10/Robotic-Seeing-Eye-Dog_86A0019-Enhanced-NR.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/09/10/Robotic-Seeing-Eye-Dog_86A0019-Enhanced-NR.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/09/10/Robotic-Seeing-Eye-Dog_86A0019-Enhanced-NR.jpg?itok=ULOJYgOx]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Georgia Tech researchers test their prototype of a robotic guide dog. Photo by Terence Rushin/College of Computing.]]></image_alt>                    <created>1757509562</created>          <gmt_created>2025-09-10 13:06:02</gmt_created>          <changed>1757509562</changed>          <gmt_changed>2025-09-10 13:06:02</gmt_changed>      </item>          <item>          <nid>677957</nid>          <type>image</type>          <title><![CDATA[A graphic depicts design considerations for the prototype.]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Robotic-Dog-Story-01-20-.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/09/10/Robotic-Dog-Story-01-20-.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/09/10/Robotic-Dog-Story-01-20-.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/09/10/Robotic-Dog-Story-01-20-.jpg?itok=Y-Ee-LqE]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[A graphic depicts design considerations for the prototype.]]></image_alt>                    <created>1757509677</created>          <gmt_created>2025-09-10 13:07:57</gmt_created>          <changed>1757509677</changed>          <gmt_changed>2025-09-10 13:07:57</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://youtu.be/4CzDPxaVWkI?feature=shared]]></url>        <title><![CDATA[VIDEO: Robotic guide dogs could reshape the future for the blind and visually impaired]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="443951"><![CDATA[School of Psychology]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="188087"><![CDATA[go-irim]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="172970"><![CDATA[go-neuro]]></keyword>      </keywords>  <core_research_areas>          <term tid="193656"><![CDATA[Neuro Next Initiative]]></term>          <term 
tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="683686">  <title><![CDATA[Research Combining Humans, Robots, and Unicycles Receives NSF Award]]></title>  <uid>27863</uid>  <body><![CDATA[<p>Research into tailored assistive and rehabilitative devices has seen recent advancements, but the goal remains out of reach due to the sparsity of data on how humans learn complex balance tasks. To address this gap, a collaborating team of interdisciplinary faculty from Florida State University and Georgia Tech has been awarded ~$798,000 by the NSF to launch a study to better understand human motor learning as well as gain greater insight into human-robot interaction dynamics during the learning process.</p><p>Led by PI&nbsp;<a href="https://rthmlab.wixsite.com/taylorgambon">Taylor Higgins</a>, Assistant Professor, FAMU-FSU Department of Mechanical Engineering, partnering with Co-PIs&nbsp;<a href="https://www.shreyaskousik.com/">Shreyas Kousik</a>, Assistant Professor, Georgia Tech, George W. Woodruff School of Mechanical Engineering, and&nbsp;<a href="https://annescollege.fsu.edu/faculty-staff/dr-brady-decouto">Brady DeCouto,</a> Assistant Professor, FSU&nbsp;Anne Spencer Daves College of Education, Health, and Human Sciences, the research will use the acquisition of unicycle riding skill by participants to gain a better grasp on human motor learning in tasks requiring balance and complex movement in space. Although it might sound a bit odd, the fact that most people don’t know how to ride a unicycle, and the fact that it requires balance, mean that the data will cover the learning process from novice to skilled across the participant pool.</p><p>Using data acquired from human participants, the team will develop a “robotic assistive unicycle” that will be used in the training of the next pool of novice unicycle riders. This is to gauge if, and how rapidly, human motor learning outcomes improve with the assistive unicycle. The participants who engage with the robotic unicycle will also give valuable insight into developing effective human-robot collaboration strategies.</p><p>The fact that deciding to get on a unicycle requires a bit of bravery might not be great for the participants, but it’s great for the research team. The project will also allow exploration into the interconnection between anxiety and human motor learning to discover possible alleviation strategies, thus increasing the likelihood of positive outcomes for future patients and consumers of these devices.</p><p>Author<br>-Christa M.
Ernst</p><p>This article refers to NSF Award #2449160.</p>]]></body>  <author>Christa Ernst</author>  <status>1</status>  <created>1754681755</created>  <gmt_created>2025-08-08 19:35:55</gmt_created>  <changed>1755008137</changed>  <gmt_changed>2025-08-12 14:15:37</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Novel research to improve tailored assistive and rehabilitative devices wins NSF Grant]]></teaser>  <type>news</type>  <sentence><![CDATA[Novel research to improve tailored assistive and rehabilitative devices wins NSF Grant]]></sentence>  <summary><![CDATA[<p>A collaborating team of interdisciplinary faculty from Florida State University and Georgia Tech has been awarded ~$798,000 by the NSF to launch a study to better understand human motor learning as well as gain greater insight into human-robot interaction dynamics during the learning process.</p>]]></summary>  <dateline>2025-08-08T00:00:00-04:00</dateline>  <iso_dateline>2025-08-08T00:00:00-04:00</iso_dateline>  <gmt_dateline>2025-08-08 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Trio from Florida State University and Georgia Tech aim to develop better assistive and rehabilitative technologies and strategies using novel approach.]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[christa.ernst@research.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<div><strong>Christa M. Ernst</strong></div><div>Research Communications Program Manager</div><div>Klaus Advanced Computing Building 1120E | 266 Ferst Drive | Atlanta GA | 30332</div><div><strong>Topic Expertise: Robotics | Data Sciences | Semiconductor Design &amp; Fab</strong></div><div>christa.ernst@research.gatech.edu</div>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>677632</item>      </media>  <hg_media>          <item>          <nid>677632</nid>          <type>image</type>          <title><![CDATA[Kousik-NSF-Award-News-Graphic.png]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Kousik-NSF-Award-News-Graphic.png]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/08/08/Kousik-NSF-Award-News-Graphic.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/08/08/Kousik-NSF-Award-News-Graphic.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/08/08/Kousik-NSF-Award-News-Graphic.png?itok=5xmuJ9X7]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Graphic of a person using an assistive device thinking about how a robot could help them learn to ride a unicycle]]></image_alt>                    <created>1754681767</created>          <gmt_created>2025-08-08 19:36:07</gmt_created>          <changed>1754681767</changed>          <gmt_changed>2025-08-08 19:36:07</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="545781"><![CDATA[Institute for Data Engineering and Science]]></group>          <group id="142761"><![CDATA[IRIM]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category
tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="78841"><![CDATA[human-robot interaction]]></keyword>          <keyword tid="5525"><![CDATA[assistive technologies]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="187582"><![CDATA[go-ibb]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="193656"><![CDATA[Neuro Next Initiative]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="682404">  <title><![CDATA[Researchers Say Stress “Sweet Spot” Can Improve Remote Operators' Performance]]></title>  <uid>36530</uid>  <body><![CDATA[<p>Military drone pilots, disaster search and rescue teams, and astronauts stationed on the International Space Station are often required to remotely control robots while maintaining their concentration for hours at a time.</p><p>Georgia Tech roboticists are attempting to identify the most stressful periods that human teleoperators experience while performing tasks remotely. A novel study provides new insights into determining when a teleoperator needs to operate at a high level of focus and which parts of the task can be delegated to robot automation.</p><p>School of Interactive Computing Associate Professor <strong>Matthew</strong> <strong>Gombolay</strong> calls it the “sweet spot” of human ingenuity and robotic precision. Gombolay and students from his <a href="https://core-robotics.gatech.edu/"><strong>CORE Robotics Lab</strong></a> conducted the study, which measures stress and workload on human teleoperators.</p><p>Gombolay said the study can inform military officials on how to strategically implement task automation and maximize human teleoperator performance.</p><p>Humans continue to hand over more tasks to robots to perform, but Gombolay said that some functions will still require human input and oversight for the foreseeable future.</p><p>Specific applications, such as space exploration, commercial and military aviation, disaster relief, and search and rescue, pose substantial safety concerns. Astronauts stationed on the International Space Station, for example, manually control robots that bring in supplies, move cargo, and make structural repairs.</p><p>“It’s brutal from a psychological perspective,” Gombolay said.</p><p>The question often asked about automating a task in these fields is, at what point can a robot be trusted more than a human?</p><p>A recent paper by Gombolay and his current and former students — <strong>Sam</strong> <strong>Yi</strong> <strong>Ting</strong>, <strong>Erin</strong> <strong>Hedlund</strong>-<strong>Botti</strong>, and <strong>Manisha</strong> <strong>Natarajan</strong> — sheds new light on the debate.
The paper was published in IEEE Robotics and Automation Letters and will be presented at the International Conference on Robotics and Automation in Atlanta.</p><p>The NASA-funded study can identify which aspects of tedious, time-consuming tasks can be automated and which require human supervision. If roboticists can pinpoint the elements of a task that cause the least stress, they can automate these components and enable humans to oversee the more challenging aspects.</p><p>“If we’re talking about repetitive tasks, robots do better with that, so if you can automate it, you should,” said Ting, a former grad student and lead author of the paper. “I don’t think humans enjoy doing repetitive tasks. We can move toward a better future with automation.”</p><p>Military officials, for example, could measure the stress of remote drone pilots and know which times during a pilot’s shift require the highest level of attention.</p><p>“We can get a sense of how stressed you are and create models of how divided your attention is and the performance rate of the tasks you’re doing,” Gombolay said.</p><p>“It can be a low-stress or high-stress situation depending on the stakes and what’s going on with you personally. Are you well-caffeinated? Well-rested? Is there stress from home you’re bringing with you to the workplace? The goal is to predict how good your task performance will be. If it indicates it might be poor, we may need to outsource work to other people or create a safe space for the operator to destress.”</p><h4><strong>The Stress Test</strong></h4><p>For their study, the researchers cut a small river-shaped path into a sheet of medium-density fiberboard. The exercise required the 24 participants to use a remote robotic arm to navigate through the path from one end to the other without touching the edges.</p><p>The experiment grew more challenging as new stress conditions and workload requirements were introduced. The changing conditions required the test participants to multitask to complete the assignment.</p><p>Gombolay said the study supports the Yerkes-Dodson Law, which states that moderate levels of stress increase human performance.</p><p>The experiment showed that operators felt overwhelmed and performed poorly when multitasking was introduced. Too much stress led to poor performance, but a moderate amount of stress induced more engagement and enhanced teleoperator focus.&nbsp;</p><p>Ting said finding that ideal stress zone can lead to a higher performance rating.&nbsp;</p><p>“You would think the more stressed you are, the more your performance decreases,” Ting said. “Most people didn’t react that way. As stress increased, performance increased, but when you increased workload and gave them more to do, that’s when you started seeing deteriorating performance.”</p><p>Gombolay said no stress can be just as detrimental as too much stress. Performing a task without stress tends to cause teleoperators to become disinterested, especially if it is repetitive and time-consuming.</p><p>“No stress led to complacency,” Gombolay said. “They weren’t as engaged in completing the task.</p><p>“If your excitement is too low, you get so bored you can’t muster the cognitive energy to reason about robot operation problems.”</p>
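<p>The inverted-U shape of the Yerkes-Dodson Law is easy to picture with a toy model: performance climbs with arousal up to a peak and then falls off at both extremes. The Gaussian form and the constants below are purely illustrative, not the model fit in the paper:</p><pre><code>import math

def performance(arousal, optimal=0.5, width=0.2):
    """Toy inverted-U: predicted performance for arousal in [0, 1]."""
    return math.exp(-((arousal - optimal) ** 2) / (2 * width ** 2))

# Performance peaks at moderate arousal and drops toward either extreme.
for a in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"arousal={a:.2f} -> performance={performance(a):.2f}")</code></pre>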
<h4><strong>The Human Factor</strong></h4><p>Roboticists have made significant leaps in recent years to remove teleoperators from the equation. Still, Gombolay said it’s too early to tell whether robots can be trusted with any task that a human can perform.</p><p>“We’re a long way from full autonomy,” he said. “There’s a lot that robots still can’t do without a human operator. Search and rescue operations, if a building collapses, we don’t have much training data for robots to go through rubble by themselves to rescue people. There are ethical needs for humans to be able to supervise or take direct control of robots.”</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1747314528</created>  <gmt_created>2025-05-15 13:08:48</gmt_created>  <changed>1752591939</changed>  <gmt_changed>2025-07-15 15:05:39</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers say there's a "sweet spot" of stress that can enhance performance of remote robot operators such as drone pilots and astronauts.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers say there's a "sweet spot" of stress that can enhance performance of remote robot operators such as drone pilots and astronauts.]]></sentence>  <summary><![CDATA[<p>Researchers at Georgia Tech are exploring the relationship between stress levels and the performance of remote robot operators. They found a moderate level of stress can enhance performance and keep operators engaged and focused.&nbsp;</p>]]></summary>  <dateline>2025-05-13T00:00:00-04:00</dateline>  <iso_dateline>2025-05-13T00:00:00-04:00</iso_dateline>  <gmt_dateline>2025-05-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>      </media>  <hg_media>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="682890">  <title><![CDATA[Tech Researchers Tabbed to Build AI Systems for Medical Robots in South Korea]]></title>  <uid>36530</uid>  <body><![CDATA[<p>Overwhelmed doctors and nurses struggling to provide adequate patient care in South Korea are getting support from Georgia Tech and Korea-based researchers through an AI-powered robotic medical
assistant.</p><p>Top South Korean research institutes have enlisted Georgia Tech researchers <strong>Sehoon</strong> <strong>Ha</strong> and <strong>Jennifer G.</strong> <strong>Kim</strong> to develop artificial intelligence (AI) to help the humanoid assistant navigate hospitals and interact with doctors, nurses, and patients.</p><p>Ha and Kim will partner with Neuromeka, a South Korean robotics company, on a five-year, 10 billion won (about $7.2 million US) grant from the South Korean government. Georgia Tech will receive about $1.8 million of the grant.</p><p>Ha and Kim, assistant professors in the School of Interactive Computing, will lead Tech’s efforts and also work with researchers from the Korea Advanced Institute of Science and Technology and the Electronics and Telecommunications Research Institute.</p><p>Neuromeka has built industrial robots since its founding in 2013 and recently decided to expand into humanoid service robots.</p><p>Joonho Lee, the group leader of the humanoid medical assistant project at Neuromeka, said he fielded partnership requests from many academic researchers. Ha and Kim stood out as an ideal match because of their robotics, AI, and human-computer interaction expertise.&nbsp;</p><p>For Ha, the project is an opportunity to test navigation and control algorithms he’s developed through research that earned him the National Science Foundation CAREER Award. Ha combines computer simulation and real-world training data to make robots more deployable in high-stress, chaotic environments.&nbsp;</p><p>“Dr. Ha has everything we want to put into our system, including his navigation policies,” Lee said. “He works with robots and AI, and there weren’t many candidates in that space. We needed a collaborator who can create the software and has experience running it on robots.”</p><p>Ha said he is already considering how his algorithms could scale beyond hospitals and become a universal means of robot navigation in unstructured real-world environments.</p><p>“For now, we’re focusing on a customized navigation model for Korean environments, but there are ways to transfer the data set to different environments, such as the U.S. or European healthcare systems,” Ha said.&nbsp;</p><p>“The final product can be deployed to other systems and industries. It can help industrial workers at factories, retail stores, any place where workers can get overwhelmed by a high volume of tasks.”</p><p>Kim will focus on making the robot’s design and interaction features more human. She’ll develop a large language model (LLM) AI system to communicate with patients, nurses, and doctors. She’ll also develop an app that will allow users to input their commands and queries.&nbsp;</p><p>“This project is not just about controlling robots, which is why Dr. Kim’s expertise in human-computer interaction design through natural language was essential,” Lee said.&nbsp;</p><p>Kim is interviewing stakeholders from three South Korean hospitals to identify service and care pain points. The issues she’s identified so far relate to doctor-patient communication, a lack of emotional support for patients, and an excessive number of small tasks that consume nurses’ time.</p><p>“Our goal is to develop this robot in a very human-centered way,” she said. “One way is to give patients a way to communicate about the quality of their care and how the robot can support their emotional well-being.</p><p>“We found that patients often hesitate to ask busy nurses for small things like getting a cup of water.
We believe this is an area a robot can support.”</p><p>The robot’s hardware will be built in Korea, while Ha and Kim will develop the software in the U.S.</p><p>Jong-hoon Park, CEO of Neuromeka, said in a press release the goal is to have a commercialized product as soon as possible.&nbsp;</p><p>“Through this project, we will solve problems that existing collaborative robots could not,” Park said. “We expect the medical AI humanoid robot technology being developed will contribute to reducing the daily work burden of medical and healthcare workers in the field.”</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1750880997</created>  <gmt_created>2025-06-25 19:49:57</gmt_created>  <changed>1750881315</changed>  <gmt_changed>2025-06-25 19:55:15</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers are collaborating with South Korean research institutes on a five-year grant to develop an AI-powered humanoid medical assistant to help doctors and nurses in South Korea.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers are collaborating with South Korean research institutes on a five-year grant to develop an AI-powered humanoid medical assistant to help doctors and nurses in South Korea.]]></sentence>  <summary><![CDATA[<p>Georgia Tech researchers Sehoon Ha and Jennifer Kim are working with South Korean institutions to create an AI-powered medical assistant robot. This five-year project, funded by a $7.2 million grant from the South Korean government, aims to alleviate the workload of healthcare professionals in South Korea by enabling the robot to navigate hospitals and interact with staff and patients.&nbsp;</p>]]></summary>  <dateline>2025-06-25T00:00:00-04:00</dateline>  <iso_dateline>2025-06-25T00:00:00-04:00</iso_dateline>  <gmt_dateline>2025-06-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>677282</item>      </media>  <hg_media>          <item>          <nid>677282</nid>          <type>image</type>          <title><![CDATA[IMG_4499-copy.jpg]]></title>          <body><![CDATA[<p><em>School of Interactive Computing Assistant Professor Sehoon Ha, Neuromeka researchers Joonho Lee and Yunho Kim, School of IC Assistant Professor Jennifer Kim, and Electronics and Telecommunications Research Institute researcher Dongyeop Kang, are collaborating to develop a medical assistant robot to support doctors and nurses in Korea. 
Photo by Nathan Deen/College of Computing.</em></p>]]></body>                      <image_name><![CDATA[IMG_4499-copy.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/06/25/IMG_4499-copy.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/06/25/IMG_4499-copy.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/06/25/IMG_4499-copy.jpg?itok=5VPD5dev]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Researchers]]></image_alt>                    <created>1750881009</created>          <gmt_created>2025-06-25 19:50:09</gmt_created>          <changed>1750881009</changed>          <gmt_changed>2025-06-25 19:50:09</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="194606"><![CDATA[Artificial Intelligence]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="194606"><![CDATA[Artificial Intelligence]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="78681"><![CDATA[medical robotics]]></keyword>          <keyword tid="194391"><![CDATA[AI in Healthcare]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="682761">  <title><![CDATA[Georgia Tech Team Takes Second Place at ICRA Robot Teleoperation Contest]]></title>  <uid>36530</uid>  <body><![CDATA[<p>An algorithmic breakthrough from School of Interactive Computing researchers that&nbsp;<a href="https://www.cc.gatech.edu/news/new-algorithm-teaches-robots-through-human-perspective"><strong>earned a Meta partnership</strong></a> drew more attention at the IEEE International Conference on Robotics and Automation (ICRA).</p><p>Meta announced in February its partnership with the labs of professors&nbsp;<a href="https://faculty.cc.gatech.edu/~danfei/"><strong>Danfei Xu</strong></a> and&nbsp;<a href="https://faculty.cc.gatech.edu/~judy/"><strong>Judy Hoffman</strong></a> on a novel computer vision-based algorithm called EgoMimic.
It enables robots to learn new skills by imitating human tasks from first-person video footage captured by Meta’s Aria smart glasses.&nbsp;</p><p>Xu’s&nbsp;<a href="https://rl2.cc.gatech.edu/"><strong>Robot Learning and Reasoning Lab (RL2)</strong></a> displayed EgoMimic in action at ICRA May 19-23 at the Georgia World Congress Center in Atlanta.</p><p>Lawrence Zhu, Pranav Kuppili, and Patcharapong “Elmo” Aphiwetsa — students from Xu’s lab — used EgoMimic to compete in a robot teleoperation contest at ICRA. The team finished second in the event titled What Bimanual Teleoperation and Learning from Demonstration Can Do Today, earning a $10,000 cash prize.</p><p>Teams were challenged to perform tasks by remotely controlling a robot gripper. The robot had to fold a tablecloth, open a vacuum-sealed container, place an object into the container, and then reseal it in succession without any errors.</p><p>Teams completed the tasks as many times as possible in 30 minutes, earning points for each successful attempt.</p><p>The competition also offered different challenge levels that increased the points awarded. Teams could directly operate the robot with a full workstation view and receive one point for each task completion. Or, as the RL2 team chose, teams could opt for the second challenge level.</p><p>The second level required an operator to control the task with no view of the workstation except for what was provided through a video feed. The RL2 team completed the task seven times and received double points for the challenge level.</p><p>The third challenge level required teams to operate remotely from another location. At this level, teams could earn four times the number of points for each successful task completed. The fourth level challenged teams to deploy an algorithm for task performance and awarded eight points for each completion.</p>
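<p>Put together, the contest’s scoring amounts to a simple multiplier table. The sketch below encodes the rules as described in this article; the names and the table itself are inferred from the text, not taken from the official rulebook:</p><pre><code># Points per error-free completion at each challenge level,
# inferred from the article's description.
LEVEL_MULTIPLIER = {
    1: 1,  # direct view of the workstation
    2: 2,  # video feed only (the RL2 team's choice)
    3: 4,  # operating remotely from another location
    4: 8,  # an algorithm performs the task
}

def contest_score(level, completions):
    """Total points for a number of successful completions at a level."""
    return LEVEL_MULTIPLIER[level] * completions

# The RL2 team: seven completions at the second level.
print(contest_score(2, 7))  # -> 14</code></pre>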
<p>Using two of Meta’s Quest wireless controllers, Zhu controlled the robot under the direction of Aphiwetsa, while Kuppili monitored the coding from his laptop.</p><p>“It’s physically difficult to teleoperate for half an hour,” Zhu said. “My hands were shaking from holding the controllers in the air for that long.”</p><p>Being in constant communication with Aphiwetsa helped him stay focused throughout the contest.</p><p>“I helped him strategize the teleoperation and noticed he could skip some of the steps in the folding,” Aphiwetsa said. “There were many ways to do it, so I just told him what he could fix and how to do it faster.”</p><p>Zhu said he and his team had intended to tackle the fourth challenge level with the EgoMimic algorithm. However, they decided to switch to the second level the day before the competition due to unexpected time constraints.&nbsp;</p><p>“I think we realized the day before the competition that training the robot on our model would take a huge amount of time,” Zhu said. “We decided to go for the teleoperation and started practicing.”</p><p>He said the team wants to tackle the highest challenge level and use a training model for next year’s ICRA competition in Vienna, Austria.</p><p>ICRA is the world’s largest robotics conference, and&nbsp;<a href="https://www.cc.gatech.edu/news/georgia-tech-leads-robotics-world-converges-atlanta-icra-2025"><strong>Atlanta hosted the event</strong></a> for the third time in its history, drawing a record-breaking attendance of over 7,000.</p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1749655482</created>  <gmt_created>2025-06-11 15:24:42</gmt_created>  <changed>1749729176</changed>  <gmt_changed>2025-06-12 11:52:56</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A Georgia Tech team earned second place in the ICRA Robot Teleoperation Contest for their EgoMimic algorithm, which allows robots to learn skills by mimicking human tasks from first-person video.]]></teaser>  <type>news</type>  <sentence><![CDATA[A Georgia Tech team earned second place in the ICRA Robot Teleoperation Contest for their EgoMimic algorithm, which allows robots to learn skills by mimicking human tasks from first-person video.]]></sentence>  <summary><![CDATA[<p>Students from Georgia Tech's Robot Learning and Reasoning Lab earned second place and a $10,000 cash prize in a robot teleoperation contest at the 2025 International Conference on Robotics and Automation in Atlanta. The RL2 lab announced a partnership with Meta in February on a novel computer vision-based algorithm called EgoMimic. It enables robots to learn new skills by imitating human tasks from first-person video footage captured by Meta’s Aria smart glasses.</p>]]></summary>  <dateline>2025-06-11T00:00:00-04:00</dateline>  <iso_dateline>2025-06-11T00:00:00-04:00</iso_dateline>  <gmt_dateline>2025-06-11 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>677223</item>      </media>  <hg_media>          <item>          <nid>677223</nid>          <type>image</type>          <title><![CDATA[IMG_4291-2-copy.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[IMG_4291-2-copy.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/06/12/IMG_4291-2-copy.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/06/12/IMG_4291-2-copy.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/06/12/IMG_4291-2-copy.jpg?itok=f261J8gE]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[ICRA]]></image_alt>                    <created>1749729142</created>          <gmt_created>2025-06-12 11:52:22</gmt_created>          <changed>1749729142</changed>          <gmt_changed>2025-06-12 11:52:22</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer
Science/Information Technology and Security]]></category>          <category tid="152"><![CDATA[Robotics]]></category>          <category tid="193158"><![CDATA[Student Competition Winners (academic, innovation, and research)]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="152"><![CDATA[Robotics]]></term>          <term tid="193158"><![CDATA[Student Competition Winners (academic, innovation, and research)]]></term>      </news_terms>  <keywords>          <keyword tid="181920"><![CDATA[cc-research; ic-ai-ml; ic-robotics]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="167585"><![CDATA[student competition]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="681961">  <title><![CDATA[Thesis on Human-Centered AI Earns Honors from International Computing Organization]]></title>  <uid>36319</uid>  <body><![CDATA[<p>A Georgia Tech alum’s dissertation introduced ways to make artificial intelligence (AI) more accessible, interpretable, and accountable. Although it’s been a year since his doctoral defense,&nbsp;<a href="https://zijie.wang/"><strong>Zijie (Jay) Wang</strong></a>’s (Ph.D. ML-CSE 2024) work continues to resonate with researchers.</p><p>Wang is a recipient of the&nbsp;<a href="https://medium.com/sigchi/announcing-the-2025-acm-sigchi-awards-17c1feaf865f"><strong>2025 Outstanding Dissertation Award from the Association for Computing Machinery Special Interest Group on Computer-Human Interaction (ACM SIGCHI)</strong></a>. The award recognizes Wang for his lifelong work on democratizing human-centered AI.</p><p>“Throughout my Ph.D. and industry internships, I observed a gap in existing research: there is a strong need for practical tools for applying human-centered approaches when designing AI systems,” said Wang, now a safety researcher at OpenAI.</p><p>“My work not only helps people understand AI and guide its behavior but also provides user-friendly tools that fit into existing workflows.”</p><p>[Related: <a href="https://sites.gatech.edu/research/chi-2025/">Georgia Tech College of Computing Swarms to Yokohama, Japan, for CHI 2025</a>]</p><p>Wang’s dissertation presented techniques in visual explanation and interactive guidance to align AI models with user knowledge and values. The work culminated from years of research, fellowship support, and internships.</p><p>Wang’s most influential projects formed the core of his dissertation. These included:</p><ul><li><a href="https://poloclub.github.io/cnn-explainer/"><strong>CNN Explainer</strong></a>: an open-source tool developed for deep-learning beginners. Since its release in July 2020, more than 436,000 global visitors have used the tool.</li><li><a href="https://poloclub.github.io/diffusiondb/"><strong>DiffusionDB</strong></a>: a first-of-its-kind large-scale dataset that lays a foundation to help people better understand generative AI. 
This work could lead to new research in detecting deepfakes and designing human-AI interaction tools to help people more easily use these models.</li><li><a href="https://interpret.ml/gam-changer/"><strong>GAM Changer</strong></a>: an interface that empowers users in healthcare, finance, or other domains to edit ML models to include knowledge and values specific to their domain, which improves reliability.</li><li><a href="https://www.jennwv.com/papers/gamcoach.pdf"><strong>GAM Coach</strong></a>: an interactive ML tool that could help people who have been rejected for a loan by automatically telling an applicant what they would need to change to receive approval.</li><li><a href="https://www.cc.gatech.edu/news/new-tool-teaches-responsible-ai-practices-when-using-large-language-models"><strong>Farsight</strong></a>: a tool that alerts developers when they write prompts for large language models that could be harmful and misused.</li></ul><p>“I feel extremely honored and lucky to receive this award, and I am deeply grateful to many who have supported me along the way, including Polo, mentors, collaborators, and friends,” said Wang, who was advised by School of Computational Science and Engineering (CSE) Professor&nbsp;<a href="https://poloclub.github.io/polochau/"><strong>Polo Chau</strong></a>.</p><p>“This recognition also inspired me to continue striving to design and develop easy-to-use tools that help everyone to easily interact with AI systems.”</p><p>Wang is not the first of Chau’s advisees to receive the honor: Chau also advised Georgia Tech alumnus&nbsp;<a href="https://fredhohman.com/">Fred Hohman</a> (Ph.D. CSE 2020).&nbsp;<a href="https://www.cc.gatech.edu/news/alumnus-building-legacy-through-dissertation-and-mentorship">Hohman won the ACM SIGCHI Outstanding Dissertation Award in 2022</a>.</p><p><a href="https://poloclub.github.io/">Chau’s group</a> synthesizes machine learning (ML) and visualization techniques into scalable, interactive, and trustworthy tools. These tools increase understanding and interaction with large-scale data and ML models.&nbsp;</p><p>Chau is the associate director of corporate relations for the Machine Learning Center at Georgia Tech. The School of CSE was Wang’s home unit while he studied in the ML program under Chau.</p><p>Wang is one of five recipients of this year’s award to be presented at the 2025 Conference on Human Factors in Computing Systems (<a href="https://chi2025.acm.org/">CHI 2025</a>). The conference takes place April 25-May 1 in Yokohama, Japan.&nbsp;</p><p>SIGCHI is the world’s largest association of human-computer interaction professionals and practitioners. The group sponsors or co-sponsors 26 conferences, including CHI.</p><p>Wang’s outstanding dissertation award is the latest recognition of a career decorated with achievement.</p><p>Months after he graduated from Georgia Tech,&nbsp;<a href="https://www.cc.gatech.edu/news/research-ai-safety-lands-recent-graduate-forbes-30-under-30">Forbes named Wang to its 30 Under 30 in Science for 2025</a> for his dissertation. Wang was one of 15 Yellow Jackets included in nine different 30 Under 30 lists and the only Georgia Tech-affiliated individual on the 30 Under 30 in Science list.</p><p>While a Georgia Tech student, Wang earned recognition from big names in business and technology. He received the&nbsp;<a href="https://www.cc.gatech.edu/news/student-named-apple-scholar-connecting-people-machine-learning">Apple Scholars in AI/ML Ph.D.
Fellowship in 2023</a> and was in the&nbsp;<a href="https://www.cc.gatech.edu/news/georgia-tech-machine-learning-students-earn-jp-morgan-ai-phd-fellowships">2022 cohort of the J.P. Morgan AI Ph.D. Fellowships Program</a>.</p><p>Along with the CHI award, Wang’s dissertation earned him awards this year at banquets across campus. The&nbsp;<a href="https://bpb-us-e1.wpmucdn.com/sites.gatech.edu/dist/0/283/files/2025/03/2025-Sigma-Xi-Research-Award-Winners.pdf">Georgia Tech chapter of Sigma Xi presented Wang with the Best Ph.D. Thesis Award</a>. He also received the College of Computing’s Outstanding Dissertation Award.</p><p>“Georgia Tech attracts many great minds, and I’m glad that some, like Jay, chose to join our group,” Chau said. “It has been a joy to work alongside them and witness the many wonderful things they have accomplished, and with many more to come in their careers.”</p>]]></body>  <author>Bryant Wine</author>  <status>1</status>  <created>1745331886</created>  <gmt_created>2025-04-22 14:24:46</gmt_created>  <changed>1745332147</changed>  <gmt_changed>2025-04-22 14:29:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[ Zijie (Jay) Wang (Ph.D. ML-CSE 2024) is a recipient of the 2025 Outstanding Dissertation Award from the Association for Computing Machinery Special Interest Group on Computer-Human Interaction (ACM SIGCHI).]]></teaser>  <type>news</type>  <sentence><![CDATA[ Zijie (Jay) Wang (Ph.D. ML-CSE 2024) is a recipient of the 2025 Outstanding Dissertation Award from the Association for Computing Machinery Special Interest Group on Computer-Human Interaction (ACM SIGCHI).]]></sentence>  <summary><![CDATA[<p>A Georgia Tech alum’s dissertation introduced ways to make artificial intelligence (AI) more accessible, interpretable, and accountable. Although it’s been a year since his doctoral defense,&nbsp;<a href="https://zijie.wang/"><strong>Zijie (Jay) Wang</strong></a>’s (Ph.D. ML-CSE 2024) work continues to resonate with researchers.</p><p>Wang is a recipient of the&nbsp;<a href="https://medium.com/sigchi/announcing-the-2025-acm-sigchi-awards-17c1feaf865f"><strong>2025 Outstanding Dissertation Award from the Association for Computing Machinery Special Interest Group on Computer-Human Interaction (ACM SIGCHI)</strong></a>. 
The award recognizes Wang for his lifelong work on democratizing human-centered AI.</p>]]></summary>  <dateline>2025-04-17T00:00:00-04:00</dateline>  <iso_dateline>2025-04-17T00:00:00-04:00</iso_dateline>  <gmt_dateline>2025-04-17 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Bryant Wine, Communications Officer<br><a href="mailto:bryant.wine@cc.gatech.edu">bryant.wine@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>676903</item>          <item>673947</item>      </media>  <hg_media>          <item>          <nid>676903</nid>          <type>image</type>          <title><![CDATA[Jay-Wang-SIGCHI-Dissertation-Award.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Jay-Wang-SIGCHI-Dissertation-Award.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/04/22/Jay-Wang-SIGCHI-Dissertation-Award.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/04/22/Jay-Wang-SIGCHI-Dissertation-Award.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/04/22/Jay-Wang-SIGCHI-Dissertation-Award.jpg?itok=BwjW7CxH]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Zijie (Jay) Wang CHI 2025]]></image_alt>                    <created>1745331896</created>          <gmt_created>2025-04-22 14:24:56</gmt_created>          <changed>1745331896</changed>          <gmt_changed>2025-04-22 14:24:56</gmt_changed>      </item>          <item>          <nid>673947</nid>          <type>image</type>          <title><![CDATA[Farsight CHI.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Farsight CHI.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/05/05/Farsight%20CHI.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/05/05/Farsight%20CHI.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/05/05/Farsight%2520CHI.jpg?itok=hWo1VxQt]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[CHI 2024 Farsight]]></image_alt>                    <created>1714954253</created>          <gmt_created>2024-05-06 00:10:53</gmt_created>          <changed>1714954253</changed>          <gmt_changed>2024-05-06 00:10:53</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.cc.gatech.edu/news/thesis-human-centered-ai-earns-honors-international-computing-organization]]></url>        <title><![CDATA[Thesis on Human-Centered AI Earns Honors from International Computing Organization]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50877"><![CDATA[School of Computational Science and Engineering]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="155"><![CDATA[Congressional Testimony]]></category>          
<category tid="143"><![CDATA[Digital Media and Entertainment]]></category>          <category tid="131"><![CDATA[Economic Development and Policy]]></category>          <category tid="42911"><![CDATA[Education]]></category>          <category tid="144"><![CDATA[Energy]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="42921"><![CDATA[Exhibitions]]></category>          <category tid="42891"><![CDATA[Georgia Tech Arts]]></category>          <category tid="179356"><![CDATA[Industrial Design]]></category>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="132"><![CDATA[Institute Leadership]]></category>          <category tid="194248"><![CDATA[International Education]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="148"><![CDATA[Music and Music Technology]]></category>          <category tid="149"><![CDATA[Nanotechnology and Nanoscience]]></category>          <category tid="42931"><![CDATA[Performances]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="151"><![CDATA[Policy, Social Sciences, and Liberal Arts]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>          <category tid="133"><![CDATA[Special Events and Guest Speakers]]></category>          <category tid="193157"><![CDATA[Student Honors and Achievements]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="155"><![CDATA[Congressional Testimony]]></term>          <term tid="143"><![CDATA[Digital Media and Entertainment]]></term>          <term tid="131"><![CDATA[Economic Development and Policy]]></term>          <term tid="42911"><![CDATA[Education]]></term>          <term tid="144"><![CDATA[Energy]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="42921"><![CDATA[Exhibitions]]></term>          <term tid="42891"><![CDATA[Georgia Tech Arts]]></term>          <term tid="179356"><![CDATA[Industrial Design]]></term>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="132"><![CDATA[Institute Leadership]]></term>          <term tid="194248"><![CDATA[International Education]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="148"><![CDATA[Music and Music Technology]]></term>          <term tid="149"><![CDATA[Nanotechnology and Nanoscience]]></term>          <term tid="42931"><![CDATA[Performances]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="151"><![CDATA[Policy, Social Sciences, and Liberal Arts]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>          <term tid="133"><![CDATA[Special Events and Guest Speakers]]></term>          <term tid="193157"><![CDATA[Student Honors and Achievements]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      
</news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="166983"><![CDATA[School of Computational Science and Engineering]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="9153"><![CDATA[Research Horizons]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="680875">  <title><![CDATA[Securing Tomorrow’s Autonomous Robots Today]]></title>  <uid>32045</uid>  <body><![CDATA[<p>Every year, people in California risk their lives battling wildfires, but in the future, machines powered by artificial intelligence will be on the front lines, not firefighters.</p><p>However, this new generation of self-thinking robots will need security protocols to ensure they aren’t susceptible to hackers. To integrate such robots into society, they must come with assurances that they will behave safely around humans.</p><p>It begs the question: can you guarantee the safety of something that doesn’t exist yet? It’s something Assistant Professor <a href="https://glenchou.github.io/"><strong>Glen Chou</strong></a> hopes to accomplish by developing algorithms that will enable autonomous systems to learn and adapt while acting with safety and security assurances.</p><p>He plans to launch research initiatives, in collaboration with the <a href="https://scp.cc.gatech.edu/"><strong>School of Cybersecurity and Privacy</strong></a> and the <a href="https://ae.gatech.edu/"><strong>Daniel Guggenheim School of Aerospace Engineering</strong></a>, to secure this new technological frontier as it develops.</p><p>“To operate in uncertain real-world environments, robots and other autonomous systems need to leverage and adapt a complex network of perception and control algorithms to turn sensor data into actions,” he said. “To obtain realistic assurances, we must do a joint safety and security analysis on these sensors and algorithms simultaneously, rather than one at a time.”</p><p>This end-to-end method would proactively look for flaws in the robot’s systems rather than wait for them to be exploited. This would lead to intrinsically robust robotic systems that can recover from failures.</p><p><a href="https://www.cc.gatech.edu/news/new-algorithm-teaches-robots-through-human-perspective">[RELATED: New Algorithm Teaches Robots Through Human Perspective]</a></p><p>Chou said this research will be helpful in other domains, including advanced space exploration. If a space rover is sent to one of Saturn’s moons, for example, it needs to be able to act and think independently of scientists on Earth.&nbsp;</p><p>Aside from fighting fires and exploring space, this technology could perform maintenance in nuclear reactors, automatically maintain the power grid, and make autonomous surgery safer. 
It could also bring assistive robots into the home, enabling higher standards of care.&nbsp;</p><p>This is a challenging domain where safety, security, and privacy concerns are paramount due to frequent, close contact with humans.</p><p>This will start in the newly established <a href="https://trustworthyrobotics.github.io/"><strong>Trustworthy Robotics Lab</strong></a> at Georgia Tech, which Chou directs. He and his Ph.D. students will design principled algorithms that enable general-purpose robots and autonomous systems to operate capably, safely, and securely with humans while remaining resilient to real-world failures and uncertainty.</p><p>Chou earned dual bachelor’s degrees in electrical engineering and computer sciences as well as mechanical engineering from the University of California, Berkeley, in 2017, and a master’s and Ph.D. in electrical and computer engineering from the University of Michigan in 2019 and 2022, respectively.&nbsp;</p><p>He was a postdoc at the Massachusetts Institute of Technology Computer Science &amp; Artificial Intelligence Laboratory before joining Georgia Tech in November 2024. He received National Defense Science and Engineering Graduate and NSF Graduate Research fellowships, and was named a Robotics: Science and Systems Pioneer in 2022.</p>]]></body>  <author>Ben Snedeker</author>  <status>1</status>  <created>1741107318</created>  <gmt_created>2025-03-04 16:55:18</gmt_created>  <changed>1742951908</changed>  <gmt_changed>2025-03-26 01:18:28</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The Trustworthy Robotics Lab enables robots and autonomous systems to operate safely with humans while remaining resilient to real-world challenges.]]></teaser>  <type>news</type>  <sentence><![CDATA[The Trustworthy Robotics Lab enables robots and autonomous systems to operate safely with humans while remaining resilient to real-world challenges.]]></sentence>  <summary><![CDATA[<p>The Trustworthy Robotics Lab is a new interdisciplinary venture led by School of Cybersecurity &amp; Privacy Assistant Professor <strong>Glen</strong> <strong>Chou</strong>. The lab's mission is to enable robots and autonomous systems to operate safely with humans while remaining resilient to real-world challenges.</p>]]></summary>  <dateline>2025-03-04T00:00:00-05:00</dateline>  <iso_dateline>2025-03-04T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-03-04 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>J.P. Popham, Communications Officer</p><p>Georgia Tech</p><p>School of Cybersecurity &amp; Privacy</p><p>john.popham@cc.gatech.edu</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>676448</item>      </media>  <hg_media>          <item>          <nid>676448</nid>          <type>image</type>          <title><![CDATA[Georgia Tech Assistant Professor Glen Chou with the School of Cybersecurity and Privacy works through an equation on a transparent writing board.]]></title>          <body><![CDATA[<p>Assistant Professor <a href="https://glenchou.github.io/"><strong>Glen Chou</strong></a> is launching research initiatives to develop algorithms enabling autonomous systems to learn and adapt while acting with safety and security assurances.
Photo by Terence Rushin, College of Computing</p>]]></body>                      <image_name><![CDATA[Glen-Header-Image.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/03/04/Glen-Header-Image.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/03/04/Glen-Header-Image.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/03/04/Glen-Header-Image.jpeg?itok=D2iJwmEm]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Georgia Tech Assistant Professor Glen Chou with the School of Cybersecurity and Privacy works through an equation on a transparent writing board.]]></image_alt>                    <created>1741107406</created>          <gmt_created>2025-03-04 16:56:46</gmt_created>          <changed>1741107406</changed>          <gmt_changed>2025-03-04 16:56:46</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="78271"><![CDATA[IRIM]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="145171"><![CDATA[Cybersecurity]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="680735">  <title><![CDATA[New Algorithms Developed at Georgia Tech are Lunar Bound]]></title>  <uid>34736</uid>  <body><![CDATA[<p>In the past five years, five lunar landers have launched into space, marking the first successful landings in decades. The future will see more of these types of missions, including <a href="https://www.nasa.gov/humans-in-space/artemis/"><strong>NASA’s Artemis program</strong></a> and various private ventures. These missions need fast, reliable navigation to succeed, especially if ground stations on Earth are overburdened or disconnected.&nbsp;</p><p>Georgia Tech’s <a href="https://seal.ae.gatech.edu/"><strong>Space Exploration and Analysis Laboratory</strong></a> (SEAL) has developed new algorithms that are headed to the Moon, as part of <a href="https://www.intuitivemachines.com/im-2"><strong>Intuitive Machines’</strong></a> IM-2 mission.
The mission is sending a Nova-C class lunar lander named Athena to the Moon’s south pole region to test technologies and collect data that aim to enable future exploration. The mission is part of <a href="https://www.nasa.gov/commercial-lunar-payload-services/"><strong>NASA’s Commercial Lunar Payload Services</strong></a> (CLPS) initiative.</p><div><div><h3><strong>SEAL’s Space Odyssey&nbsp;</strong></h3></div></div><div><div><p>SEAL, led by AE professor <a href="https://ae.gatech.edu/directory/person/john-christian"><strong>John Christian</strong></a>, collaborated with Intuitive Machines to develop algorithms to guide Athena to the Shackleton crater, a region known for its limited sunlight and cold temperatures. In coordination with <a href="https://www.spacex.com/"><strong>SpaceX</strong></a>, the launch of the company’s IM-2 mission is targeted for a multi-day launch window that opens no earlier than February 26 from Launch Complex 39A at NASA’s Kennedy Space Center in Florida.&nbsp;</p><p>Athena will transport NASA’s&nbsp;<a href="https://www.nasa.gov/mission/polar-resources-ice-mining-experiment-1-prime-1/"><strong>PRIME-1</strong></a> (Polar Resources Ice Mining Experiment-1), which includes two instruments: a drill and a spectrometer. The Regolith and Ice Drill for Exploring New Terrain (TRIDENT) is designed to drill up to three feet into the lunar surface to extract soil, while the mass spectrometer (MSOLO) will measure the amount of ice in the soil samples.&nbsp;</p><p>After launch, Athena will separate from the rocket and begin a roughly four-to-five-day cruise to the Moon’s orbit. The lander will orbit the Moon for approximately 1.5 to three days before its descent to the south pole.&nbsp;</p><p>In Fall 2022, Research Engineer <strong>Ava Thrasher</strong> (AE 2022, M.S. AE 2024) began working on IM-2, developing new algorithms to guide Athena to the Shackleton crater using optical terrain relative navigation (TRN). She developed a crater detection algorithm (CDA) that uses image processing techniques to find crater center locations on the Moon, which are then used to estimate Athena’s position.&nbsp;</p><p>Then, she developed a crater identification algorithm (CIA) to match craters found in the image to a catalog of known lunar craters. By using the CDA and CIA in tandem, Athena is able to estimate its location and orientation from a single photo, autonomously and in real time.&nbsp;</p><p>“We wanted to strike a balance between creating something that would be done quickly on board, but also something that was reliable,” she explained. “We ended up using simple crater geometry and knowledge of the sun angle to render what we expect a crater to look like in the image.”&nbsp;</p><p>The CDA finds craters by calculating a similarity score between the image and the rendered crater at each image pixel point. This process, also known as template matching, marks crater centers at points of very high similarity. The CIA then uses these crater center locations to match them with known craters in a catalog. By matching pixel locations in an image to known three-dimensional positions on the Moon, the spacecraft can estimate its position.&nbsp;</p>
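<p>To make the technique concrete, the minimal sketch below shows template matching with OpenCV’s <code>matchTemplate</code>. It illustrates the general method only, not the flight software; the function name, threshold value, and toy data are assumptions for demonstration.</p><pre><code># Illustrative template matching for crater detection (a sketch, not the
# flight code): score a rendered crater template against the image at every
# pixel and keep strong responses as candidate crater centers.
import cv2
import numpy as np

def detect_crater_centers(image, template, threshold=0.8):
    """Return candidate (x, y) centers where normalized correlation is high."""
    scores = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
    ys, xs = np.where(scores >= threshold)   # rows, cols of strong matches
    h, w = template.shape[:2]
    # Shift from the template's top-left corner to its center.
    return [(int(x) + w // 2, int(y) + h // 2) for x, y in zip(xs, ys)]

# Toy demo: a shaded "crater" pasted into a noisy synthetic image is recovered.
rng = np.random.default_rng(0)
img = rng.integers(170, 190, (200, 200), dtype=np.uint8)  # noisy background
crater = np.full((20, 20), 180, np.uint8)
cv2.circle(crater, (10, 10), 7, 60, -1)                   # dark crater disk
img[90:110, 120:140] = crater
print(detect_crater_centers(img, crater))                 # includes (130, 100)
</code></pre><p>Each detected center, once matched by the CIA to a cataloged crater with a known lunar-fixed position, yields a 2D-to-3D correspondence; a handful of such pairs is enough for a standard perspective-n-point solver (such as OpenCV’s <code>solvePnP</code>) to recover the camera’s position and orientation.</p>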
<p>After two years of research and testing, Thrasher, Christian, and the Intuitive Machines team successfully demonstrated the CDA and CIA on synthetic imagery, and Thrasher handed off the algorithms to Intuitive Machines to convert them into flight software for Athena.&nbsp;</p><p>She first got involved with optical navigation (OPNAV) research after she took AE 4342: Senior Design with Prof. Christian as an undergraduate student. “I found optical navigation to be really interesting. I liked the idea of being able to figure out where you are and how you’re moving in real-time based on a picture,” she said. In Fall 2022, she started her first graduate semester at Tech as a new member of SEAL, where she quickly began demonstrating the idea of detecting craters and prototyping the CDA and CIA programmed into Athena.</p><p>After graduating with her master’s degree in aerospace engineering in May 2024, she loved the work so much that she decided to stay on as a full-time research engineer in SEAL. Now, she’s gearing up to see her work make its way to the Moon.</p><p>“It's been really exciting and humbling to contribute to the massive task of putting a lander on the Moon. I never really appreciated the scale of work and collaboration needed to make it happen until I was lucky enough to be a part of it. I'll certainly be watching the launch and tracking the mission with great anticipation of both the engineering and scientific results,” said Thrasher.&nbsp;</p><div><div><h3><strong>IM-1 Makes History</strong></h3></div></div><div><div><p>As part of a multi-year collaboration, Christian helped <a href="https://www.ae.gatech.edu/news/2024/02/georgia-tech-algorithm-headed-moon"><strong>develop a key navigation algorithm for Intuitive Machines’ first space mission (IM-1)</strong></a>, which launched a Nova-C lunar lander named Odysseus to the Malapert A crater on the Moon’s south pole region, about 11 miles from IM-2’s targeted Shackleton crater.&nbsp;</p><p>The IM-1 mission launched from Kennedy Space Center on February 15, 2024, and soft-landed on the Moon on February 22, 2024, making Odysseus’ touchdown the first U.S. lunar landing since the Apollo program and the first-ever successful commercial lunar landing. Odysseus had a rougher-than-expected soft landing due to an anomaly with the altimeter that was supposed to report the lander’s height above the lunar surface. In the absence of these altimeter measurements, Odysseus relied critically on the visual odometry technique that was jointly developed by Christian and Intuitive Machines.&nbsp;</p></div></div><div><div><p>Despite these challenges, Odysseus captured images of the Moon during landing and operated on the lunar surface for 144 hours before entering standby mode.&nbsp;</p><p>Prof. Christian and SEAL have more projects on the horizon to develop new technologies for exploring our Moon, other planets, asteroids, and the solar system.
These technologies will enable future scientific missions to safely explore challenging destinations and answer scientific questions that were impossible with yesterday’s technology.&nbsp;</p></div></div></div></div>]]></body>  <author>Kelsey Gulledge</author>  <status>1</status>  <created>1740586771</created>  <gmt_created>2025-02-26 16:19:31</gmt_created>  <changed>1740587259</changed>  <gmt_changed>2025-02-26 16:27:39</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[AE researchers have developed new algorithms to help Intuitive Machine’s lunar lander find water ice on the Moon.  ]]></teaser>  <type>news</type>  <sentence><![CDATA[AE researchers have developed new algorithms to help Intuitive Machine’s lunar lander find water ice on the Moon.  ]]></sentence>  <summary><![CDATA[<p>Georgia Tech’s <a href="https://seal.ae.gatech.edu/"><strong>Space Exploration and Analysis Laboratory</strong></a> (SEAL) has developed new algorithms that are headed to the Moon, as part of the <a href="https://www.intuitivemachines.com/im-2"><strong>Intuitive Machine’s</strong></a> IM-2 mission. The mission is sending a Nova-C class lunar lander named Athena to the Moon’s south pole region to test technologies and collect data that aim to enable future exploration. The mission is part of <a href="https://www.nasa.gov/commercial-lunar-payload-services/"><strong>NASA’s Commercial Lunar Payload Services</strong></a> (CLPS) initiative.</p><p>SEAL, led by Professor <strong>John Christian</strong>, collaborated with Intuitive Machines to develop algorithms to guide Athena to the Shackleton crater: a region known for its limited sunlight and cold temperatures. Research Engineer <strong>Ava Thrasher</strong> (AE 2022, M.S. AE 2024) led Georgia Tech's SEAL team on developing the algorithms used for Athena's flight software.&nbsp;</p>]]></summary>  <dateline>2025-02-25T00:00:00-05:00</dateline>  <iso_dateline>2025-02-25T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-02-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[<p><strong>LAUNCHING: February 26, 2025</strong></p><p><strong>6:30 p.m. EST </strong><a href="https://www.nasa.gov/news-release/nasa-sets-coverage-for-intuitive-machines-next-commercial-moon-launch/"><strong>launch coverage</strong></a><strong> begins&nbsp;</strong><br><strong>7:02-7:34 p.m. EST launch window</strong></p><p>Stream on <a href="https://plus.nasa.gov/scheduled-video/intuitive-machines-2-launch-to-the-moon/"><strong>NASA+</strong></a></p>]]></sidebar>  <email><![CDATA[kelsey.gulledge@aerospace.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Kelsey Gulledge</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>676397</item>          <item>676398</item>          <item>676399</item>          <item>676401</item>      </media>  <hg_media>          <item>          <nid>676397</nid>          <type>image</type>          <title><![CDATA[54284511327_9ca21c7337_o.jpg]]></title>          <body><![CDATA[<div><div><div><div><div><div><p>Intuitive Machines' IM-2 mission lunar lander, Athena, in the company's Lunar Production and Operations Center. 
Credit: Intuitive Machines</p></div></div></div></div></div></div><div><div><div><div><div><br> </div></div></div></div></div>]]></body>                      <image_name><![CDATA[54284511327_9ca21c7337_o.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/02/26/54284511327_9ca21c7337_o.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/02/26/54284511327_9ca21c7337_o.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/02/26/54284511327_9ca21c7337_o.jpg?itok=swWOgO_h]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Intuitive Machines' IM-2 mission lunar lander, Athena, in the company's Lunar Production and Operations Center. Credit: Intuitive Machines]]></image_alt>                    <created>1740586783</created>          <gmt_created>2025-02-26 16:19:43</gmt_created>          <changed>1740586783</changed>          <gmt_changed>2025-02-26 16:19:43</gmt_changed>      </item>          <item>          <nid>676398</nid>          <type>image</type>          <title><![CDATA[Christian-John.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Christian-John.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/02/26/Christian-John.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/02/26/Christian-John.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/02/26/Christian-John.jpg?itok=a2Mf1kZz]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Headshot of John Christian, AE School Professor]]></image_alt>                    <created>1740586840</created>          <gmt_created>2025-02-26 16:20:40</gmt_created>          <changed>1740586840</changed>          <gmt_changed>2025-02-26 16:20:40</gmt_changed>      </item>          <item>          <nid>676399</nid>          <type>image</type>          <title><![CDATA[HeadShotThrasher.JPG]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[HeadShotThrasher.JPG]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/02/26/HeadShotThrasher.JPG]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/02/26/HeadShotThrasher.JPG]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/02/26/HeadShotThrasher.JPG?itok=pmytxNcG]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Headshot of Ava Thrasher, AE School alumna and research engineer]]></image_alt>                    <created>1740586878</created>          <gmt_created>2025-02-26 16:21:18</gmt_created>          <changed>1740586878</changed>          <gmt_changed>2025-02-26 16:21:18</gmt_changed>      </item>          <item>          <nid>676401</nid>          <type>image</type>          <title><![CDATA[AAS_2024_CraterDetection_final-2.png]]></title>          <body><![CDATA[<div><div><div>Illustration of the steps used to detect and identify craters to ultimately determine the vehicles state estimation. 
Credit: Georgia Tech </div></div></div><div><br> </div>]]></body>                      <image_name><![CDATA[AAS_2024_CraterDetection_final-2.png]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/02/26/AAS_2024_CraterDetection_final-2.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/02/26/AAS_2024_CraterDetection_final-2.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/02/26/AAS_2024_CraterDetection_final-2.png?itok=NAZs3A2Z]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Illustration of the steps used to detect and identify craters to ultimately determine the vehicles state estimation. Credit: Georgia Tech ]]></image_alt>                    <created>1740587067</created>          <gmt_created>2025-02-26 16:24:27</gmt_created>          <changed>1740587067</changed>          <gmt_changed>2025-02-26 16:24:27</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="660364"><![CDATA[Aerospace Engineering]]></group>          <group id="1237"><![CDATA[College of Engineering]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="130"><![CDATA[Alumni]]></category>          <category tid="42911"><![CDATA[Education]]></category>          <category tid="144"><![CDATA[Energy]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="130"><![CDATA[Alumni]]></term>          <term tid="42911"><![CDATA[Education]]></term>          <term tid="144"><![CDATA[Energy]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="680585">  <title><![CDATA[New Algorithm Teaches Robots Through Human Perspective]]></title>  <uid>32045</uid>  <body><![CDATA[<p>A new data creation paradigm and algorithmic breakthrough from Georgia Tech has laid the groundwork for humanoid assistive robots to help with laundry, dishwashing, and other household chores. The framework enables these robots to learn new skills by mimicking actions from first-person videos of everyday activities.</p><p>Current training methods limit robots from being produced at the necessary scale to put a robot in every home, said <strong>Simar</strong> <strong>Kareer</strong>, a Ph.D. student in the School of Interactive Computing.</p><p>“Traditionally, collecting data for robotics means creating demonstration data,” Kareer said. 
“You operate the robot’s joints with a controller to move it and achieve the task you want, and you do this hundreds of times while recording sensor data, then train your models. This is slow and difficult. The only way to break that cycle is to detach the data collection from the robot itself.”</p><p><a href="https://youtu.be/ckGUsdFX9pU?si=7qmGR1D5P_iPAVMt"><strong>[VIDEO: Meta Shares EgoMimic Case Study Video]</strong></a></p><p>Other fields, such as computer vision and natural language processing (NLP), already leverage training data passively culled from the internet to create powerful generative AI and large language models (LLMs).</p><p>Many roboticists, however, have shifted toward interventions that allow individual users to teach their robots how to perform tasks. Kareer believes a similar source of passive data can be established to enable practical generalized training that scales the production of humanoid robots.</p><p>This is why Kareer collaborated with School of IC Assistant Professor <strong>Danfei</strong> <strong>Xu</strong> and his <a href="https://rl2.cc.gatech.edu/"><strong>Robot Learning and Reasoning Lab</strong></a> to develop EgoMimic, an algorithmic framework that leverages data from egocentric videos.</p><p>Meta’s Ego4D dataset inspired Kareer’s project. The benchmark dataset, released in 2023, consists of first-person videos of humans performing daily activities. This open-source dataset trains AI models from a first-person human perspective.</p><p>“When I looked at Ego4D, I saw a dataset that’s the same as all the large robot datasets we’re trying to collect, except it’s with humans,” Kareer said. “You just wear a pair of glasses, and you go do things. It doesn’t need to come from the robot. It should come from something more scalable and passively generated, which is us.”</p><p>Kareer acquired a pair of Meta’s Project Aria research glasses, which contain a rich sensor suite and can record video from a first-person perspective through external RGB and SLAM cameras.</p><p>Kareer recorded himself folding a shirt while wearing the glasses and repeated the process. He did the same with other tasks such as placing a toy in a bowl and groceries into a bag. Then, he constructed a humanoid robot with pincers for hands and attached the glasses to the top to mimic a first-person viewpoint.</p><p>The robot performed each task repeatedly for two hours. Kareer said building a traditional training algorithm would take days of teleoperating and recording robot sensory data. For his project, he only needed to gather a baseline of sensory data to ensure performance improvement.&nbsp;</p><p>Kareer bridged the gap between the two training sets with the EgoMimic algorithm. The robot’s task performance rating increased by as much as 400% across various tasks with just 90 minutes of recorded footage. It also showed the ability to perform these tasks in unseen environments.</p><p>If enough people wear Aria glasses or other smart glasses while performing daily tasks, they could create the passive data bank needed to train robots on a massive scale.</p><p>This type of data collection can enable nearly endless possibilities for roboticists to help humans achieve more in their everyday lives. Humanoid robots can be produced and trained at an industrial level and perform tasks the same way humans do.</p>
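<p>The sketch below illustrates the co-training idea in PyTorch: one policy network is supervised by human hand trajectories on egocentric data and by teleoperated actions on robot data. It reflects the general recipe only; the network, loss weights, and label choices are assumptions, not the EgoMimic implementation.</p><pre><code># Illustrative co-training in the spirit of EgoMimic (not the authors' code):
# a shared visual trunk consumes egocentric images; one head is supervised by
# human hand waypoints, another by robot end-effector actions.
import torch
import torch.nn as nn

class Policy(nn.Module):
    def __init__(self, action_dim=7):
        super().__init__()
        self.encoder = nn.Sequential(            # shared visual trunk
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.human_head = nn.Linear(32, 3)       # predicts a 3D hand waypoint
        self.robot_head = nn.Linear(32, action_dim)  # predicts a robot action

    def forward(self, img):
        z = self.encoder(img)
        return self.human_head(z), self.robot_head(z)

def cotrain_step(policy, opt, human_batch, robot_batch, human_weight=0.5):
    """One gradient step mixing cheap human video data with scarce robot data."""
    h_img, h_hand = human_batch    # egocentric frame, hand waypoint label
    r_img, r_act = robot_batch     # robot camera frame, teleop action label
    h_pred, _ = policy(h_img)
    _, r_pred = policy(r_img)
    loss = (human_weight * nn.functional.mse_loss(h_pred, h_hand)
            + nn.functional.mse_loss(r_pred, r_act))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Smoke test with random stand-in data.
policy = Policy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-4)
human_batch = (torch.randn(8, 3, 96, 96), torch.randn(8, 3))
robot_batch = (torch.randn(8, 3, 96, 96), torch.randn(8, 7))
print(cotrain_step(policy, opt, human_batch, robot_batch))
</code></pre><p>The single step above mixes the two supervision sources through one shared trunk; the published training details differ, but the key idea is the same, one policy trained on both human and robot data.</p>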
<p>“This work is most applicable to jobs that you can get a humanoid robot to do,” Kareer said. “In whatever industry we are allowed to collect egocentric data, we can develop humanoid robots.”</p><p>Kareer will present his paper on EgoMimic at the 2025 IEEE International Conference on Robotics and Automation (ICRA), which will take place from May 19 to 23 in Atlanta. The paper was co-authored by Xu and School of IC Assistant Professor <strong>Judy</strong> <strong>Hoffman</strong>, fellow Tech students <strong>Dhruv</strong> <strong>Patel</strong>, <strong>Ryan</strong> <strong>Punamiya</strong>, <strong>Pranay</strong> <strong>Mathur</strong>, and <strong>Shuo</strong> <strong>Cheng</strong>, and <strong>Chen</strong> <strong>Wang</strong>, a Ph.D. student at Stanford.</p>]]></body>  <author>Ben Snedeker</author>  <status>1</status>  <created>1739977213</created>  <gmt_created>2025-02-19 15:00:13</gmt_created>  <changed>1739996446</changed>  <gmt_changed>2025-02-19 20:20:46</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Inspired by a dataset created by Meta, a Georgia Tech Ph.D. student is bringing a new perspective to robotics training.]]></teaser>  <type>news</type>  <sentence><![CDATA[Inspired by a dataset created by Meta, a Georgia Tech Ph.D. student is bringing a new perspective to robotics training.]]></sentence>  <summary><![CDATA[<p>Inspired by a dataset created by Meta, a Georgia Tech Ph.D. student is bringing a new perspective to robotics training.</p>]]></summary>  <dateline>2025-02-19T00:00:00-05:00</dateline>  <iso_dateline>2025-02-19T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-02-19 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Ben Snedeker, Communication Manager</p><p>Georgia Tech College of Computing</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>676332</item>      </media>  <hg_media>          <item>          <nid>676332</nid>          <type>image</type>          <title><![CDATA[Georgia Tech Ph.D. student Simar Kareer is revolutionizing how robots are trained.]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Simar Kareer_86A7668 (1).jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/02/19/Simar%20Kareer_86A7668%20%281%29.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/02/19/Simar%20Kareer_86A7668%20%281%29.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/02/19/Simar%2520Kareer_86A7668%2520%25281%2529.jpg?itok=JwZua-cA]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Georgia Tech Ph.D.
student Simar Kareer is revolutionizing how robots are trained.]]></image_alt>                    <created>1739977597</created>          <gmt_created>2025-02-19 15:06:37</gmt_created>          <changed>1739977597</changed>          <gmt_changed>2025-02-19 15:06:37</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://youtu.be/ckGUsdFX9pU?si=b-J_aUjaDNpMpq2b]]></url>        <title><![CDATA[Project Aria Case Study: Introducing EgoMimic by the Georgia Institute of Technology]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="181991"><![CDATA[Georgia Tech News Center]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="680526">  <title><![CDATA[Securing Tomorrow’s Autonomous Robots Today]]></title>  <uid>36253</uid>  <body><![CDATA[<p>Men and women in California put their lives on the line when battling wildfires every year, but there is a future where machines powered by artificial intelligence are on the front lines, not firefighters.</p><p>However, this new generation of self-thinking robots would need security protocols to ensure they aren’t susceptible to hackers. To integrate such robots into society, they must come with assurances that they will behave safely around humans.</p><p>It begs the question: can you guarantee the safety of something that doesn’t exist yet? It’s something Assistant Professor Glen Chou hopes to accomplish by developing algorithms that will enable autonomous systems to learn and adapt while acting with safety and security assurances.&nbsp;</p><p>He plans to launch research initiatives, in collaboration with the School of Cybersecurity and Privacy and the Daniel Guggenheim School of Aerospace Engineering, to secure this new technological frontier as it develops.&nbsp;</p><p>“To operate in uncertain real-world environments, robots and other autonomous systems need to leverage and adapt a complex network of perception and control algorithms to turn sensor data into actions,” he said. “To obtain realistic assurances, we must do a joint safety and security analysis on these sensors and algorithms simultaneously, rather than one at a time.”</p><p>This end-to-end method would proactively look for flaws in the robot’s systems rather than wait for them to be exploited. This would lead to intrinsically robust robotic systems that can recover from failures.</p><p>Chou said this research will be useful in other domains, including advanced space exploration. 
If a space rover is sent to one of Saturn’s moons, for example, it needs to be able to act and think independently of scientists on Earth.&nbsp;</p><p>Aside from fighting fires and exploring space, this technology could perform maintenance in nuclear reactors, automatically maintain the power grid, and make autonomous surgery safer. It could also bring assistive robots into the home, enabling higher standards of care.&nbsp;</p><p>This is a challenging domain where safety, security, and privacy concerns are paramount due to frequent, close contact with humans.</p><p>This will start in the newly established Trustworthy Robotics Lab at Georgia Tech, which Chou directs. He and his Ph.D. students will design principled algorithms that enable general-purpose robots and autonomous systems to operate capably, safely, and securely with humans while remaining resilient to real-world failures and uncertainty.</p><p>Chou earned dual bachelor’s degrees in electrical engineering and computer sciences as well as mechanical engineering from the University of California, Berkeley, in 2017, and a master’s and Ph.D. in electrical and computer engineering from the University of Michigan in 2019 and 2022, respectively. He was a postdoc at the MIT Computer Science &amp; Artificial Intelligence Laboratory before joining Georgia Tech in November 2024. He received National Defense Science and Engineering Graduate and NSF Graduate Research fellowships, and was named a Robotics: Science and Systems Pioneer in 2022.</p>]]></body>  <author>John Popham</author>  <status>1</status>  <created>1739799760</created>  <gmt_created>2025-02-17 13:42:40</gmt_created>  <changed>1739800381</changed>  <gmt_changed>2025-02-17 13:53:01</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Assistant Professor Glen Chou is leading research to ensure the security and safety of future autonomous robots, which could one day fight wildfires, explore space, and assist in critical environments like nuclear reactors and hospitals.]]></teaser>  <type>news</type>  <sentence><![CDATA[Assistant Professor Glen Chou is leading research to ensure the security and safety of future autonomous robots, which could one day fight wildfires, explore space, and assist in critical environments like nuclear reactors and hospitals.]]></sentence>  <summary><![CDATA[<p>Assistant Professor Glen Chou is leading research to ensure the security and safety of future autonomous robots, which could one day fight wildfires, explore space, and assist in critical environments like nuclear reactors and hospitals. His work at Georgia Tech’s Trustworthy Robotics Lab focuses on developing algorithms that allow robots to learn, adapt, and operate securely in uncertain real-world conditions. By integrating safety and security analyses, Chou aims to create resilient robotic systems that can proactively address vulnerabilities.
His research, conducted in collaboration with cybersecurity and aerospace engineering experts, could revolutionize autonomous technology across multiple domains.</p>]]></summary>  <dateline>2025-02-14T00:00:00-05:00</dateline>  <iso_dateline>2025-02-14T00:00:00-05:00</iso_dateline>  <gmt_dateline>2025-02-14 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jpopham3@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<div><p>John (JP) Popham&nbsp;<br>Communications Officer II&nbsp;<br>College of Computing | School of Cybersecurity and Privacy</p></div>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>676301</item>      </media>  <hg_media>          <item>          <nid>676301</nid>          <type>image</type>          <title><![CDATA[Glen Header Image.jpeg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Glen Header Image.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/2025/02/17/Glen%20Header%20Image.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2025/02/17/Glen%20Header%20Image.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2025/02/17/Glen%2520Header%2520Image.jpeg?itok=RpD7xXA_]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Man writing on glass with a marker ]]></image_alt>                    <created>1739799782</created>          <gmt_created>2025-02-17 13:43:02</gmt_created>          <changed>1739799782</changed>          <gmt_changed>2025-02-17 13:43:02</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="660367"><![CDATA[School of Cybersecurity and Privacy]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="187991"><![CDATA[go-robotics]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="188776"><![CDATA[go-research]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="182941"><![CDATA[cc-research; ic-cybersecurity; ic-hcc]]></keyword>          <keyword tid="1404"><![CDATA[Cybersecurity]]></keyword>          <keyword tid="181920"><![CDATA[cc-research; ic-ai-ml; ic-robotics]]></keyword>          <keyword tid="182191"><![CDATA[areospace systems analysis]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term 
tid="145171"><![CDATA[Cybersecurity]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>          <term tid="193657"><![CDATA[Space Research Initiative]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="675467">  <title><![CDATA[Using Deep Learning Techniques to Improve Liver Disease Diagnosis and Treatment]]></title>  <uid>27863</uid>  <body><![CDATA[<p>Hepatic, or liver, disease affects more than 100 million people in the U.S. About 4.5 million adults (1.8%) have been diagnosed with liver disease, but it is estimated that between 80 and 100 million adults in the U.S. have undiagnosed fatty liver disease in varying stages. Over time, undiagnosed and untreated hepatic diseases can lead to cirrhosis, a severe scarring of the liver that cannot be reversed.&nbsp;</p><p>Most hepatic diseases are chronic conditions that will be present over the life of the patient, but early detection improves overall health and the ability to manage specific conditions over time. Additionally, assessing patients over time allows for effective treatments to be adjusted as necessary. The standard protocol for diagnosis, as well as follow-up tissue assessment, is a biopsy after the return of an abnormal blood test, but biopsies are time-consuming and pose risks for the patient. Several non-invasive imaging techniques have been developed to assess the stiffness of liver tissue, an indication of scarring, including magnetic resonance elastography (MRE).</p><p>MRE combines elements of ultrasound and MRI to create a visual map showing gradients of stiffness throughout the liver and is increasingly used to diagnose hepatic issues. MRE exams, however, can fail for many reasons, including patient motion, patient physiology, imaging issues, and mechanical issues such as improper wave generation or propagation in the liver. Determining the success of MRE exams depends on visual inspection by technologists and radiologists. With increasing work demands and workforce shortages, providing an accurate, automated way to classify image quality will create a streamlined approach and reduce the need for repeat scans.&nbsp;</p><p>Professor&nbsp;<a href="https://www.biorobotics.gatech.edu/wp/">Jun Ueda</a> in the George W. Woodruff School of Mechanical Engineering and robotics Ph.D. student Heriberto Nieves, working with a team from the Icahn School of Medicine at Mount Sinai, have successfully applied deep learning techniques for accurate, automated quality control image assessment. The research,&nbsp;<a href="https://onlinelibrary.wiley.com/doi/10.1002/jmri.29490">“Deep Learning-Enabled Automated Quality Control for Liver MR Elastography: Initial Results,”</a> was published in the <em>Journal of Magnetic Resonance Imaging</em>.</p><p>The team trained five deep learning models; the best-performing ensemble achieved 92% accuracy on retrospective MRE images of patients with varied liver stiffnesses, and the analysis returned results within seconds. This rapid feedback allows the technologist to adjust hardware or patient orientation and re-scan in a single session, rather than requiring patients to return for costly and time-consuming re-scans due to low-quality initial images.</p>
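<p>As an illustration of the ensemble idea (the study’s actual architectures are not described here), the minimal sketch below averages the softmax outputs of several small classifiers to produce a single quality label; the networks, class count, and stand-in data are assumptions.</p><pre><code># Illustrative sketch of ensemble-based quality control, not the paper's code:
# several classifiers vote by averaging their softmax probabilities.
import torch
import torch.nn as nn

NUM_CLASSES = 2  # assumed binary label: acceptable vs. failed MRE image

def make_cnn():
    """A small stand-in CNN classifier; the study would use larger models."""
    return nn.Sequential(
        nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, NUM_CLASSES))

@torch.no_grad()
def ensemble_predict(models, image):
    """Average the softmax outputs of all models and take the argmax."""
    probs = torch.stack([m(image).softmax(dim=-1) for m in models]).mean(dim=0)
    return int(probs.argmax(dim=-1))

models = [make_cnn().eval() for _ in range(5)]   # five-model ensemble
mre_slice = torch.randn(1, 1, 128, 128)          # stand-in MRE image tensor
print("predicted quality class:", ensemble_predict(models, mre_slice))
</code></pre>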
<p>This new research is a step toward streamlining the review pipeline for MRE using deep learning techniques, which have remained underexplored compared to other medical imaging modalities. The research also provides a helpful baseline for future avenues of inquiry, such as assessing the health of the spleen or kidneys. It may also be applied to automated image quality control for monitoring non-hepatic conditions, such as breast cancer or muscular dystrophy, in which tissue stiffness is an indicator of initial health and disease progression. Ueda, Nieves, and their team hope to test these models on Siemens Healthineers magnetic resonance scanners within the next year.</p><p><strong>Publication</strong><br>Nieves-Vazquez, H.A., Ozkaya, E., Meinhold, W., Geahchan, A., Bane, O., Ueda, J. and Taouli, B. (2024), Deep Learning-Enabled Automated Quality Control for Liver MR Elastography: Initial Results. J Magn Reson Imaging.&nbsp;<a href="https://doi.org/10.1002/jmri.29490">https://doi.org/10.1002/jmri.29490</a></p><p><strong>Prior Work</strong>&nbsp;<br><a href="https://research.gatech.edu/robotically-precise-diagnostics-and-therapeutics-degenerative-disc-disorder">Robotically Precise Diagnostics and Therapeutics for Degenerative Disc Disorder</a></p><p><strong>Related Material</strong><br><a href="https://onlinelibrary.wiley.com/doi/10.1002/jmri.29492">Editorial for “Deep Learning-Enabled Automated Quality Control for Liver MR Elastography: Initial Results”</a></p>]]></body>  <author>Christa Ernst</author>  <status>1</status>  <created>1721072004</created>  <gmt_created>2024-07-15 19:33:24</gmt_created>  <changed>1721229620</changed>  <gmt_changed>2024-07-17 15:20:20</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[With increasing work demands and workforce shortages, providing an accurate, automated way to classify image quality will create a streamlined approach and reduce the need for repeat scans. ]]></teaser>  <type>news</type>  <sentence><![CDATA[With increasing work demands and workforce shortages, providing an accurate, automated way to classify image quality will create a streamlined approach and reduce the need for repeat scans. ]]></sentence>  <summary><![CDATA[<p>Professor&nbsp;<a href="https://www.biorobotics.gatech.edu/wp/">Jun Ueda</a> in the George W. Woodruff School of Mechanical Engineering and robotics Ph.D. student Heriberto Nieves, working with a team from the Icahn School of Medicine at Mount Sinai, have successfully applied deep learning techniques for accurate, automated quality control image assessment.&nbsp;</p>]]></summary>  <dateline>2024-07-15T00:00:00-04:00</dateline>  <iso_dateline>2024-07-15T00:00:00-04:00</iso_dateline>  <gmt_dateline>2024-07-15 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[christa.ernst@research.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Christa M.
Ernst |&nbsp;</p><p><strong>Research Communications Program Manager |&nbsp;</strong></p><p><strong>Topic Expertise: Robotics, Data Sciences, Semiconductor Design &amp; Fab |&nbsp;</strong></p><p><a href="https://research.gatech.edu/" rel="noopener noreferrer" target="_blank"><strong>Research @ the Georgia Institute of Technology</strong></a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>674351</item>      </media>  <hg_media>          <item>          <nid>674351</nid>          <type>image</type>          <title><![CDATA[Ueda MRE News]]></title>          <body><![CDATA[<p>Professor <a href="https://www.biorobotics.gatech.edu/wp/">Jun Ueda</a> in the George W. Woodruff School of Mechanical Engineering and robotics Ph.D. student Heriberto Nieves.</p>]]></body>                      <image_name><![CDATA[Heriberto and Ueda DL-MRE 6 half sized.png]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/07/15/Heriberto%20and%20Ueda%20DL-MRE%206%20half%20sized.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/07/15/Heriberto%20and%20Ueda%20DL-MRE%206%20half%20sized.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/07/15/Heriberto%2520and%2520Ueda%2520DL-MRE%25206%2520half%2520sized.png?itok=rAgP2eec]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Professor Jun Ueda in the George W. Woodruff School of Mechanical Engineering and robotics Ph.D. student Heriberto Nieves.]]></image_alt>                    <created>1721071536</created>          <gmt_created>2024-07-15 19:25:36</gmt_created>          <changed>1721071827</changed>          <gmt_changed>2024-07-15 19:30:27</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>          <group id="1292"><![CDATA[Parker H. Petit Institute for Bioengineering and Bioscience (IBB)]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="81491"><![CDATA[Institute for Robotics and Intelligent Machines (IRIM)]]></keyword>          <keyword tid="11689"><![CDATA[Institute for Bioengineeirng and Bioscience]]></keyword>          <keyword tid="594"><![CDATA[college of engineering]]></keyword>          <keyword tid="98751"><![CDATA[College of Engineering; George W. 
Woodruff School of Mechanical Engineering]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="9540"><![CDATA[Bioengineering and Bioscience]]></keyword>          <keyword tid="97611"><![CDATA[research news]]></keyword>          <keyword tid="188087"><![CDATA[go-irim]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="187423"><![CDATA[go-bio]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="675021">  <title><![CDATA[Ph.D. Student Wins Best Paper at Robotics Conference]]></title>  <uid>36530</uid>  <body><![CDATA[<p>Ask a person to find a frying pan, and they will most likely go to the kitchen. Ask a robot to do the same, and you may get numerous responses, depending on how the robot is trained.</p><p>Since humans often associate objects in a home with the room they are in, Naoki Yokoyama thinks robots that navigate human environments to perform assistive tasks should mimic that reasoning.</p><p>Roboticists have employed natural language models to help robots mimic human reasoning over the past few years. However, Yokoyama, a Ph.D. student in robotics, said these models create a “bottleneck” that prevents agents from picking up on visual cues such as room type, size, décor, and lighting.&nbsp;</p><p>Yokoyama presented a new framework for semantic reasoning at the Institute of Electrical and Electronics Engineers (IEEE) <a href="https://www.ieee-ras.org/conferences-workshops/fully-sponsored/icra"><strong>International Conference on Robotics and Automation</strong></a> (ICRA) last month in Yokohama, Japan. ICRA is the world’s largest robotics conference.</p><p>Yokoyama earned a best paper award in the Cognitive Robotics category with his <a href="http://naoki.io/portfolio/vlfm"><strong>Vision-Language Frontier Maps (VLFM) proposal</strong></a>.</p><p>Assistant Professor Sehoon Ha and Associate Professor Dhruv Batra from the School of Interactive Computing advised Yokoyama on the paper. Yokoyama authored the paper while interning at Boston Dynamics’ <a href="https://theaiinstitute.com/"><strong>AI Institute</strong></a>.</p><p>“I think the cognitive robotics category represents a significant portion of submissions to ICRA nowadays,” said Yokoyama, whose family is from Japan. “I’m grateful that our work is being recognized among the best in this field.”</p><p>Instead of natural language models, Yokoyama used a renowned vision-language model called BLIP-2 and tested it on a Boston Dynamics “Spot” robot in home and office environments.</p><p>“We rely on models that have been trained on vast amounts of data collected from the web,” Yokoyama said. “That allows us to use models with common sense reasoning and world knowledge. It’s not limited to a typical robot learning environment.”</p><h6><strong>What is BLIP-2?</strong></h6><p>BLIP-2 matches images to text by assigning a score that evaluates how well the user input text describes the content of an image. The model removes the need for the robot to use object detectors and language models.&nbsp;</p>
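<p>To preview how these scores are used in the value map described next, the minimal sketch below ranks candidate exploration frontiers by an image-text matching score for a prompt naming the target object. The <code>blip2_itm_score</code> helper, the <code>Frontier</code> type, and the prompt wording are hypothetical placeholders, not code from the paper.</p><pre><code># Hypothetical sketch of VLFM-style frontier scoring (not the paper's code).
from dataclasses import dataclass
import random

@dataclass
class Frontier:
    x: float
    y: float
    image: object  # RGB view looking toward this frontier (placeholder)

def blip2_itm_score(image, text):
    """Placeholder: a real system would query a BLIP-2 image-text matching
    head here and return its similarity score for (image, text)."""
    return random.random()  # stand-in value in [0, 1]

def pick_next_frontier(frontiers, target):
    # Score each frontier's view against a prompt naming the target object,
    # then drive toward the highest-value frontier.
    prompt = "a photo of a " + target  # prompt wording is an assumption
    return max(frontiers, key=lambda f: blip2_itm_score(f.image, prompt))

frontiers = [Frontier(1.0, 2.0, None), Frontier(3.5, 0.5, None)]
best = pick_next_frontier(frontiers, "potted plant")
print("explore frontier at", (best.x, best.y))
</code></pre><p>In a real system, the placeholder would call an actual BLIP-2 matching head, and the per-view scores would be fused into a spatial value map rather than compared one frontier at a time.</p>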
<p>The model removes the need for the robot to use object detectors and language models.&nbsp;</p><p>Instead, the robot uses BLIP-2 to extract semantic values from RGB images with a text prompt that includes the target object.&nbsp;</p><p>BLIP-2 then teaches the robot to recognize the room type, distinguishing the living room from the bathroom and the kitchen. The robot learns to associate certain objects with the specific rooms where it is likely to find them.</p><p>From here, the robot creates a value map to determine the most likely locations for a target object, Yokoyama said.</p><p>Yokoyama said this is a step forward for intelligent home assistive robots, enabling users to find objects — like missing keys — in their homes without knowing an item’s location.&nbsp;</p><p>“If you’re looking for a pair of scissors, the robot can automatically figure out it should head to the kitchen or the office,” he said. “Even if the scissors are in an unusual place, it uses semantic reasoning to work through each room from most probable location to least likely.”</p><p>He added that the benefit of using a VLM instead of an object detector is that the robot includes visual cues in its reasoning.</p><p>“You can look at a room in an apartment, and there are so many things an object detector wouldn’t tell you about that room that would be informative,” he said. “You don’t want to limit yourself to a textual description or a list of object classes because you’re missing many semantic visual cues.”</p><p>While other VLMs exist, Yokoyama chose BLIP-2 because the model:</p><ul><li>Accepts any text length and isn’t limited to a small set of objects or categories.</li><li>Lets the robot leverage pre-training on vast amounts of data collected from the internet.</li><li>Has proven results that enable accurate image-to-text matching.</li></ul><h6><strong>Home, Office, and Beyond</strong></h6><p>Yokoyama also tested the Spot robot in a more challenging office environment. Office spaces tend to be more homogeneous and harder to distinguish from one another than rooms in a home.&nbsp;</p><p>“We showed a few cases in which the robot will still work,” Yokoyama said. “We tell it to find a microwave, and it searches for the kitchen. We tell it to find a potted plant, and it moves toward an area with windows because, based on what it knows from BLIP-2, that’s the most likely place to find the plant.”</p><p>Yokoyama said that as VLMs continue to improve, so will robot navigation. The growing number of capable VLMs has pushed robot navigation research away from relying solely on traditional physical simulation.</p><p>“It shows how important it is to keep an eye on the work being done in computer vision and natural language processing for getting robots to perform tasks more efficiently,” he said. “The current research direction in robot learning is moving toward more intelligent and higher-level reasoning.
These foundation models are going to play a key role in that.”</p><p><em>Top photo by Kevin Beasley/College of Computing.</em></p>]]></body>  <author>Nathan Deen</author>  <status>1</status>  <created>1717684006</created>  <gmt_created>2024-06-06 14:26:46</gmt_created>  <changed>1717684832</changed>  <gmt_changed>2024-06-06 14:40:32</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Yokoyama presented a new framework for semantic reasoning for robots at the IEEE International Conference on Robotics and Automation, where he won best paper in the Cognitive Robotics category.]]></teaser>  <type>news</type>  <sentence><![CDATA[Yokoyama presented a new framework for semantic reasoning for robots at the IEEE International Conference on Robotics and Automation, where he won best paper in the Cognitive Robotics category.]]></sentence>  <summary><![CDATA[<p>Roboticists have employed natural language models to help robots mimic human reasoning over the past few years. However, Yokoyama, a Ph.D. student in robotics, said these models create a “bottleneck” that prevents agents from picking up on visual cues such as room type, size, décor, and lighting.&nbsp;</p><p>Yokoyama presented a new framework for semantic reasoning at the Institute of Electrical and Electronic Engineers (IEEE) <a href="https://www.ieee-ras.org/conferences-workshops/fully-sponsored/icra"><strong>International Conference on Robotics and Automation</strong></a> (ICRA) last month in Yokohama, Japan. ICRA is the world’s largest robotics conference.</p><p>Yokoyama earned a best paper award in the Cognitive Robotics category with his <a href="http://naoki.io/portfolio/vlfm"><strong>Vision-Language Frontier Maps (VLFM) proposal</strong></a>.</p>]]></summary>  <dateline>2024-06-06T00:00:00-04:00</dateline>  <iso_dateline>2024-06-06T00:00:00-04:00</iso_dateline>  <gmt_dateline>2024-06-06 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[ndeen6@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Nathan Deen</p><p>Communications Officer</p><p>School of Interactive Computing</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>674146</item>      </media>  <hg_media>          <item>          <nid>674146</nid>          <type>image</type>          <title><![CDATA[208A9469.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[208A9469.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/06/06/208A9469.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/06/06/208A9469.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/06/06/208A9469.jpg?itok=xIiN0P1I]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Three students kneeling around a spot robot]]></image_alt>                    <created>1717684031</created>          <gmt_created>2024-06-06 14:27:11</gmt_created>          <changed>1717684031</changed>          <gmt_changed>2024-06-06 14:27:11</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category 
tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>          <category tid="193157"><![CDATA[Student Honors and Achievements]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>          <term tid="193157"><![CDATA[Student Honors and Achievements]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>      </news_terms>  <keywords>          <keyword tid="192863"><![CDATA[go-ai]]></keyword>          <keyword tid="187812"><![CDATA[artificial intelligence (AI)]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="193655"><![CDATA[Artificial Intelligence at Georgia Tech]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="674367">  <title><![CDATA[Why Can’t Robots Outrun Animals?]]></title>  <uid>35575</uid>  <body><![CDATA[<p>Robots that can run, jump, and even talk have shifted from the stuff of science fiction to reality in the past few decades. Yet even in robots specialized for specific movements like running, animals are still able to outmaneuver the most advanced robotic developments.&nbsp;</p><p>Georgia Tech’s <a href="https://physics.gatech.edu/user/simon-sponberg" rel="noreferrer noopener" target="_blank">Simon Sponberg</a> recently collaborated with researchers at the <a href="https://www.washington.edu/" rel="noreferrer noopener" target="_blank">University of Washington</a>, <a href="https://www.sfu.ca/" rel="noreferrer noopener" target="_blank">Simon Fraser University</a>, <a href="https://www.colorado.edu/" rel="noreferrer noopener" target="_blank">University of Colorado Boulder</a>, and <a href="https://www.sri.com/" rel="noreferrer noopener" target="_blank">Stanford Research Institute</a> to answer one deceptively complex question: Why can’t robots outrun animals?&nbsp;</p><p>“This work is about trying to understand how, despite have some really amazing robots, there still seems to be a gulf between the capabilities of animal movement and what we can engineer,” says Sponberg, who is Dunn Family Associate Professor in the <a href="https://physics.gatech.edu/" rel="noreferrer noopener" target="_blank">School of Physics</a> and <a href="https://biosciences.gatech.edu/" rel="noreferrer noopener" target="_blank">School of Biological Sciences</a>.&nbsp;</p><p>Recently published in <em><a href="https://www.science.org/doi/10.1126/scirobotics.adi9754" rel="noreferrer noopener" target="_blank">Science Robotics</a>,</em> their study systematically examines a suite of biological and robotic runners to figure out how to further advance our best robotic designs.&nbsp;</p><p>“In robotics design we are often very component focused — we are used to having to establish specifications for the parts that we need and then finding the best component solution,” said Sponberg, who also serves on the executive committee for Georgia Tech's <a 
href="neuro.gatech.edu">Neuro Next Initiative</a>. “This is of course not how evolution works. We wondered if we systematically analyzed the performance of animals in the same component way that we design robots, if we might see an obvious gap.”&nbsp;</p><p>The gap turns out not to be in the function of individual robotic components, but rather the ability of those components to work together in the seamless way biological components do, highlighting a field of opportunity for new research in robotic development.&nbsp;</p><p>“This means that the frontier is not necessarily figuring out how to design better motors or sensors or controllers,” says Sponberg, “but rather how to integrate them together — this is where biology really excels.”&nbsp;</p><h4><strong>Read more about man versus machine and the future of bioinspired robotics <a href="https://www.ece.uw.edu/spotlight/why-animals-can-outrun-robots/">here</a>.</strong></h4>]]></body>  <author>adavidson38</author>  <status>1</status>  <created>1713987118</created>  <gmt_created>2024-04-24 19:31:58</gmt_created>  <changed>1714681523</changed>  <gmt_changed>2024-05-02 20:25:23</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech Researcher Simon Sponberg collaborates to ask why robotic advancements have yet to outpace animals — and look at what we can learn from biology to engineer new robotic designs.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech Researcher Simon Sponberg collaborates to ask why robotic advancements have yet to outpace animals — and look at what we can learn from biology to engineer new robotic designs.]]></sentence>  <summary><![CDATA[<p>Georgia Tech Researcher Simon Sponberg collaborates to ask why robotic advancements have yet to outpace animals — and look at what we can learn from biology to engineer new robotic designs.</p>]]></summary>  <dateline>2024-05-02T00:00:00-04:00</dateline>  <iso_dateline>2024-05-02T00:00:00-04:00</iso_dateline>  <gmt_dateline>2024-05-02 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Georgia Tech Researcher Collaborates to Advance Bioinspired Design]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[audra.davidson@research.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong><a href="mailto:audra.davidson@research.gatech.edu">Audra Davidson</a></strong><br />Research Communications Program Manager<br />Neuro Next Initiative</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>673838</item>      </media>  <hg_media>          <item>          <nid>673838</nid>          <type>image</type>          <title><![CDATA[mCLARI_Spider.jpg]]></title>          <body><![CDATA[<p>Can this small robot outrun a spider? Photo Credit: Animal Inspired Movement and Robotics Lab, CU Boulder.</p>]]></body>                      <image_name><![CDATA[mCLARI_Spider.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/04/24/mCLARI_Spider.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/04/24/mCLARI_Spider.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/04/24/mCLARI_Spider.jpg?itok=oXeE2GqY]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Can this small robot outrun a spider? 
Photo Credit: Animal Inspired Movement and Robotics Lab, CU Boulder.]]></image_alt>                    <created>1713987354</created>          <gmt_created>2024-04-24 19:35:54</gmt_created>          <changed>1713987354</changed>          <gmt_changed>2024-04-24 19:35:54</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://research.gatech.edu/georgia-tech-partners-15m-nsf-grant-explore-muscle-dynamics]]></url>        <title><![CDATA[Georgia Tech Partners on $15M NSF Grant to Explore Muscle Dynamics]]></title>      </link>          <link>        <url><![CDATA[https://research.gatech.edu/edge-georgia-tech-professors-awarded-curci-grants-emerging-bio-research-0]]></url>        <title><![CDATA[On The Edge: Georgia Tech Professors Awarded Curci Grants for Emerging Bio Research]]></title>      </link>          <link>        <url><![CDATA[https://research.gatech.edu/feature/ultrafast-flight]]></url>        <title><![CDATA[How Insects Evolved to Ultrafast Flight (And Back)]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="66220"><![CDATA[Neuro]]></group>          <group id="1292"><![CDATA[Parker H. Petit Institute for Bioengineering and Bioscience (IBB)]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="1275"><![CDATA[School of Biological Sciences]]></group>          <group id="126011"><![CDATA[School of Physics]]></group>      </groups>  <categories>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="188087"><![CDATA[go-irim]]></keyword>          <keyword tid="172970"><![CDATA[go-neuro]]></keyword>          <keyword tid="192253"><![CDATA[cos-neuro]]></keyword>          <keyword tid="187423"><![CDATA[go-bio]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="181469"><![CDATA[bioinspired design]]></keyword>          <keyword tid="193266"><![CDATA[cos-research]]></keyword>      </keywords>  <core_research_areas>          <term tid="193656"><![CDATA[Neuro Next Initiative]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="673986">  <title><![CDATA[Good Dog: LASSIE Spirit Learns to Walk on the Moon ]]></title>  <uid>34528</uid>  <body><![CDATA[<p><span><span><span><span><em><span>This story by Landon Hall was first published in the </span></em></span></span></span></span><a href="https://viterbischool.usc.edu/news/2024/04/teaching-robots-to-walk-on-the-moon-and-maybe-rescue-one-another/"><span><span><span><span><em><span><span><span>USC Viterbi School of Engineering 
newsroom</span></span></span></em></span></span></span></span></a><em>.</em></p><p><em>Georgia Tech alumna <strong>Feifei Qian</strong> (M.S. PHYS 2011, Ph.D. ECE 2015), an assistant professor of electrical and computer engineering at the USC Viterbi School of Engineering and School of Advanced Computing, leads the NASA LASSIE project alongside co-investigator <strong>Frances Rivera-Hernández</strong>, an assistant professor in the School of Earth and Atmospheric Sciences at Georgia Tech. <strong>Sharissa Thompson</strong>, a graduate student at Georgia Tech, is a student intern on the NASA Curiosity Rover project.</em></p><p>The Palmer Glacier on Oregon’s Mount Hood isn’t the Moon, but it’s a good place to practice.</p><p>Some 6,000 feet up the snow-capped mountain, located about 70 miles east of Portland, a multidisciplinary team from the University of Southern California, Texas A&amp;M University, Georgia Institute of Technology, Oregon State University, Temple University, the University of Pennsylvania, and NASA gathered to turn loose a four-legged robot named Spirit into the wild.</p><p>The team, which included engineers, cognitive scientists, geoscientists, and planetary scientists, field-tested Spirit as part of the LASSIE Project: Legged Autonomous Surface Science in Analog Environments. Spirit covered a variety of challenging terrains, using his spindly metal legs to amble over, across, and around shifting dirt, slushy snow, and boulders during five days of testing in summer 2023. Sometimes he expertly traversed the hillside, while at other moments he teetered and fell over. It was all part of the process of better understanding the substrate properties and learning to walk on these extreme terrains. The practice time Spirit logged produced data that will be used to train future robots for use on extraterrestrial surfaces, like Earth’s Moon and perhaps other planets in our solar system.</p><p>“A legged robot needs to be able to detect what is happening when it interacts with the ground underneath, and rapidly adjust its locomotion strategies accordingly,” says <strong>Feifei Qian</strong>, an assistant professor of electrical and computer engineering at the USC Viterbi School of Engineering and School of Advanced Computing, which is leading the project funded by NASA. “When the robot leg slips on ice or sinks into soft snow, it inspires us to look for new principles and strategies that can push the boundary of human knowledge and enable new technology. We learn and improve from the observed failures.”</p><p><strong><em>Watch this <a href="https://www.youtube.com/watch?v=wBTyelFFE1A">5-minute video</a> produced for the team by documentary filmmaker Sean Grasso.</em></strong></p><p>Spirit learns from every step.</p><p>“Similar to the way that when we walk on uneven surfaces as humans, we can sort of detect how the ground is shifting beneath our feet, a legged robot is capable of the exact same thing,” says <strong>Cristina Wilson</strong>, a cognitive scientist at Oregon State University.</p><p><strong>The more machines the merrier</strong></p><p>Qian’s group doesn’t intend to stop at just one robot wandering the wilderness alone. She and her former colleagues at Penn, <strong>Cynthia Sung</strong>, <strong>Mark Yim</strong>, <strong>Daniel Koditschek</strong>, and <strong>Douglas Jerolmack</strong>, received a two-year, $2 million grant from NASA for what they’re calling the TRUSSES Project: Temporarily, Robots Unite to Surmount Sandy Entrapments, Then Separate. They want to help the space agency put teams of robots on the Moon and have them work together on tasks, taking the knowledge they came in with, along with the data they collect on the mission, and communicating those details to each other.</p><p>“They would sense how the ground conditions are,” Qian says, “and then exchange that information with one another, and collectively form a map of locomotion risk estimation. The team of robots can then use this traversal risk map to inform their planetary explorations: ‘There is an extremely soft sand patch that might be high-risk for wheeled rovers. Come over here, this might be a safer area.’ ”</p><p>The robots in mind for this kind of work would be more than just Spirit: a wheeled rover (great for payload and long distances), a hexapedal robot (intermediate payload but better mobility than the wheeled rover), and dog-like robots such as the rugged version of Spirit (highest mobility, shorter distances). And here’s the coolest part of that research, the part that sounds like something the Transformers would do. Or at least a team of castaways on “Survivor”: If one got in a jam, made immovable by loose dirt or a rock or a ravine, his bot-mates would arrive, link together, and form a bridge or a pyramid to hoist their pal to safety. And then back to work.</p><p>“When they plan for the strategy to pull the robot up, they’ll decide what force to exert and what position the robot should go to, while also compiling the terrain information,” Qian says. “That’s the key idea of how to use these capabilities: to both prevent and recover from locomotion failures in extreme terrain.”</p><p><strong>Back to Mount Hood</strong></p><p>Spirit gets around a variety of natural environments to learn how to better move on challenging terrains. Qian has let him off his leash on Southern California beaches, and the multi-university team has field-tested him in the soft granules of White Sands National Park in New Mexico. But <a href="https://youtu.be/wBTyelFFE1A">the video</a> shot at Mount Hood shows just how otherworldly the landscape can be in these planetary-analog environments, giving Spirit plenty of opportunities to learn on Earth before potentially exploring other planets.</p><p>“You look around us, it would be very hard to drive up this,” <strong>Ryan Ewing</strong>, a geologist from NASA Johnson Space Center, <a href="https://youtu.be/wBTyelFFE1A?feature=shared">shares</a>. “But as a legged being, as humans, we can step around it easily. A dog could walk around it easily. So this project is the proving ground that we can enable new science and new mobility on environments that are like other planets.”</p><p>In fact, a dog is indeed frisking about: Howard, Wilson’s German shepherd, wandered about with the kind of agility Spirit could only dream of.</p><p>“We are going to observe how Howard moves in different types of snow and ice conditions,” Qian <a href="https://youtu.be/wBTyelFFE1A?feature=shared">says</a>. “What exactly, out of those combined motions, allows him to succeed on challenging terrain?”</p><p>The LASSIE Project calls for two more trips for Spirit: to Mount Hood this summer, and to White Sands next year. The TRUSSES team, from USC and Penn, also plans to visit White Sands next year with Spirit and the other, new, multitasking robots. Imagine WALL-E with friends.</p><p><strong>—</strong></p><p><strong><em>The NASA PSTAR (Planetary Science and Technology Through Analog Research) number for this project is 80NSSC22K1313.</em></strong></p>]]></body>  <author>jhunt7</author>  <status>1</status>  <created>1712241215</created>  <gmt_created>2024-04-04 14:33:35</gmt_created>  <changed>1712247832</changed>  <gmt_changed>2024-04-04 16:23:52</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers at Georgia Tech have teamed up with NASA and five peer institutions to teach dog-like robots to navigate craters of the Moon and other challenging planetary surfaces.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers at Georgia Tech have teamed up with NASA and five peer institutions to teach dog-like robots to navigate craters of the Moon and other challenging planetary surfaces.]]></sentence>  <summary><![CDATA[<p>Scientists at Georgia Tech have teamed up with the University of Southern California (USC), University of Pennsylvania, Texas A&amp;M, Oregon State, Temple University, and NASA Johnson Space Center to teach dog-like robots to navigate craters of the Moon and other challenging planetary surfaces in research funded by NASA.</p>]]></summary>  <dateline>2024-04-03T00:00:00-04:00</dateline>  <iso_dateline>2024-04-03T00:00:00-04:00</iso_dateline>  <gmt_dateline>2024-04-03 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Researchers at Georgia Tech have teamed up with NASA and five peer institutions to teach dog-like robots to navigate craters of the Moon and other challenging planetary surfaces.]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jess@cos.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:jess@cos.gatech.edu">Jess Hunt-Ralston</a><br />Director of Communications<br />College of Sciences at Georgia Tech</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>673614</item>          <item>673617</item>          <item>673615</item>          <item>673616</item>      </media>  <hg_media>          <item>          <nid>673614</nid>          <type>image</type>          <title><![CDATA[The LASSIE Project’s robot, dubbed Spirit, can “feel” and interpret surface force responses via leg-terrain interactions, assisting planetary scientists with data collection at Oregon’s Mount Hood, a lunar-analog site. (Justin Durner/LASSIE Project)]]></title>          <body><![CDATA[<p>The LASSIE Project’s robot, dubbed Spirit, can “feel” and interpret surface force responses via leg-terrain interactions, assisting planetary scientists with data collection at Oregon’s Mount Hood, a lunar-analog site.
(Justin Durner/LASSIE Project)</p>]]></body>                      <image_name><![CDATA[1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/04/04/1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/04/04/1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/04/04/1.jpg?itok=T1_D0waZ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[The LASSIE Project’s robot, dubbed Spirit, can “feel” and interpret surface force responses via leg-terrain interactions, assisting planetary scientists with data collection at Oregon’s Mount Hood, a lunar-analog site. (Justin Durner/LASSIE Project)]]></image_alt>                    <created>1712241534</created>          <gmt_created>2024-04-04 14:38:54</gmt_created>          <changed>1712241534</changed>          <gmt_changed>2024-04-04 14:38:54</gmt_changed>      </item>          <item>          <nid>673617</nid>          <type>image</type>          <title><![CDATA[The LASSIE Project Team — humans and robots — pictured at Mount Hood in summer 2023. (Justin Durner/LASSIE Project)]]></title>          <body><![CDATA[<p><span><span><span><span><span><span>The LASSIE Project Team — humans and robots — pictured at Mount Hood in summer 2023. (Justin Durner/LASSIE Project)</span></span></span></span></span></span></p>]]></body>                      <image_name><![CDATA[4.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/04/04/4.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/04/04/4.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/04/04/4.jpg?itok=sb7OzWnG]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[The LASSIE Project Team — humans and robots — pictured at Mount Hood in summer 2023. (Justin Durner/LASSIE Project)]]></image_alt>                    <created>1712241799</created>          <gmt_created>2024-04-04 14:43:19</gmt_created>          <changed>1712241799</changed>          <gmt_changed>2024-04-04 14:43:19</gmt_changed>      </item>          <item>          <nid>673615</nid>          <type>image</type>          <title><![CDATA[Georgia Tech alumna Feifei Qian (M.S. PHYS 2011, Ph.D. ECE 2015), an assistant professor of electrical and computer engineering at the USC Viterbi School of Engineering and School of Advanced Computing, is leading the project funded by NASA.]]></title>          <body><![CDATA[<p><span><span><span><span><span><span>Georgia Tech alumna Feifei Qian (M.S. PHYS 2011, Ph.D. 
ECE 2015), an assistant professor of electrical and computer engineering at the USC Viterbi School of Engineering and School of Advanced Computing, is leading the project funded by NASA.</span></span></span></span></span></span></p>]]></body>                      <image_name><![CDATA[2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/04/04/2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/04/04/2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/04/04/2.jpg?itok=PelfzTUN]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Georgia Tech alumna Feifei Qian (M.S. PHYS 2011, Ph.D. ECE 2015), an assistant professor of electrical and computer engineering at the USC Viterbi School of Engineering and School of Advanced Computing, is leading the project funded by NASA.]]></image_alt>                    <created>1712241625</created>          <gmt_created>2024-04-04 14:40:25</gmt_created>          <changed>1712241625</changed>          <gmt_changed>2024-04-04 14:40:25</gmt_changed>      </item>          <item>          <nid>673616</nid>          <type>image</type>          <title><![CDATA[Frances Rivera-Hernández, an assistant professor in the School of Earth and Atmospheric Sciences at Georgia Tech, is helping develop a new generation of robots and rovers that can handle difficult terrain on the Moon, Mars, and other space destinations.]]></title>          <body><![CDATA[<p><span><span><span><span><span><span>Frances Rivera-Hernández, an assistant professor in the School of Earth and Atmospheric Sciences at Georgia Tech, is helping develop a new generation of robots and rovers that can handle difficult terrain on the Moon, Mars, and other space destinations.</span></span></span></span></span></span></p>]]></body>                      <image_name><![CDATA[3.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/04/04/3.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/04/04/3.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/04/04/3.jpg?itok=Zsu38Cst]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Frances Rivera-Hernández, an assistant professor in the School of Earth and Atmospheric Sciences at Georgia Tech, is helping develop a new generation of robots and rovers that can handle difficult terrain on the Moon, Mars, and other space destinations.]]></image_alt>                    <created>1712241670</created>          <gmt_created>2024-04-04 14:41:10</gmt_created>          <changed>1712241670</changed>          <gmt_changed>2024-04-04 14:41:10</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://cos.gatech.edu/news/frances-rivera-hernandez-lands-nasa-and-scialog-grants-planetary-research-signatures-life]]></url>        <title><![CDATA[Frances Rivera-Hernández Lands NASA and Scialog Grants for Planetary Research, Signatures of Life]]></title>      </link>          <link>        <url><![CDATA[https://viterbischool.usc.edu/news/2024/04/teaching-robots-to-walk-on-the-moon-and-maybe-rescue-one-another/]]></url>        <title><![CDATA[Teaching robots to walk on the moon, and maybe rescue one another]]></title>      </link>    
      <link>        <url><![CDATA[https://today.tamu.edu/2024/04/03/practice-makes-perfect-teaching-robots-to-walk-on-the-moon/]]></url>        <title><![CDATA[Practice Makes Perfect: Teaching Robots To Walk On The Moon]]></title>      </link>          <link>        <url><![CDATA[https://ntrs.nasa.gov/citations/20230000243]]></url>        <title><![CDATA[NASA LASSIE: Legged Autonomous Surface Science In Analogue Environments]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="364801"><![CDATA[School of Earth and Atmospheric Sciences (EAS)]]></group>          <group id="126011"><![CDATA[School of Physics]]></group>      </groups>  <categories>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="192252"><![CDATA[cos-planetary]]></keyword>          <keyword tid="193266"><![CDATA[cos-research]]></keyword>          <keyword tid="408"><![CDATA[NASA]]></keyword>          <keyword tid="187439"><![CDATA[Frances Rivera-Hernandez]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="673305">  <title><![CDATA[IRIM Director Delivers Keynote at Hyundai Meta-Factory Conference]]></title>  <uid>27863</uid>  <body><![CDATA[<p><span><span><span>Hyundai Motor Group Innovation Center Singapore hosted the Meta-Factory Conference Jan. 23 – 24. It brought together academic leaders, industry experts, and manufacturing companies to discuss technology and the next generation of integrated manufacturing facilities. 
</span></span></span></p><p><span><span><span>Seth Hutchinson, executive director of the Institute for Robotics and Intelligent Machines at Georgia Tech, delivered a keynote lecture on “The Impacts of Today’s Robotics Innovation on the Relationship Between Robots and Their Human Co-Workers in Manufacturing Applications” — an overview of current state-of-the-art robotic technologies and future research trends for developing robotics aimed at interactions with human workers in manufacturing.</span></span></span></p><p><span><span><span>In addition to the keynote, Hutchinson also participated in the Hyundai Motor Group's Smart Factory Executive Technology Advisory Committee (E-TAC) panel on comprehensive future manufacturing directions and toured the new Hyundai Meta-Factory to observe how digital-twin technology is being applied in their human-robot collaborative manufacturing environment.</span></span></span></p><p><span><span><span>Hutchinson is a professor in the School of Interactive Computing. He received his Ph.D. from Purdue University in 1988, and in 1990 joined the University of Illinois Urbana-Champaign, where he was professor of electrical and computer engineering until 2017 and is currently professor emeritus. He has served on the Hyundai Motor Group's Smart Factory E-TAC since 2022.</span></span></span></p><p><span><span><span>Hyundai Motor Group Innovation Center Singapore is Hyundai Motor Group’s open innovation hub to support research and development of human-centered smart manufacturing processes using advanced technologies such as artificial intelligence, the Internet of Things, and robotics. </span></span></span></p><p>- Christa M. Ernst</p><p><span><span><span>Related Links</span></span></span></p><ul><li><span><span><span><a href="https://www.hyundai.com/sg/newsroom?tmplSeq=662&amp;curLv=1&amp;scrnKnd=W&amp;intzYn=Y&amp;dtlYn=Y&amp;lstYn=N&amp;finYn=N&amp;urlChgYn=N&amp;scrnPrmt=%7B%22page%22%3A%221%22%2C%22perPage%22%3A%229%22%2C%22bbSeq%22%3A%221057%22%2C%22bbClssCd%22%3A%22LN%22%2C%22srchExpYn%22%3A%22Y%22%2C%22srchRsvYn%22%3A%22N%22%2C%22srchRsvDateYn%22%3A%22Y%22%2C%22srchExpType%22%3A%22%22%2C%22srchDispYn%22%3A%22N%22%2C%22srchBbLrclCd%22%3A%22%22%7D&amp;caloUrl=&amp;fsc=&amp;exClrCd=&amp;itClrCd=&amp;acptChnnel=&amp;bbNo=&amp;inqLrcl=&amp;inqSmcl=&amp;mdlCd=&amp;mdlyCd=&amp;preCaloUrl=%2Fapi%2Fmktg%2FgetLclNewsList&amp;caloReqPrmt=%7B%22perPage%22%3A%2210%22%2C%22bbClssCd%22%3A%22LN%22%2C%22srchExpYn%22%3A%22Y%22%2C%22srchRsvYn%22%3A%22N%22%2C%22srchRsvDateYn%22%3A%22Y%22%2C%22srchExpTpCd%22%3A%223%22%7D&amp;addInfoPrmt=">Hyundai Newsroom Article: Link</a></span></span></span></li><li><span><span><span>Event Link: <a href="https://mfc2024.com/">https://mfc2024.com/</a></span></span></span></li><li><span><span><span>Keynote Speakers: <a href="https://mfc2024.com/keynotes/">https://mfc2024.com/keynotes/</a></span></span></span></li></ul>]]></body>  <author>Christa Ernst</author>  <status>1</status>  <created>1709570379</created>  <gmt_created>2024-03-04 16:39:39</gmt_created>  <changed>1709571315</changed>  <gmt_changed>2024-03-04 16:55:15</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Seth Hutchinson, executive director of the Institute for Robotics and Intelligent Machines at Georgia Tech, delivered a keynote lecture at the 2024 Hyundai Meta-Factory Conference.]]></teaser>  <type>news</type>  <sentence><![CDATA[Seth Hutchinson, executive director of the Institute for Robotics and Intelligent Machines at Georgia Tech, delivered a keynote 
lecture at the 2024 Hyundai Meta-Factory Conference.]]></sentence>  <summary><![CDATA[<p><span><span><span>Seth Hutchinson, executive director of the Institute for Robotics and Intelligent Machines at Georgia Tech, delivered a keynote lecture on “The Impacts of Today’s Robotics Innovation on the Relationship Between Robots and Their Human Co-Workers in Manufacturing Applications”</span></span></span></p>]]></summary>  <dateline>2024-03-04T00:00:00-05:00</dateline>  <iso_dateline>2024-03-04T00:00:00-05:00</iso_dateline>  <gmt_dateline>2024-03-04 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p><strong><span>Christa M. Ernst - Research Communications Program Manager</span></strong></p><p><span>christa.ernst@research.gatech.edu</span></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>673288</item>          <item>673289</item>      </media>  <hg_media>          <item>          <nid>673288</nid>          <type>image</type>          <title><![CDATA[Seth Hutchinson at Hyundai Meta Factory Conference - Keynote]]></title>          <body><![CDATA[<p>IRIM Director Seth Hutchinson at Hyundai Meta Factory Conference Delivering Keynote</p>]]></body>                      <image_name><![CDATA[Seth at MFC 1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/03/04/Seth%20at%20MFC%201.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/03/04/Seth%20at%20MFC%201.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/03/04/Seth%2520at%2520MFC%25201.jpg?itok=ArXdsGSi]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[IRIM Director Seth Hutchinson at Hyundai Meta Factory Conference Delivering Keynote]]></image_alt>                    <created>1709569864</created>          <gmt_created>2024-03-04 16:31:04</gmt_created>          <changed>1709569863</changed>          <gmt_changed>2024-03-04 16:31:03</gmt_changed>      </item>          <item>          <nid>673289</nid>          <type>image</type>          <title><![CDATA[Seth Hutchinson at Hyundai Meta Factory Conference - Panel]]></title>          <body><![CDATA[<p>IRIM Director Seth Hutchinson at Hyundai Meta Factory Conference on Panel Discussion</p>]]></body>                      <image_name><![CDATA[Seth at MFC.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2024/03/04/Seth%20at%20MFC.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2024/03/04/Seth%20at%20MFC.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2024/03/04/Seth%2520at%2520MFC.jpg?itok=EecQ001b]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[IRIM Director Seth Hutchinson at Hyundai Meta Factory Conference on Panel Discussion]]></image_alt>                    <created>1709570019</created>          <gmt_created>2024-03-04 16:33:39</gmt_created>          <changed>1709570018</changed>          <gmt_changed>2024-03-04 16:33:38</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://mfc2024.com/keynotes/]]></url>        <title><![CDATA[META Factory 
Conference 2024  Keynote Speakers]]></title>      </link>          <link>        <url><![CDATA[https://www.hyundai.com/sg/newsroom?tmplSeq=662&amp;curLv=1&amp;scrnKnd=W&amp;intzYn=Y&amp;dtlYn=Y&amp;lstYn=N&amp;finYn=N&amp;urlChgYn=N&amp;scrnPrmt=%7B%22page%22%3A%221%22%2C%22perPage%22%3A%229%22%2C%22bbSeq%22%3A%221057%22%2C%22bbClssCd%22%3A%22LN%22%2C%22srchExpYn%22%3A%22Y%22%2C%22srchRsvYn%22%3A%22N%22%2C%22srchRsvDateYn%22%3A%22Y%22%2C%22srchExpType%22%3A%22%22%2C%22srchDispYn%22%3A%22N%22%2C%22srchBbLrclCd%22%3A%22%22%7D&amp;caloUrl=&amp;fsc=&amp;exClrCd=&amp;itClrCd=&amp;acptChnnel=&amp;bbNo=&amp;inqLrcl=&amp;inqSmcl=&amp;mdlCd=&amp;mdlyCd=&amp;preCaloUrl=%2Fapi%2Fmktg%2FgetLclNewsList&amp;caloReqPrmt=%7B%22perPage%22%3A%2210%22%2C%22bbClssCd%22%3A%22LN%22%2C%22srchExpYn%22%3A%22Y%22%2C%22srchRsvYn%22%3A%22N%22%2C%22srchRsvDateYn%22%3A%22Y%22%2C%22srchExpTpCd%22%3A%223%22%7D&amp;addInfoPrmt=]]></url>        <title><![CDATA[Hyundai Newsroom - META Factory Conference 2024 ]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="545781"><![CDATA[Institute for Data Engineering and Science]]></group>          <group id="142761"><![CDATA[IRIM]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="131"><![CDATA[Economic Development and Policy]]></category>          <category tid="132"><![CDATA[Institute Leadership]]></category>          <category tid="152"><![CDATA[Robotics]]></category>          <category tid="134"><![CDATA[Student and Faculty]]></category>      </categories>  <news_terms>          <term tid="131"><![CDATA[Economic Development and Policy]]></term>          <term tid="132"><![CDATA[Institute Leadership]]></term>          <term tid="152"><![CDATA[Robotics]]></term>          <term tid="134"><![CDATA[Student and Faculty]]></term>      </news_terms>  <keywords>          <keyword tid="188087"><![CDATA[go-irim]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="186857"><![CDATA[go-gtmi]]></keyword>      </keywords>  <core_research_areas>          <term tid="39461"><![CDATA[Manufacturing, Trade, and Logistics]]></term>          <term tid="39511"><![CDATA[Public Service, Leadership, and Policy]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="670207">  <title><![CDATA[New Robot Learns Object Arrangement Preferences Without User Input]]></title>  <uid>32045</uid>  <body><![CDATA[<p>Kartik Ramachandruni knew he would need to find a unique approach to a populated research field.</p><p>With a handful of students and researchers at Georgia Tech looking to make breakthroughs in home robotics and object rearrangement, Ramachandruni searched for what others had overlooked.</p><p>“To an extent it was challenging, but it was also an opportunity to look at what people are already doing and to get more familiar with the literature,” said Ramachandruni, a Ph.D. student in Robotics. 
“(Associate) Professor (Sonia) Chernova helped me in deciding how to zone in on the problem and choose a unique perspective.”</p><p>Ramachandruni started exploring how a home robot might organize objects according to user preferences in a pantry or refrigerator without the prior instructions required by existing frameworks.</p><p>His persistence paid off. The 2023 <a href="https://ieee-iros.org">IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)</a> accepted Ramachandruni’s paper on a novel framework for a context-aware object rearrangement robot.</p><p>“Our goal is to build assistive robots that can perform these organizational tasks,” Ramachandruni said. “We want these assistive robots to model the user preferences for a better user experience. We don’t want the robot to come into someone’s home and be unaware of these preferences, rearrange their home in a different way, and cause the users to be distressed. At the same time, we don’t want to burden the user with explaining to the robot exactly how they want the robot to organize their home.”</p><p>Ramachandruni’s object rearrangement framework, Context-Aware Semantic Object Rearrangement (ConSOR), uses contextual clues from pre-arranged objects within its environment to mimic how a person might arrange objects in their kitchen.</p><p>“If our ConSOR robot rearranged your fridge, it would first observe where objects are already placed to understand how you prefer to organize your fridge,” he said. “The robot then places new objects in a way that does not disrupt your organizational style.”</p><p>The only prior knowledge the robot needs is how to recognize certain objects, such as a milk carton or a box of cereal. Ramachandruni said he pretrained the model on language datasets that map out objects hierarchically.</p><p>“The semantic knowledge database we use for training is a hierarchy of words similar to what you would see on a website such as Walmart, where objects are organized by shopping category,” he said. “We incorporate this commonsense knowledge about object categories to improve organizational performance.</p><p>“Embedding commonsense knowledge also means our robot can rearrange objects it hasn’t been trained on. Maybe it’s never seen a soft drink, but it generally knows what beverages are because it’s trained on another object that belongs to the beverage category.”</p><p>Ramachandruni tested ConSOR against two baselines. One used a score-based approach that learns how specific users group objects in an environment and then uses those scores to organize objects for them. The other used the GPT-3 large language model, prompted with minimal demonstrations and without fine-tuning, to determine the placement of new objects. ConSOR outperformed both baselines.</p>
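<p>As a rough sketch of the commonsense-hierarchy idea, an unseen object can still be placed sensibly by falling back to its parent category. The category names, structure, and function below are illustrative only; ConSOR learns this kind of knowledge from pretrained language datasets rather than a hand-coded table, but the fallback behavior is the same in spirit.</p><pre>
# Illustrative category hierarchy, in the spirit of the shopping-style
# taxonomy described above (not the actual dataset used for ConSOR).
HIERARCHY = {
    "cola": "beverage",
    "orange juice": "beverage",
    "cereal": "breakfast food",
    "oatmeal": "breakfast food",
}

def choose_shelf(new_object, observed_shelves):
    """Place a new object on the shelf whose items share its category.

    observed_shelves: dict mapping shelf name -> list of objects already there.
    """
    category = HIERARCHY.get(new_object)
    for shelf, items in observed_shelves.items():
        # A never-before-seen soft drink still lands with the other
        # beverages, because they share a parent category.
        if any(HIERARCHY.get(item) == category for item in items):
            return shelf
    return "counter"  # fall back: leave it out and ask the user

shelves = {"door shelf": ["orange juice"], "top shelf": ["cereal"]}
print(choose_shelf("cola", shelves))  # -> "door shelf"
</pre>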
<p>“GPT-3 was a baseline we were comparing against to see whether this huge body of common-sense knowledge can be used directly without any sort of framework,” Ramachandruni said. “The appeal of LLMs is you don’t need too much data; you just need a small data set to prompt it and give it an idea. We found the LLM did not have the correct inductive bias to correctly reason between different objects to perform this task.”</p><p>Ramachandruni said he anticipates there will be scenarios where user input is required. His future work on the project will include minimizing the effort required for the user to tell the robot their preferences in those scenarios.</p><p>“There are probably scenarios where it’s just easier to ask the user,” he said. “Let’s say the robot has multiple ideas of how to organize the home, and it’s having trouble deciding between them. Sometimes it’s just easier to ask the user to choose between the options. That would be a human-robot interaction addition to this framework.”</p><p>IROS is taking place this week in Detroit.</p>]]></body>  <author>Ben Snedeker</author>  <status>1</status>  <created>1696527361</created>  <gmt_created>2023-10-05 17:36:01</gmt_created>  <changed>1696598845</changed>  <gmt_changed>2023-10-06 13:27:25</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Award-winning research from Georgia Tech is empowering robots to use contextual clues to mimic how an individual organizes their pantry.]]></teaser>  <type>news</type>  <sentence><![CDATA[Award-winning research from Georgia Tech is empowering robots to use contextual clues to mimic how an individual organizes their pantry.]]></sentence>  <summary><![CDATA[<p>New research from Georgia Tech's School of Interactive Computing is empowering robots to use contextual clues to mimic how an individual might organize their pantry or refrigerator. The novel&nbsp;framework, accepted to this week's&nbsp;<a href="https://ieee-iros.org/">2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)</a>, allows home robots to organize objects in a user's environment based on contextual clues and user preferences, minimizing the need for explicit instructions.</p>]]></summary>  <dateline>2023-10-05T00:00:00-04:00</dateline>  <iso_dateline>2023-10-05T00:00:00-04:00</iso_dateline>  <gmt_dateline>2023-10-05 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Nathan Deen, Communications Officer I</p><p>School of Interactive Computing</p><p>nathan.deen@cc.gatech.edu</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>671966</item>          <item>671967</item>      </media>  <hg_media>          <item>          <nid>671966</nid>          <type>image</type>          <title><![CDATA[Kartik Ramachandruni-roboticsPhD-linkedin-crop-oct23.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Kartik Ramachandruni-roboticsPhD-linkedin-crop-oct23.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2023/10/06/Kartik%20Ramachandruni-roboticsPhD-linkedin-crop-oct23.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2023/10/06/Kartik%20Ramachandruni-roboticsPhD-linkedin-crop-oct23.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2023/10/06/Kartik%2520Ramachandruni-roboticsPhD-linkedin-crop-oct23.jpg?itok=X2IOB4IS]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Georgia Tech robotics Ph.D.
student Kartik Ramachandruni poses with a couple of his robot buddies.]]></image_alt>                    <created>1696598297</created>          <gmt_created>2023-10-06 13:18:17</gmt_created>          <changed>1696598297</changed>          <gmt_changed>2023-10-06 13:18:17</gmt_changed>      </item>          <item>          <nid>671967</nid>          <type>image</type>          <title><![CDATA[GT Computing Associate Professor Sonia Chernova_teaching-fall2023.jpg]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[GT Computing Associate Professor Sonia Chernova_teaching-fall2023.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/2023/10/06/GT%20Computing%20Associate%20Professor%20Sonia%20Chernova_teaching-fall2023.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2023/10/06/GT%20Computing%20Associate%20Professor%20Sonia%20Chernova_teaching-fall2023.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2023/10/06/GT%2520Computing%2520Associate%2520Professor%2520Sonia%2520Chernova_teaching-fall2023.jpg?itok=Ou9wmCxP]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Georgia Tech School of Interactive Computing Associate Professor Sonia Chernova presents during a recent robotics seminar.]]></image_alt>                    <created>1696598419</created>          <gmt_created>2023-10-06 13:20:19</gmt_created>          <changed>1696598419</changed>          <gmt_changed>2023-10-06 13:20:19</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="10199"><![CDATA[Daily Digest]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="669031">  <title><![CDATA[Novel Policy Allows Robots to Perform Interactive Tasks in Sequential Order]]></title>  <uid>32045</uid>  <body><![CDATA[<p>Georgia Tech Ph.D. student Niranjan Kumar created the Cascaded Compositional Residual Learning (CCRL) framework, which enables a quadrupedal robot to perform increasingly complex tasks without relearning basic motions, much as humans build on skills they already have. In one demonstration, the robot opened a heavy door by transferring energy through its body, a notable achievement in robotics.</p><p>CCRL functions as a “library” that lets the robot remember everything it has learned while performing simpler tasks. Each newly acquired skill is added to the library and leveraged for more complex skills.</p>
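<p>The following minimal sketch illustrates the cascaded-composition idea in the abstract: each new behavior is built as a weighted blend of frozen, previously learned skills plus a small residual correction. The class, skills, and numbers are illustrative stand-ins, not Kumar’s actual CCRL code.</p><pre>
import numpy as np

class SkillLibrary:
    """Frozen library of learned skills; a sketch of the CCRL idea only."""

    def __init__(self):
        self.skills = []  # each skill maps an observation to joint targets

    def add(self, skill):
        # Once added, a skill is never retrained ("no relearning motions").
        self.skills.append(skill)

    def compose(self, obs, weights, residual):
        # New behavior = weighted blend of existing skills + a learned residual.
        base = sum(w * s(obs) for w, s in zip(weights, self.skills))
        return base + residual

# Toy 12-joint quadruped observation and two primitive skills (illustrative).
obs = np.zeros(12)
walk = lambda o: np.full(12, 0.1)   # stand-in for a trained walking policy
turn = lambda o: np.full(12, 0.05)  # stand-in for a turning policy built on top

library = SkillLibrary()
library.add(walk)
library.add(turn)

# A new "push the door" skill reuses walking and turning, learning only the
# blend weights and a residual correction rather than the motion from scratch.
action = library.compose(obs, weights=[0.8, 0.2], residual=np.full(12, 0.01))
print(action.shape)  # (12,)
</pre><p>The design point the article emphasizes is that earlier skills stay frozen and are reused rather than retrained, so adding a new skill only adds the cost of learning how to combine what the robot already knows.</p>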
A turning motion, for instance, can be learned on top of walking while serving as the basis for navigation skills.</p><p>Kumar said CCRL has broken new ground on interactive navigation research. Interactive navigation is one of several navigation solutions that allow robots to navigate in the real world. These solutions include point navigation, which trains a robot to reach a point on a map, and object navigation, which teaches it to reach a selected object.</p><p>Interactive navigation requires a robot to reach a goal location while interacting with obstacles on the way, which has proven to be the most difficult for robots to learn.</p><p>The key to getting a robot to go from walking to pushing an object, Kumar said, lies in the joints and in the robot discovering the different types of motions it can make with them.</p><p>So far, Kumar’s policy encompasses 10 skills that a robot can learn and deploy. The number of skills it can learn on one policy depends on the hardware the programmer is using.</p><p>“It just takes longer to train as you keep adding more skills because now the policy also has to figure out how to incorporate all these skills in different situations,” he said. “But theoretically, you can keep adding more skills indefinitely as long as you have a powerful enough computer to run the policies.”</p><p>Kumar said he sees CCRL being useful for home assistant robots, which are required to be agile and limber to navigate around a cluttered household. He also said it could possibly serve as a guide dog for the visually impaired.</p><p>“If you have obstacles in front of someone who is visually impaired, the robot can just clear up the obstacles as the person is walking, open the door for them, and things like that,” he said.</p>]]></body>  <author>Ben Snedeker</author>  <status>1</status>  <created>1692362498</created>  <gmt_created>2023-08-18 12:41:38</gmt_created>  <changed>1693495581</changed>  <gmt_changed>2023-08-31 15:26:21</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A Georgia Tech Ph.D. student has created a new framework that enables a four-legged robot to perform increasingly complex tasks without relearning motions.]]></teaser>  <type>news</type>  <sentence><![CDATA[A Georgia Tech Ph.D. student has created a new framework that enables a four-legged robot to perform increasingly complex tasks without relearning motions.]]></sentence>  <summary><![CDATA[<p>Georgia Tech Ph.D.
student Niranjan Kumar has created the Cascaded Compositional Residual Learning (CCRL) framework, which enables a quadrupedal robot to perform increasingly complex tasks without relearning basic motions, much as humans build on skills they already have. The robot showcased the framework by using energy transfer to kick open a heavy door.</p>]]></summary>  <dateline>2023-08-18T00:00:00-04:00</dateline>  <iso_dateline>2023-08-18T00:00:00-04:00</iso_dateline>  <gmt_dateline>2023-08-18 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[nathan.deen@cc.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Nathan Deen</p><p>Communications Officer I</p><p>School of Interactive Computing</p><p>nathan.deen@cc.gatech.edu</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>671422</item>      </media>  <hg_media>          <item>          <nid>671422</nid>          <type>image</type>          <title><![CDATA[A four-legged robot at Georgia Tech opens door using sequential steps, but for the first time without having to relearn motions.]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[March_16 interactive reach_crop.png]]></image_name>            <image_path><![CDATA[/sites/default/files/2023/08/18/March_16%20interactive%20reach_crop.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/2023/08/18/March_16%20interactive%20reach_crop.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/2023/08/18/March_16%2520interactive%2520reach_crop.png?itok=UxBO2r5I]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[A four-legged robot at Georgia Tech opens door using sequential steps, but for the first time without having to relearn motions.]]></image_alt>                    <created>1692362511</created>          <gmt_created>2023-08-18 12:41:51</gmt_created>          <changed>1692362511</changed>          <gmt_changed>2023-08-18 12:41:51</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://youtu.be/vKk6NH6Gnug]]></url>        <title><![CDATA[Four-legged robot kicks open door at Georgia Tech]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="663073">  <title><![CDATA[GTRI's SEEDLab Ground Zero for Lunar Flashlight Project]]></title>  <uid>35832</uid>  <body><![CDATA[<p>The <a href="https://www.gtri.gatech.edu/newsroom/lunar-flashlight">Lunar Flashlight</a> is
small for a satellite, but could be big for research.</p><p>NASA plans to launch Lunar Flashlight, a small satellite (SmallSat) about the size of a briefcase that will use lasers to search for water ice inside craters at the Moon&rsquo;s unexplored South Pole.</p><p><a href="https://www.nasa.gov/feature/jpl/nasa-s-lunar-flashlight-ready-to-search-for-the-moon-s-water-ice">NASA says</a> that the Lunar Flashlight, traveling aboard a SpaceX Falcon 9 rocket, will take about three months to reach its &ldquo;science orbit.&rdquo; The launch itself has been pushed back several times by SpaceX; it is currently expected later this month.&nbsp;</p><p>The work on Earth leading up to the launch has already taken quite some time.</p><p>Georgia Tech and the Georgia Tech Research Institute (GTRI) have been instrumental in the development of the Lunar Flashlight mission. Researchers in Georgia Tech&rsquo;s School of Aerospace Engineering worked with NASA&rsquo;s Marshall Space Flight Center to develop the SmallSat&rsquo;s novel propulsion system. GTRI collaborated to assemble and test the Lunar Flashlight.</p><p>Seasoned researchers were assisted by students in their efforts.</p><p>One such student is Mary Kate Broadway, a student assistant in GTRI&rsquo;s Electro-Optical Systems Laboratory (EOSL), whose academic and professional experiences in modeling and fabrication were called upon to create a near 1:1 model of the Lunar Flashlight SmallSat.</p><p>Broadway, who is pursuing a bachelor&rsquo;s degree in mechatronics, robotics, and automation engineering at Kennesaw State University, used GTRI&rsquo;s <a href="https://webwise.gtri.gatech.edu/communities/working-groups/workplace-enhancement-working-group/seedlab">SEEDLab makerspace</a> to fashion the model based on designs produced by NASA.</p><p>&ldquo;I got the SolidWorks (a popular solid modeling computer-aided design and computer-aided engineering application) file, and then I started by taking all the SolidWorks parts, making the 3D printables, and then exporting them out as &lsquo;.stl&rsquo; files. Here (at the SEEDLab), I queued everything up and printed it,&rdquo; Broadway explains. She did &ldquo;all of the painting and the printing&rdquo; by herself. &ldquo;However, of course, the SEEDLab helpers (student assistants) all helped me whenever I had trouble.&rdquo;</p><p>Broadway, who already has a BFA in animation and digital arts from Florida State University, has the savvy to make use of the SEEDLab&rsquo;s wide variety of equipment.</p><p>For the Lunar Flashlight project, Broadway employed:</p><ul><li>An Ultimaker S5 FDM, a fused-filament fabrication 3D printer.</li><li>A FormLabs Cameo resin printer.</li><li>A Glowforge 3D laser printer and cutter.</li><li>Various traditional hand tools.</li></ul><p>Broadway employed traditional materials such as PET and PLA plastics for some of the more intricate parts of the model. The main body of the model is aluminum; Broadway collaborated with the Aero Maker Space on the Georgia Tech campus to have it pressed and fashioned to specifications with a waterjet cutting machine. To simulate working solar panels, Broadway designed printed vinyl labels.</p><p>Broadway&rsquo;s supervisor, EOSL Research Engineer Eric Brown, was initially contacted by Principal Research Engineer Jud Ready, Ph.D., who has worked extensively with NASA. Ready has been the liaison to NASA, reporting on Broadway&rsquo;s progress.</p><p>As of Nov.
4, just days before the Lunar Flashlight launch, Broadway was still engrossed in making final adjustments to the model, particularly the tight tolerances of its solar arrays. Broadway began working on the Lunar Flashlight project in April. Working part-time at the SEEDLab, she has spent dozens of hours&mdash;amounting to about a month of work&mdash;perfecting the device.</p><p><strong>Writer: Christopher Weems</strong></p><p><strong>Photos: Sean McNeil</strong></p><p>GTRI Communications<br />Georgia Tech Research Institute<br />Atlanta, Georgia USA</p><p>The&nbsp;<a href="https://gtri.gatech.edu/"><strong>Georgia Tech Research Institute (GTRI)</strong></a>&nbsp;is the nonprofit, applied research division of the Georgia Institute of Technology (Georgia Tech). Founded in 1934 as the Engineering Experiment Station, GTRI has grown to more than 2,800 employees, supporting eight laboratories in over 20 locations around the country and performing more than $700 million of problem-solving research annually for government and industry. GTRI&#39;s renowned researchers combine science, engineering, economics, policy, and technical expertise to solve complex problems for the U.S. federal government, state, and industry.</p>]]></body>  <author>Michelle Gowdy</author>  <status>1</status>  <created>1668085518</created>  <gmt_created>2022-11-10 13:05:18</gmt_created>  <changed>1668183677</changed>  <gmt_changed>2022-11-11 16:21:17</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Mary Kate Broadway, a student assistant in GTRI’s Electro-Optical Systems Laboratory (EOSL), drew on her academic and professional experience in modeling and fabrication to create a near 1:1 model of the Lunar Flashlight SmallSat.]]></teaser>  <type>news</type>  <sentence><![CDATA[Mary Kate Broadway, a student assistant in GTRI’s Electro-Optical Systems Laboratory (EOSL), drew on her academic and professional experience in modeling and fabrication to create a near 1:1 model of the Lunar Flashlight SmallSat.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2022-11-10T00:00:00-05:00</dateline>  <iso_dateline>2022-11-10T00:00:00-05:00</iso_dateline>  <gmt_dateline>2022-11-10 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[michelle.gowdy@gtri.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>(Interim) Director of Communications</p><p>Michelle Gowdy</p><p>Michelle.Gowdy@gtri.gatech.edu</p><p>404-407-8060</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>663071</item>      </media>  <hg_media>          <item>          <nid>663071</nid>          <type>image</type>          <title><![CDATA[GTRI's Mary Kate Broadway]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[2022_1104_image_Lunar Flashlight SEEDLab_Mary Kate Broadway_04.JPG]]></image_name>            <image_path><![CDATA[/sites/default/files/images/2022_1104_image_Lunar%20Flashlight%20SEEDLab_Mary%20Kate%20Broadway_04.JPG]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/2022_1104_image_Lunar%20Flashlight%20SEEDLab_Mary%20Kate%20Broadway_04.JPG]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/2022_1104_image_Lunar%2520Flashlight%2520SEEDLab_Mary%2520Kate%2520Broadway_04.JPG?itok=5fR8nfRJ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1668084096</created>          <gmt_created>2022-11-10 12:41:36</gmt_created>          <changed>1668084096</changed>          <gmt_changed>2022-11-10 12:41:36</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1276"><![CDATA[Georgia Tech Research Institute (GTRI)]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="365"><![CDATA[Research]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="166902"><![CDATA[science and technology]]></keyword>          <keyword tid="191623"><![CDATA[SEEDLab]]></keyword>          <keyword tid="169609"><![CDATA[satellite]]></keyword>          <keyword tid="188307"><![CDATA[Lunar Flashlight]]></keyword>          <keyword tid="167146"><![CDATA[space]]></keyword>          <keyword tid="191624"><![CDATA[SmallSat]]></keyword>          <keyword tid="408"><![CDATA[NASA]]></keyword>          <keyword tid="191625"><![CDATA[SpaceX Falcon 9 rocket]]></keyword>          <keyword tid="2082"><![CDATA[aerospace engineering]]></keyword>          <keyword tid="167880"><![CDATA[SpaceX]]></keyword>          <keyword tid="187527"><![CDATA[orbit]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="7689"><![CDATA[EOSL]]></keyword>          <keyword tid="191626"><![CDATA[SolidWorks]]></keyword>          <keyword tid="191627"><![CDATA[automation engineering]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="659461">  <title><![CDATA[Skin: An Additional Tool for the Versatile Elephant Trunk]]></title>  <uid>27560</uid>  <body><![CDATA[<p>A new study from the Georgia Institute of Technology suggests that an elephant&rsquo;s muscles aren&rsquo;t the only way it stretches its trunk &mdash; <a href="https://youtu.be/3N8WBlk-inA">its folded skin also plays an important role</a>. The combination of muscle and skin gives the animal the versatility to grab fragile vegetation and rip apart tree trunks.</p><p>The research, in collaboration with Zoo Atlanta, finds that an elephant&rsquo;s skin doesn&rsquo;t uniformly stretch. 
The top of the trunk is more flexible than the bottom, and the two sections begin to diverge when an elephant stretches its trunk more than 10%. When stretching for food or objects, the dorsal section of the trunk slides further forward.&nbsp;&nbsp;</p><p>The findings could improve robots, which today are typically built for either great strength or flexibility. Unlike an elephant&rsquo;s trunk, the machines can&rsquo;t do both.</p><p><a href="https://coe.gatech.edu/news/2022/07/skin-additional-tool-versatile-elephant-trunk">Read about the study and see video from the experiments</a>.&nbsp;</p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1658170460</created>  <gmt_created>2022-07-18 18:54:20</gmt_created>  <changed>1661357533</changed>  <gmt_changed>2022-08-24 16:12:13</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Skin plays an important role in allowing an elephant to stretch its trunk to grab food and other items.]]></teaser>  <type>news</type>  <sentence><![CDATA[Skin plays an important role in allowing an elephant to stretch its trunk to grab food and other items.]]></sentence>  <summary><![CDATA[<p>A new study from the Georgia Institute of Technology suggests that an elephant&rsquo;s muscles aren&rsquo;t the only way it stretches its trunk &mdash; its folded skin also plays an important role. The combination of muscle and skin gives the animal the versatility to grab fragile vegetation and rip apart tree trunks. The findings could help build more flexible robotics.</p>]]></summary>  <dateline>2022-07-18T00:00:00-04:00</dateline>  <iso_dateline>2022-07-18T00:00:00-04:00</iso_dateline>  <gmt_dateline>2022-07-18 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Elephant biomechanics suggests a new approach for soft robotics]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />College of Engineering<br />maderer@gatech.edu</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>659460</item>      </media>  <hg_media>          <item>          <nid>659460</nid>          <type>image</type>          <title><![CDATA[Elephant]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[elephant_kelly_homepage1 (1).jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/elephant_kelly_homepage1%20%281%29.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/elephant_kelly_homepage1%20%281%29.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/elephant_kelly_homepage1%2520%25281%2529.jpg?itok=Jtzh1tyX]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1658170078</created>          <gmt_created>2022-07-18 18:47:58</gmt_created>          <changed>1658170078</changed>          <gmt_changed>2022-07-18 18:47:58</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1237"><![CDATA[College of Engineering]]></group>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="108731"><![CDATA[School of Mechanical Engineering]]></group>          <group id="1275"><![CDATA[School of Biological Sciences]]></group>          
<group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="187423"><![CDATA[go-bio]]></keyword>          <keyword tid="166882"><![CDATA[School of Biological Sciences]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="658953">  <title><![CDATA[Introducing GTGraffiti: The Robot That Paints Like a Human]]></title>  <uid>36123</uid>  <body><![CDATA[<p>Graduate students at the Georgia Institute of Technology have built the first graffiti-painting robot system that mimics the fluidity of human movement. Aptly named GTGraffiti, the system uses motion capture technology to record human painting motions and then composes and processes the gestures to program a cable-driven robot that spray paints graffiti artwork.</p><p>The project was devised by robotics Ph.D. student Gerry Chen, in collaboration with Juan-Diego Florez, a fellow graduate student;&nbsp;<a href="https://www.cc.gatech.edu/people/frank-dellaert">Frank Dellaert</a>, robotics professor in the&nbsp;<a href="https://www.ic.gatech.edu/">School of Interactive Computing</a>, and&nbsp;<a href="https://faculty.cc.gatech.edu/~seth/">Seth Hutchinson</a>, professor and KUKA Chair for Robotics. The team&rsquo;s&nbsp;<a href="https://ieeexplore.ieee.org/document/9812008">peer-reviewed study of the robot system</a>&nbsp;was published in the International Conference on Robotics and Automation proceedings for&nbsp;2022.</p><p><strong>How It Works</strong></p><p>For a robot to be able to paint in a human style, both the robot and the art must be designed with the other in mind &mdash; at least for now. The GTGraffiti system consists of three stages: artwork capture, robot hardware, and planning and control.</p><p>First, the team uses motion capture technology to record human artists painting &mdash; a strategy that allows for insight into the types of motions required to produce spray-painted artwork. For this study, Chen and the team invited two artists to paint the alphabet in a bubble letter graffiti style. As each artist painted, they recorded the motions of the artist&rsquo;s hand across the canvas, as well as the movements of the spray paint can itself. Capturing hand and spray paint can trajectories is crucial for the robot to be able to paint using similar layering, composition, and motion as those of a human artist.</p><p>The team then processed the data to analyze each motion for speed, acceleration, and size, and used that information for the next stage &mdash; designing the robot. Taking these data into consideration, as well as portability and accuracy required for the artwork, they chose to use a cable-driven robot. Cable-driven robots, like the Skycams used in sports stadiums for aerial camera shots, are notable for being able to scale to large sizes. 
The robot runs on a system of cables, motors, and pulleys. The team&rsquo;s robot is currently mounted on a 9-by-10-foot steel frame, but Chen says it should be possible to mount it directly onto a flat structure of almost any size, such as the side of a building.&nbsp;</p><p>For the third stage, the artist&rsquo;s composition is converted into electrical signals. Taken together, the captured letterforms form a library of digital characters, which can be programmed in any size, perspective, and combination to produce words for the robot to paint. A human artist chooses shapes from the library and uses them to compose a piece of art. For this study, the team chose to paint the letters &ldquo;ATL.&rdquo;</p><p>Once the team chooses a sequence and position of characters, they use mathematical equations to generate trajectories for the robot to follow. These algorithmically produced pathways ensure that the robot paints with the correct speed, location, orientation, and perspective. Finally, the pathways are converted into motor commands to be executed. &nbsp;</p><p>With all the computing and competing movements, the motors on the robot could potentially work against each other, threatening to rip the robot apart. To address this, the central robot controller is programmed to recalculate motor commands 1,000 times per second so that the robot can function safely and reliably. Once assembled, the robot can then paint an artwork in the style of a human graffiti artist.</p><p><strong>Why Art? Why Graffiti?</strong></p><p>Some of the most typical industries for robotics applications include manufacturing, biomedicine, automobiles, agriculture, and the military.&nbsp;But the arts, it turns out, can showcase robotics in an especially powerful way.</p><p>&ldquo;The arts, especially painting or dancing, exemplify some of the most complex and nuanced motions humans can make,&rdquo; Chen said. &ldquo;So if we want to create robots that can do the highly technical things that humans do, then creating robots that can&nbsp;<a href="https://www.youtube.com/watch?v=2KRZHxIyrI0">dance</a>&nbsp;or paint are great goals to shoot for. These are the types of skills that demonstrate the extraordinary capabilities of robots and can also be applied to a variety of other applications.&rdquo; &nbsp;</p><p>On a personal level, Chen is motivated by his hope for people to perceive robots as being helpful to humanity, rather than seeing them as job-stealers or entities that cause feelings of fear, sadness, or doom as often depicted in film.</p><p>&ldquo;Graffiti is an art form that is inherently meant to be seen by the masses,&rdquo; Chen said. &ldquo;In that respect, I feel hopeful that we can use graffiti to communicate this idea &mdash; that robots working together with humans can make positive contributions to society.&rdquo;</p><p><strong>Future Directions</strong></p><p>Presently, Chen and the team&rsquo;s plans for the robot are centered around two main thrusts: preserving and amplifying art. To this end, they are currently experimenting with reproducing pre-recorded shapes at different scales and testing the robot&rsquo;s ability to paint larger surfaces. These abilities would enable the robot to paint scaled-up versions of original works in different geographical locations and for artists physically unable to engage in onsite spray painting.
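</p><p>The rescaling step itself is conceptually simple. The following Python fragment is a minimal sketch, not the team&rsquo;s actual code; the function name and data layout are assumptions, and the real system must also replan stroke speeds and spray-can triggering for the new scale:</p><pre><code>import numpy as np

# Hypothetical sketch: rescale a recorded spray-paint stroke about an
# origin point on the painting surface so it can be repainted larger.
def rescale_stroke(waypoints, scale, origin=(0.0, 0.0)):
    pts = np.asarray(waypoints, dtype=float)
    o = np.asarray(origin, dtype=float)
    return (pts - o) * scale + o

# A small recorded stroke (x, y in meters), scaled 5x for a wall.
stroke = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)]
wall_stroke = rescale_stroke(stroke, scale=5.0)
</code></pre><p>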
In theory, an artist would be able to paint an artwork in one part of the world, and a GTGraffiti bot could execute that artwork in another place.</p><p>In the future, Chen hopes to use GTGraffiti to capture artists painting graffiti in the wild. With the captured motion data, GTGraffiti would be able to reproduce the artwork were it ever painted over or destroyed.</p><p>&ldquo;The robot is not generating the art itself, but rather working together with the human artist to enable them to achieve more than they could without the robot,&rdquo; Chen said.</p><p>Chen envisions that the robot system will eventually have capabilities that allow for real-time artist-robot interaction. He hopes to develop the technology that could enable an artist standing at the foot of a building to spray paint graffiti in a small space while the cable-driven robot copies the painting with giant strokes on the side of the building, for example.</p><p>&ldquo;We hope that our research can help artists compose artwork that, executed by a superhuman robot, communicates messages more powerfully than any piece they could have physically painted themselves,&rdquo; said Chen.</p><p>GTGraffiti is funded by a National Science Foundation grant that supports research involving human-robot collaboration in artistic endeavors.</p><p><strong>Citation</strong>: G. Chen, S. Baek, J.-D. Florez, W. Qian, S.-W. Leigh, S. Hutchinson, and F. Dellaert, &ldquo;GTGraffiti: Spray painting graffiti art from human painting motions with a cable driven parallel robot,&rdquo; in&nbsp;<em>2022 IEEE International Conference on Robotics and Automation (ICRA)</em>, 2022.</p><p><strong>DOI</strong>:&nbsp;<a href="https://doi.org/10.1109/ICRA46639.2022.9812008" target="_blank" title="https://doi.org/10.1109/ICRA46639.2022.9812008">https://doi.org/10.1109/ICRA46639.2022.9812008</a></p><p><strong>Writer</strong>: Catherine Barzler</p><p><strong>Video:&nbsp;</strong>Kevin Beasley</p><p><strong>Photography</strong>: Rob Felt</p><p><strong>Media Contact</strong>: Catherine Barzler | catherine.barzler@gatech.edu</p>]]></body>  <author>Catherine Barzler</author>  <status>1</status>  <created>1655397747</created>  <gmt_created>2022-06-16 16:42:27</gmt_created>  <changed>1661173963</changed>  <gmt_changed>2022-08-22 13:12:43</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Graduate students at the Georgia Institute of Technology have built the first graffiti-painting robot system that mimics the fluidity of human movement. ]]></teaser>  <type>news</type>  <sentence><![CDATA[Graduate students at the Georgia Institute of Technology have built the first graffiti-painting robot system that mimics the fluidity of human movement.
]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2022-06-16T00:00:00-04:00</dateline>  <iso_dateline>2022-06-16T00:00:00-04:00</iso_dateline>  <gmt_dateline>2022-06-16 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[catherine.barzler@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Catherine Barzler,&nbsp;Senior Research Writer/Editor</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>658952</item>          <item>658949</item>          <item>658950</item>          <item>658951</item>      </media>  <hg_media>          <item>          <nid>658952</nid>          <type>image</type>          <title><![CDATA[GTGraffiti finished]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[22C5001-P2-006.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/22C5001-P2-006.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/22C5001-P2-006.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/22C5001-P2-006.jpg?itok=oC-cbbaz]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1655397021</created>          <gmt_created>2022-06-16 16:30:21</gmt_created>          <changed>1655397021</changed>          <gmt_changed>2022-06-16 16:30:21</gmt_changed>      </item>          <item>          <nid>658949</nid>          <type>image</type>          <title><![CDATA[GTGraffiti2 spray can]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Screen Shot 2022-06-14 at 4.41.37 PM.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Screen%20Shot%202022-06-14%20at%204.41.37%20PM.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Screen%20Shot%202022-06-14%20at%204.41.37%20PM.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Screen%2520Shot%25202022-06-14%2520at%25204.41.37%2520PM.png?itok=9QUcKwE4]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1655396854</created>          <gmt_created>2022-06-16 16:27:34</gmt_created>          <changed>1655396854</changed>          <gmt_changed>2022-06-16 16:27:34</gmt_changed>      </item>          <item>          <nid>658950</nid>          <type>image</type>          <title><![CDATA[GTGraffiti artist]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[robots_student_spray_painting.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/robots_student_spray_painting.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/robots_student_spray_painting.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/robots_student_spray_painting.png?itok=EgG9qHc_]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1655396904</created>          <gmt_created>2022-06-16 
16:28:24</gmt_created>          <changed>1655396904</changed>          <gmt_changed>2022-06-16 16:28:24</gmt_changed>      </item>          <item>          <nid>658951</nid>          <type>image</type>          <title><![CDATA[GTGraffiti mocap]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[robots_showing_fingertip_sensors.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/robots_showing_fingertip_sensors.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/robots_showing_fingertip_sensors.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/robots_showing_fingertip_sensors.png?itok=2maAAGAX]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1655396944</created>          <gmt_created>2022-06-16 16:29:04</gmt_created>          <changed>1655396944</changed>          <gmt_changed>2022-06-16 16:29:04</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>          <topic tid="71901"><![CDATA[Society and Culture]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="658858">  <title><![CDATA[Amazon Robotics Gift Supports Georgia Tech’s Advanced Technology Development Center]]></title>  <uid>28137</uid>  <body><![CDATA[<p>To help&nbsp;support the growth of startups and individuals working to advance automation and robotics,&nbsp;<a href="https://www.amazon.science/research-areas/robotics">Amazon Robotics</a>&nbsp;today announced it is providing a substantial investment over three years to&nbsp;the Georgia Institute of Technology&rsquo;s&nbsp;<a href="https://innovate.gatech.edu/programs-old/advanced-technology-development-center-atdc/">Advanced Technology Development Center</a>&nbsp;(ATDC).</p><p>ATDC is Georgia&rsquo;s technology startup incubator and helps entrepreneurs across the state build, launch, and scale successful companies.&nbsp;The goal of the gift is to accelerate the growth of automation and robotics by leveraging staff and resources at ATDC in collaboration with Amazon.</p><p>&ldquo;Our mission is to support infrastructure for startups and to help foster compelling startup companies with tremendous talent that solve big problems,&rdquo; said Thomas Felis, director of robotics strategy for Amazon Global Robotics. &ldquo;Equally important to us is Georgia Tech&rsquo;s track record of working with and supporting entrepreneurs from diverse and underrepresented backgrounds.&rdquo;</p><p>The funding includes an allocation for an ATDC full-time automation and robotics catalyst to recruit and coach companies focused on automation and robotics.
The catalyst will identify relevant startups and help onboard them into ATDC&rsquo;s startup pipeline and portfolio.</p><p>&ldquo;Georgia Tech is a leader in robotics research, and we are excited to have Amazon support our startup mission at ATDC to bring entrepreneurial ideas to life and to market,&rdquo; said John Avery, ATDC director. &ldquo;Innovation can come from anywhere and everywhere, and this collaboration reflects our commitment to support diverse startup founders.&rdquo;</p><p>This effort will also support Georgia Tech&rsquo;s ongoing robotics research, including the&nbsp;<a href="https://research.gatech.edu/robotics/robotics-industry-program">Institute for Robotics and Intelligent Machines</a>.</p><p>The Amazon sponsorship expands ATDC&rsquo;s targeted vertical focus areas to seven, including financial, health, and retail technology, 5G, logistics and supply chain, and advanced manufacturing.</p><p>ATDC will also work with Amazon to identify specific areas of technical interest with the aim of developing virtual and physical events to attract relevant startups.</p><p>To apply to join the robotics and automation incubator, click&nbsp;<a href="https://atdc.org/application-for-entrepreneurs-seeking-to-join-the-atdc-robotics-and-automation-vertical/">here</a>.</p>]]></body>  <author>Péralte Paul</author>  <status>1</status>  <created>1655225502</created>  <gmt_created>2022-06-14 16:51:42</gmt_created>  <changed>1655226195</changed>  <gmt_changed>2022-06-14 17:03:15</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Funding will go toward assisting diverse entrepreneurs in the fields of robotics and automation.]]></teaser>  <type>news</type>  <sentence><![CDATA[Funding will go toward assisting diverse entrepreneurs in the fields of robotics and automation.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2022-06-14T00:00:00-04:00</dateline>  <iso_dateline>2022-06-14T00:00:00-04:00</iso_dateline>  <gmt_dateline>2022-06-14 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[peralte@atdc.org]]></email>  <location></location>  <contact><![CDATA[<p><strong>Peralte C. 
Paul</strong><br />peralte.paul@comm.gatech.edu<br />404.316.1210</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>658859</item>      </media>  <hg_media>          <item>          <nid>658859</nid>          <type>image</type>          <title><![CDATA[Avery and Felis]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[John and Thomas-1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/John%20and%20Thomas-1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/John%20and%20Thomas-1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/John%2520and%2520Thomas-1.jpg?itok=DHWJEXtO]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Shot of John Avery and Thomas Felis]]></image_alt>                    <created>1655225992</created>          <gmt_created>2022-06-14 16:59:52</gmt_created>          <changed>1655225992</changed>          <gmt_changed>2022-06-14 16:59:52</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="139"><![CDATA[Business]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="139"><![CDATA[Business]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="81501"><![CDATA[Amazon]]></keyword>          <keyword tid="4238"><![CDATA[atdc]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="6503"><![CDATA[automation]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="106361"><![CDATA[Business and Economic Development]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="658195">  <title><![CDATA[Faces of Research: Meet Kinsey Herrin]]></title>  <uid>28137</uid>  <body><![CDATA[<p><em>The <a href="https://research.gatech.edu/robotics">Institute for Robotics and Intelligent Machines</a> at Georgia Tech supports and facilitates the operation of several core research facilities on campus. This allows&nbsp;faculty, students, and collaborators to advance the boundaries&nbsp;of robotics research.</em></p><p><em>This installment of the Faces of Research Q&amp;A series is&nbsp;with <a href="https://www.me.gatech.edu/faculty/herrin">Kinsey Herrin</a></em></p><p><strong>What is your field of expertise and why did you choose it?</strong><br />I&rsquo;m a prosthetist/orthotist and conduct research in the field of prosthetics, orthotics/exoskeletons, and rehab robotics. Our goal is to make it easier for people with mobility challenges to live more independent lives by helping them move more easily in the real world. 
The change we see through our technology is sometimes amazing &mdash;&nbsp;people with amputations can go upstairs, step-over-step instead of stiff-legged, and kids with walking disabilities&nbsp;start to have more normal walking patterns. As a kid, I always wanted to help people, and this profession is the perfect blend of medicine, science, and art &mdash;&nbsp;all things that I love plus the added benefit of getting to be around some really incredible people.</p><p><strong>What makes Georgia Tech research institutes unique?</strong><br />We&rsquo;re trying to advance technology outside of the lab and into the real world where it can make an impact on real users. That means not only assessing how our users perform with the technology &mdash;&nbsp;does it actually make them walk faster, with&nbsp;a more natural and easy gait &mdash;&nbsp;but also assessing a user&rsquo;s own perspective on technology and using all of that data to keep improving the end results. Our facilities and resources are incredible. I often feel I have access to a dream playground for a research prosthetist/orthotist. On top of all of that, our faculty and students are not only extremely talented and at the top of their fields, but I think there is a deeper passion for pursuing this goal to make mobility easier for people with physical challenges.</p><p><strong>What impact is your research having on the world? </strong><br />I see our work as having an impact on all people with mobility challenges. We are trying to make the world a better place for them by challenging the status quo and saying what clinicians can currently provide is still not good enough. We can still do more to return people to a new normal after amputation, stroke, brain, and spinal cord injuries. When people can access their own environment independently, it has overwhelmingly positive impacts on their quality of life. I think our research is making great strides toward making that possible.&nbsp;</p><p><strong>What do you like to do in your spare time when you are not working on your research or teaching?</strong><br />I enjoy being outdoors with my husband and son any chance we get. We love pretty much everything about being on or near water &mdash;&nbsp;fishing, kayaking, canoeing, swimming, and camping.
I also have nine&nbsp;backyard chickens and a dog that are hilarious and fun additions to the Herrin chaos.</p>]]></body>  <author>Péralte Paul</author>  <status>1</status>  <created>1652452905</created>  <gmt_created>2022-05-13 14:41:45</gmt_created>  <changed>1652453432</changed>  <gmt_changed>2022-05-13 14:50:32</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Senior Research Scientist, Woodruff School of Mechanical Engineering]]></teaser>  <type>news</type>  <sentence><![CDATA[Senior Research Scientist, Woodruff School of Mechanical Engineering]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2022-05-13T00:00:00-04:00</dateline>  <iso_dateline>2022-05-13T00:00:00-04:00</iso_dateline>  <gmt_dateline>2022-05-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[peralte.paul@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>658196</item>      </media>  <hg_media>          <item>          <nid>658196</nid>          <type>image</type>          <title><![CDATA[FoR: Kinsey Herrin]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Faces-of-Research-banner_Herrin-title_01.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Faces-of-Research-banner_Herrin-title_01.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Faces-of-Research-banner_Herrin-title_01.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Faces-of-Research-banner_Herrin-title_01.jpg?itok=WYCYRtLg]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[headshot of Kinsey Herrin]]></image_alt>                    <created>1652453346</created>          <gmt_created>2022-05-13 14:49:06</gmt_created>          <changed>1652453346</changed>          <gmt_changed>2022-05-13 14:49:06</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1237"><![CDATA[College of Engineering]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="657308">  <title><![CDATA[New “Micro-rocker” Bots Are Powered by a Single Electromagnetic Coil]]></title>  <uid>36172</uid>  <body><![CDATA[<p>Georgia Tech researchers have shown that robots about the size of a particle of dust are capable of precise bidirectional control. 
By harnessing the power of a magnetic field generated by only a single electromagnetic coil, the mobile micro-robots are the smallest of their type.</p><p>&ldquo;There are swimmer micro-robots that move in a fluid with similar size, but these are the smallest &lsquo;walking&rsquo; robots that move on a solid surface,&rdquo; said&nbsp;<a href="https://www.ece.gatech.edu/faculty-staff-directory/azadeh-ansari">Azadeh Ansari</a>, the Sutterfield Family Early Career Assistant Professor at Georgia Tech School of Electrical and Computer Engineering (ECE).</p><p>The Georgia Tech study was recently published in the&nbsp;<a href="https://link.springer.com/epdf/10.1007/s12213-022-00149-y?sharing_token=6BaiN27mwVkc99vtLSaG3fe4RwlQNchNByi7wbcMAY534Rn_nre52BTa_Z7xlrh6cyolUy9n466Ww7Qz2L30gRo5MLOf7TBMAB6zPtlJr0xHOf1Eu7bqaTbyxfNqz_VCR-ISucKah5fzGAh5bcWtDYPmB-Y66VctYdo7WQA39L4%3D">Journal of Micro-Bio Robotics</a>. Currently, most magnetically actuated micro-bot systems rely on adding multiple electromagnets to enable full control, resulting in higher power consumption and less flexible setups. Demonstrating that a single-coil setup is enough for precise bidirectional motion control clears a significant hurdle, according to Ansari. With the micro-bots now much easier to operate, the team has been able to demonstrate micromanipulation capabilities.</p><p>&ldquo;With what we&rsquo;ve shown, we can already think of applying the micro-bots in a lab setting,&rdquo; said Ansari. &ldquo;You could have hundreds of robots on the same substrate working akin to ants in a colony.&rdquo;</p><p>In Spring 2019, Ansari&rsquo;s team showcased larger (two millimeters long)&nbsp;<a href="https://rh.gatech.edu/news/623453/tiny-vibration-powered-robots-are-size-worlds-smallest-ant">&ldquo;micro-bristle-bots&rdquo;</a>&nbsp;that could move by harnessing vibrations. Vibrations are no longer needed to move the micro-bots because of their updated &ldquo;rocker&rdquo; design &mdash; hence micro-rocker bots. The new design allows the bots to move by performing a stick&ndash;slip motion with an out-of-plane magnetic field.</p><p>Stick-slip motion refers to the robot&rsquo;s two states: one in which the robot is pinned stationary on the surface, and another in which it &ldquo;slips&rdquo; slightly in one direction and achieves net motion, according to Ph.D. student Tony Wang. When the magnetic field is turned on, the robot will essentially rise and then fall. This motion generates enough kinetic energy for the robot to move.</p><p><strong>More Than a New Design</strong></p><p>Equally important as the rocker design, the paper demonstrates the novel use of a waveform offset for biasing the direction of the robot&#39;s trajectory. The sign of the magnetic field offset (positive or negative), as well as the rocker&rsquo;s angle with the surface, is what determines the direction the micro-bots will travel. Combined, the rocker design and the magnetic offset make the micro-bots capable of well-controlled, and importantly selectable, movement. The acceleration and deceleration of the micro-rocker bots can further be controlled by changing the frequency of the magnetic field.</p><p>The 100-micrometer-long micro-bots were 3D printed onto a glass substrate via two-photon lithography and subsequently deposited with a nickel thin film, which acts as a semi-hard magnet in response to external magnetic fields.
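</p><p>The drive-signal idea described above can be summarized in a few lines of Python. This is an illustrative sketch only; the amplitude, frequency, and function names are hypothetical placeholders, not values from the paper:</p><pre><code>import numpy as np

# Hypothetical sketch of a single-coil drive signal: a sinusoid plus a
# DC offset. The sign of the offset biases the rockers' travel direction,
# and the frequency tunes how quickly they accelerate.
def coil_current(t, amplitude=1.0, freq_hz=50.0, offset=0.2):
    return amplitude * np.sin(2.0 * np.pi * freq_hz * t) + offset

t = np.linspace(0.0, 0.1, 1000)          # 100 ms of drive signal
forward = coil_current(t, offset=+0.2)   # bias travel one way
backward = coil_current(t, offset=-0.2)  # flip the sign, flip direction
</code></pre><p>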
For many lab applications, the robots can be directly printed on the substrate that will go under the microscope, but they can also be printed and transported with a micropipette.</p><p>&ldquo;There are a lot of areas the micro-robots can be applied to within the current 2D, under-the-microscope process we&rsquo;ve established so far,&rdquo; said Ansari. &ldquo;But there&rsquo;s also a future where they can be injected into living organisms to deliver drugs or repair injuries.&rdquo;&nbsp;</p><p>The team is currently working to equip a micro-bot with a tip that could potentially insert nanoparticles into biological tissue for drug delivery or DNA extraction. Their findings will be presented at the&nbsp;Hilton Head Workshop 2022: A Solid-State Sensors, Actuators and Microsystems Workshop this June.</p><p>****</p><p><strong>Citation:&nbsp;</strong>Tony Wang, DeaGyu Kim, Yifan Shi, Zhijian Hao, and Azadeh Ansari, &ldquo;Bidirectional microscale rocker robots controlled via neutral position offset&rdquo; (Journal of Micro-Bio Robotics, 2022).&nbsp;&nbsp;<a href="https://doi.org/10.1007/s12213-022-00149-y">https://doi.org/10.1007/s12213-022-00149-y</a></p><p><strong>Funding:</strong>&nbsp;This work is supported by Georgia Tech Institute for Electronics and Nanotechnology (IEN) and the National Science Foundation Graduate Research Fellowship under Grant No. DGE-1650044. The device fabrication was performed at the Georgia Tech Institute for Electronics and Nanotechnology clean room facilities, a member of the National Nanotechnology Coordinated Infrastructure (NNCI), which is supported by the National Science Foundation (Grant ECCS-1542174).&nbsp;</p>]]></body>  <author>dwatson71</author>  <status>1</status>  <created>1649966959</created>  <gmt_created>2022-04-14 20:09:19</gmt_created>  <changed>1650374227</changed>  <gmt_changed>2022-04-19 13:17:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Once the size of ants, these Georgia Tech 3D-printed micro-robots can now only be seen under a microscope.]]></teaser>  <type>news</type>  <sentence><![CDATA[Once the size of ants, these Georgia Tech 3D-printed micro-robots can now only be seen under a microscope.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2022-04-14T00:00:00-04:00</dateline>  <iso_dateline>2022-04-14T00:00:00-04:00</iso_dateline>  <gmt_dateline>2022-04-14 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[dwatson@ece.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Dan Watson</strong><br /><a href="mailto:dwatson@ece.gatech.edu">dwatson@ece.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>657353</item>          <item>657355</item>      </media>  <hg_media>          <item>          <nid>657353</nid>          <type>image</type>          <title><![CDATA[Azadeh Ansari, Georgia Tech Assistant Professor in the School of Electrical and Computer Engineering]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Azadeha.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Azadeha.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Azadeha.jpeg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Azadeha.jpeg?itok=L4DQZITZ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1650044663</created>          <gmt_created>2022-04-15 17:44:23</gmt_created>          <changed>1650044663</changed>          <gmt_changed>2022-04-15 17:44:23</gmt_changed>      </item>          <item>          <nid>657355</nid>          <type>image</type>          <title><![CDATA[Azadeh Ansari in the lab]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[19C10200-P46-010.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/19C10200-P46-010.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/19C10200-P46-010.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/19C10200-P46-010.jpg?itok=4UvFH-w1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1650045275</created>          <gmt_created>2022-04-15 17:54:35</gmt_created>          <changed>1650045275</changed>          <gmt_changed>2022-04-15 17:54:35</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.ece.gatech.edu/faculty-staff-directory/azadeh-ansari]]></url>        <title><![CDATA[Azadeh Ansari ]]></title>      </link>          <link>        <url><![CDATA[https://www.ece.gatech.edu]]></url>        <title><![CDATA[ECE]]></title>      </link>          <link>        <url><![CDATA[https://rdcu.be/cJvPH]]></url>        <title><![CDATA[Journal of Micro-Bio Robotics ]]></title>      </link>          <link>        <url><![CDATA[https://rh.gatech.edu/news/623453/tiny-vibration-powered-robots-are-size-worlds-smallest-ant]]></url>        <title><![CDATA[Micro-bristle-Bot, 2019]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1255"><![CDATA[School of Electrical and Computer Engineering]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="149"><![CDATA[Nanotechnology and Nanoscience]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="149"><![CDATA[Nanotechnology and Nanoscience]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="175301"><![CDATA[Azadeh Ansari]]></keyword>          <keyword tid="190376"><![CDATA[micro-rocker bots]]></keyword>          <keyword tid="2435"><![CDATA[ECE]]></keyword>          <keyword tid="190377"><![CDATA[3D-printing]]></keyword>          <keyword tid="190378"><![CDATA[stick-slip 
motion]]></keyword>          <keyword tid="1163"><![CDATA[microsystems]]></keyword>          <keyword tid="190379"><![CDATA[electromagnetic coil]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39451"><![CDATA[Electronics and Nanotechnology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="657151">  <title><![CDATA[From Virtual Reality to Ice Slurries: How ATRP is Impacting Georgia, the Nation, and the World]]></title>  <uid>35832</uid>  <body><![CDATA[<div><div><div><div><h3>In the age of Covid-19, the need for industries to adopt advanced technologies, incorporate more health and safety standards into their daily operations, and maintain a robust workforce is more important than ever.</h3><p>The Agricultural Technology Research Program (ATRP) at the Georgia Tech Research Institute (GTRI) is leading efforts to equip Georgia&#39;s agribusiness and food processing industries with the technology and skills to remain competitive and at the forefront of the global transformation that has been accelerated by the pandemic. ATRP works in collaboration with university and industry partners, especially within Georgia&#39;s poultry industry &ndash; which has a <strong><a href="https://www.agr.georgia.gov/poultry-emergency-rule-notice.aspx#:~:text=Furthermore%2C%20Georgia’s%20poultry%20industry%20has%20a%20%2428%20billion,major%20and%20real%20threat%20to%20Georgia’s%20public%20welfare.">$28 billion</a></strong> annual impact on the Georgia economy &ndash; on projects involving robotics, advanced sensors, environmental treatment, and worker and food safety technologies. ATRP&#39;s ultimate goal is to transition technologies from concept to commercialization as quickly and economically as possible.</p><p>&quot;Our role is to support the agriculture industry in the state of Georgia and the world &ndash; we are Georgia-focused first, but what we do in Georgia is going to impact the world,&quot; said Doug Britton, a GTRI principal research engineer and ATRP program manager.</p><div><div><div><div><div><div><h2>Problem Solved</h2><p>ATRP&#39;s origins date back to 1973, when the Georgia Poultry Federation requested engineering support from GTRI and Georgia Tech on issues troubling the poultry industry. The Georgia Poultry Federation represents the Georgia poultry industry&#39;s interests at the state and federal levels on legislative and regulatory issues.</p><p>&quot;They received concerns from neighbors and friends about all of the noise coming out of mills used to make animal feeds,&quot; Britton said. &quot;So, they asked Georgia Tech to do acoustics analysis to see if there was some way to reduce those noise levels.&quot;</p><p>From there, ATRP was born.</p><p>ATRP conducts state-sponsored and contract research for industry and government agencies. For FY 2021, ATRP received roughly $2 million in funding from the state of Georgia. ATRP&#39;s Automation and Robotics Research received the largest share of that funding, at 41%, followed by Technology Transfer/Outreach/Technical Assistance, which received 16%.
Environmental and Biological Systems Research came in third at 14%; followed by Food Safety Research at 13%; Program Support with 9%; and Imaging and Sensor Research at 7%.</p><p>ATRP benefits from significant industry support, with 15 companies and associations actively participating in research projects. In addition, over 35 individuals sit on the ATRP industry advisory committee, representing 28 different companies and organizations. For FY 2021, ATRP had nine research prototypes in various stages of development; five exploratory research projects; three provisional patent applications; five invention disclosures; 33 published articles, papers, and presentations; 18 participating industry and academic partners; and 21 technical assistance service requests fulfilled.</p><div><div><div><div><h2>VR to the Rescue</h2><p>One of ATRP&#39;s provisional patents relates to its work on incorporating automation solutions, specifically virtual reality (VR), into poultry processing to boost efficiency and enhance worker safety.</p><p>Working in a poultry processing plant can be challenging.</p><p>Food processing environments are often kept quite cold by design to prevent pathogen growth. Low temperatures, combined with the physical demands of the job, have contributed to the industry&rsquo;s high turnover rates, which have been exacerbated by the pandemic. According to recent <strong><a href="http://www.ncfh.org/poultry-workers.html">estimates</a></strong>, poultry worker turnover ranges from 40% to 100% annually, and amid Covid-19, increased risks for disease transmission and cross-contamination pose even more obstacles for the sector.</p><p>To address these issues, ATRP is exploring ways to combine VR with factory-based robotics in certain poultry processing operations, such as cone loading, which could allow workers to perform their jobs in safer environments &ndash; or even from home. Cone loading is when chicken carcasses that have had their legs and thighs removed are placed onto a cone for further processing.</p><p>&quot;Cone loading sounds like a really easy task, and it is,&quot; said Konrad Ahlin, a GTRI research engineer who has expertise in robotics. &quot;But the problem is having a dedicated person doing that for extended periods &ndash; it&#39;s physically demanding on the person, and it&#39;s a menial, trivial task that&#39;s unfortunately just necessary.&quot;</p><div><div><div><div><p>ATRP&#39;s &quot;expert-in-the-loop&quot; robotics solution would allow human workers to provide key information to robot systems performing the operation &ndash; all from a virtual reality environment (a minimal sketch of the pattern appears below). So far, attempts to fully automate common poultry processing operations have not been successful due to chickens&#39; irregular and malleable shapes. But VR could solve that challenge, Ahlin noted.</p><p>&quot;Virtual reality is creating this bridge where information can intuitively pass between human operators and robotic devices in a way that hasn&#39;t been possible before,&quot; Ahlin said.</p><p>ATRP has filed a provisional patent for its VR research and is also working with the Georgia Research Alliance (GRA) to develop a commercialization roadmap for the technology. The GRA is an Atlanta-based nonprofit that expands research capacity at Georgia universities, then seeds and shapes startup companies around inventions and discoveries.</p>
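<p>To make that expert-in-the-loop division of labor concrete, here is a minimal Python sketch, assuming a made-up message format in which the remote worker supplies only a grasp point and approach angle from VR while the robot executes the repetitive motion; none of the names below come from ATRP&#39;s actual interface.</p><pre>
# Hypothetical expert-in-the-loop hand-off: the human decides *where*,
# the robot performs the routine cone-loading motion.
from dataclasses import dataclass

@dataclass
class GraspCue:
    x: float    # meters, in the robot's work-cell frame (assumed)
    y: float
    z: float
    yaw: float  # approach angle chosen by the VR operator, radians

def vr_operator_input() -> GraspCue:
    """Stand-in for a VR headset/controller event stream."""
    return GraspCue(x=0.42, y=-0.10, z=0.18, yaw=1.2)

def execute_cone_load(cue: GraspCue) -> None:
    """Robot-side routine: move, grasp, and place on the cone."""
    print(f"Moving gripper to ({cue.x}, {cue.y}, {cue.z}), yaw {cue.yaw:.2f}")
    print("Grasping carcass and placing it on the cone.")

if __name__ == "__main__":
    execute_cone_load(vr_operator_input())
</pre>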
<p>Gary McMurray, a GTRI principal research engineer and division chief of GTRI&#39;s Intelligent Sustainable Technologies Division, said VR&#39;s potential to defy geographic limitations could be transformative for the manufacturing industry at large.</p><p>&quot;There are lots of reasons that this technology could have a big impact on manufacturing, which is struggling with finding people to do jobs,&quot; McMurray said. &quot;With this technology, you could be sitting in West Virginia, put on a VR headset, and work from the comfort of your home. You&#39;re no longer tied to geography, and that&#39;s really powerful.&quot;</p><div><div><div><div><div><div><h2>Concept to Commercialization</h2><p>Many ATRP projects are already having an impact outside the lab.</p><h4>Interferometric Biosensing</h4><p>One of those is an interferometric biosensor that can be configured to rapidly detect a variety of pathogens and chemicals across multiple industries. The technology has been licensed exclusively to Valdosta, Ga.-based Salvus&trade;, which is a part of the CJB&reg; family of companies.</p><p>Salvus, which develops and manufactures chemical contaminant and pathogen detection technologies for the food and agriculture, life sciences, water quality, and specialty chemical industries, has said it expects to begin clinical and market trials for the biosensor sometime in 2022.</p><p>&quot;We have been able to apply our commercialization and manufacturing experience to the breakthrough work that Dr. Xu and the ATRP team have accomplished,&quot; said Ron Levin, director of strategy for Salvus. &quot;It is rewarding to discuss this technology with potential commercial partners and to hear their excitement for the technology in so many potential applications.&quot;</p><p>Jie Xu, a GTRI principal research scientist who is leading the Salvus project, said her team is currently working with Salvus to ensure the technology&#39;s core electrical, mechanical, and chemical functions perform seamlessly ahead of deployment.</p><p>&quot;In a controlled environment, such as a lab, the technology works beautifully,&quot; Xu said. &quot;But when you move it outside of the lab, you have to account for a lot of unknown factors.&quot;</p><p>The science behind the Salvus detection system is called interferometry, which exploits the interference of light waves to precisely determine the rate at which target particles attach to the sensor&#39;s surface. The sensor contains two separate channels &ndash; a sensing channel and a reference channel.</p><p>The sensing channel detects the specific target of interest, such as a virus or chemical. This signal is then compared to the reference channel, which allows the sensor to quantify the level or amount of the specific target and provide an accurate reading. A major benefit of the technology is its ability to complete tests in a matter of minutes or seconds. In a medical setting, a device utilizing this technology would allow clinicians to process a patient sample and have results ready before the patient leaves the premises &ndash; eliminating the need to send patients home to await lab testing results. Meanwhile, at a water processing facility, workers would be able to use this device to test the water and immediately know how much treatment is required.</p>
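<p>The value of the reference channel is easy to see in a toy model: both channels share drift and noise, so subtracting one from the other isolates target binding. The Python sketch below uses invented numbers purely to illustrate that two-channel read-out; it is not Salvus&#39; algorithm.</p><pre>
# Toy two-channel differential read-out: the reference channel sees
# everything except the target, so the difference isolates binding.
import numpy as np

t = np.linspace(0, 60, 601)                  # seconds
drift = 0.02 * t                             # shared thermal/flow drift

rng = np.random.default_rng(0)
binding = 1.5 * (1 - np.exp(-t / 20))        # target attaching over time
sensing = binding + drift + rng.normal(0, 0.05, t.size)
reference = drift + rng.normal(0, 0.05, t.size)

signal = sensing - reference                 # differential signal
rate = np.gradient(signal, t)                # attachment-rate estimate
print(f"estimated early binding rate: {rate[:50].mean():.3f} units/s")
</pre>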
<p>The technology has been tested in more than 50 diverse applications, including the detection of Covid-19, Salmonella, avian influenza, and many different chemicals.</p><p>&quot;Salvus came to us and asked if we could research ways to speed up in-the-field testing of pathogens and chemical contaminants,&quot; Xu said. &quot;So, instead of a company sending a sample to the lab and waiting weeks to get the lab results, our proprietary technology would produce results right on the spot.&quot;</p><h4>Dynamic Filtration</h4><p>Another project making headway in the commercial space is ATRP&#39;s Dynamic Filtration System. ATRP conducted in-plant trials of the patented filtration technology in FY 2021. During the trials, the system was licensed for poultry processing by Watson Agriculture and Food, a subsidiary of venture capital firm Watson Holdings that invests in technologies to solve major world problems.</p><p>The technology consists of a unique filtration system that is designed to separate various levels of solids, fats, and other materials from wastewater used in poultry processing. The process keeps the filtered materials from clogging the system, allowing for greater throughput. In addition, it captures the filtered materials that have additional value as a byproduct. Current work is focused on screening smaller particles to further improve water recyclability and wastewater treatment in poultry processing.</p><p>&quot;Universities are excellent at finding problems to solve, and I chose to partner with Georgia Tech for its reputation as being a leading research institute that has some of the best engineers in the world,&quot; said Trey Watson, founder and CEO of Watson Holdings. &quot;Even though water filtration is just one component of the poultry production process, it greatly enhances consumer safety and is both cost- and energy-efficient for the poultry industry, and I am excited to continue working with Georgia Tech and GTRI as we create additional solutions for tomorrow&#39;s problems.&quot;</p><div><div><div><div><h2>Ice in Motion</h2><p>Nearly everyone remembers the Slurpee slushies found at their local 7-Elevens.</p><p>ATRP researchers are applying a similar concept, along with rotational kinematics, to poultry processing to ensure product quality and safety. A distinction between ATRP&#39;s ice slurry mixture and Slurpee slushies is that the ice particles ATRP uses are finer, and its ice slurry blend is more homogenous.</p><p>During processing, chicken carcasses are typically immersed in screw augers of chilled water to lower their temperature to a degree that prevents pathogen growth. A screw auger is a mechanism used in bulk handling that utilizes a rotating helical screw blade to move liquid or granular materials through a shaft. However, this process requires carcasses to be removed from a shackle line before immersion, which can result in lost traceability; increased cross-contamination risks due to direct contact between carcasses; and additional labor to rehang the carcasses onto processing line shackles after chilling.</p><p>ATRP is working to solve these challenges by keeping carcasses shackled during the immersive chilling process. For the chilling medium, ATRP is using either conventional chilled water or ice slurry, which is a mix of tiny ice crystals and liquid water. Compared to conventional liquid water, ice slurry provides the additional chilling effects of ice while retaining a liquid-like form that is easily transportable and could result in higher cooling rates.</p>
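<p>A rough sense of why a higher cooling rate matters can be had from Newton&#39;s law of cooling, T(t) = T_c + (T_0 &minus; T_c)e^(&minus;kt), where a larger rate constant k stands in for the improved surface heat transfer of a slurry. The numbers below are assumptions for illustration, not ATRP measurements.</p><pre>
# Back-of-the-envelope chill-time comparison under Newton's law of cooling.
import math

T0, Tc, target = 40.0, 0.5, 4.0   # carcass start, coolant, target temp (C)

def time_to_target(k: float) -> float:
    """Solve Tc + (T0 - Tc) * exp(-k * t) == target for t (minutes)."""
    return math.log((T0 - Tc) / (target - Tc)) / k

k_water, k_slurry = 0.05, 0.08    # assumed per-minute rate constants
print(f"chilled water: {time_to_target(k_water):.0f} min")
print(f"ice slurry:    {time_to_target(k_slurry):.0f} min")
</pre>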
<p>&quot;One thing about ice slurry is that you can pump it like a liquid instead of trying to load it and carry it around like traditional ice,&quot; said Comas Haynes, a GTRI principal research engineer who is leading the project. &quot;And because of its liquid nature, it can really go around the contour of the carcasses, which results in faster chilling times.&quot;</p><p>The team has built a new carousel-type test rig that better mimics real-world conditions, wherein the carcasses remain shackled during immersive chilling to alleviate the aforementioned screw auger issues. The addition of passive, or non-motorized, rotational effects along with conventional &ldquo;line speed&rdquo; translation is producing promising reductions in chill time. This has already been shown for chilled water, and there is a near-term plan to test this enhancement in ice slurry as well, Haynes said. Georgia Tech has also filed a patent on its rotational chilling enhancement research.</p><div><div><div><div><div><div><h2>Failing Forward</h2><p>Whatever the project, ATRP continuously seeks to translate novel research concepts into commercially viable products &ndash; products that maximize productivity and efficiency, advance safety and health, and minimize environmental impacts &ndash; for the poultry, agribusiness, and food manufacturing industries in Georgia, the nation, and the world.</p><p>&quot;We want to be viewed as thinking outside the box &ndash; that&#39;s part of our role, and we embrace it,&quot; Britton said. Britton added that GTRI provides industry partners with a safe environment to take risks and the cutting-edge technologies to achieve maximum success.</p><p>&quot;I always tell my industry stakeholders that GTRI is a great place to fail,&quot; Britton said. &quot;If we fail here, it means we don&#39;t fail out there.&quot;</p></div></div></div></div></div></div><div><div><div><div><div><div><p><br />Writer: <a href="mailto:anna.akins@gtri.gatech.edu" target="_blank">Anna Akins</a><br />Photos: Christopher Moore<br />&nbsp;</p><p>The <strong><a href="https://gtri.gatech.edu">Georgia Tech Research Institute (GTRI)</a></strong> is the nonprofit, applied research division of the Georgia Institute of Technology (Georgia Tech). Founded in 1934 as the Engineering Experiment Station, GTRI has grown to more than 2,800 employees supporting eight laboratories in over 20 locations around the country and performing more than $700 million of problem-solving research annually for government and industry. GTRI&#39;s renowned researchers combine science, engineering, economics, policy, and technical expertise to solve complex problems for the U.S.
federal government, state, and industry.</p></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div>]]></body>  <author>Michelle Gowdy</author>  <status>1</status>  <created>1649688495</created>  <gmt_created>2022-04-11 14:48:15</gmt_created>  <changed>1649689011</changed>  <gmt_changed>2022-04-11 14:56:51</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The Agricultural Technology Research Program (ATRP) at the Georgia Tech Research Institute (GTRI) is leading efforts to equip Georgia's agribusiness and food processing industries to remain competitive and at the forefront of the global transformation.]]></teaser>  <type>news</type>  <sentence><![CDATA[The Agricultural Technology Research Program (ATRP) at the Georgia Tech Research Institute (GTRI) is leading efforts to equip Georgia's agribusiness and food processing industries to remain competitive and at the forefront of the global transformation.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2022-04-11T00:00:00-04:00</dateline>  <iso_dateline>2022-04-11T00:00:00-04:00</iso_dateline>  <gmt_dateline>2022-04-11 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[michelle.gowdy@gtri.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>(Interim) Director of Communications</p><p>Michelle Gowdy</p><p>Michelle.Gowdy@gtri.gatech.edu</p><p>404-407-8060</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>657149</item>          <item>657145</item>      </media>  <hg_media>          <item>          <nid>657149</nid>          <type>image</type>          <title><![CDATA[GTRI Researcher Comas Haynes]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Comas Haynes.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Comas%20Haynes.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Comas%20Haynes.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Comas%2520Haynes.jpg?itok=twgFCa9s]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1649688070</created>          <gmt_created>2022-04-11 14:41:10</gmt_created>          <changed>1649688070</changed>          <gmt_changed>2022-04-11 14:41:10</gmt_changed>      </item>          <item>          <nid>657145</nid>          <type>image</type>          <title><![CDATA[GTRI Research Engineer Konrad Ahlin]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[GTRI Research Engineer Konrad Ahlin.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/GTRI%20Research%20Engineer%20Konrad%20Ahlin.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/GTRI%20Research%20Engineer%20Konrad%20Ahlin.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/GTRI%2520Research%2520Engineer%2520Konrad%2520Ahlin.jpg?itok=lzRvtJID]]></image_740>            <image_mime>image/jpeg</image_mime>            
<image_alt><![CDATA[]]></image_alt>                    <created>1649687732</created>          <gmt_created>2022-04-11 14:35:32</gmt_created>          <changed>1649687732</changed>          <gmt_changed>2022-04-11 14:35:32</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1276"><![CDATA[Georgia Tech Research Institute (GTRI)]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="42901"><![CDATA[Community]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="42901"><![CDATA[Community]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="365"><![CDATA[Research]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="166902"><![CDATA[science and technology]]></keyword>          <keyword tid="145251"><![CDATA[virtual reality]]></keyword>          <keyword tid="670"><![CDATA[atrp]]></keyword>          <keyword tid="13059"><![CDATA[Agricultural Technology Research Program]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="669"><![CDATA[agriculture]]></keyword>          <keyword tid="190338"><![CDATA[impacting the state]]></keyword>          <keyword tid="190339"><![CDATA[Georgia impact]]></keyword>          <keyword tid="57811"><![CDATA[food processing]]></keyword>          <keyword tid="1464"><![CDATA[Georgia Research Alliance]]></keyword>          <keyword tid="10677"><![CDATA[biosensing]]></keyword>          <keyword tid="668"><![CDATA[poultry]]></keyword>          <keyword tid="148381"><![CDATA[vr]]></keyword>          <keyword tid="342"><![CDATA[Georgia]]></keyword>          <keyword tid="57801"><![CDATA[poultry processing]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="651725">  <title><![CDATA[How to Make an Exosuit that Helps with Awkward Lifts]]></title>  <uid>35899</uid>  <body><![CDATA[<p>In the last few years, mechanically assistive exosuits, long depicted in works of popular science fiction and film, have finally started to see commercial deployment, according to <a href="https://www.me.gatech.edu/faculty/young">Aaron Young</a>, professor in the George W. Woodruff School of Mechanical Engineering at Georgia Tech. Most of these exosuits have a so-called passive design, assisting the wearer with unpowered elements like springs.&nbsp;</p><p>Active exosuits that incorporate electronics and powered motors are yet to be broadly applied. They tend to be big and heavy, and rely on rigid exoskeletons to transfer weight from body to ground. Exoskeletons add a great deal of stiffness, as well, Young said. 
Putting on most active exosuits is a little like becoming one with a forklift, restricting a wearer to lifting weights in a vertical plane.</p><p>For all these reasons, Young&rsquo;s Asymmetric Back eXosuit (ABX) described in the <a href="https://ieeexplore.ieee.org/document/9559874">October 5 issue of IEEE Transactions on Robotics</a> is highly non-standard. There&rsquo;s no exoskeleton, no rigid structure, nothing that makes contact with the floor. If the wearer is just standing there, it does nothing except add 14 pounds to their legs. But if they raise their body from a leaning-over position, it makes a somewhat frantic noise: that is the sound of the ABX helping them rotate and twist their torso.&nbsp;</p><p>Although most active exosuits support vertical lifts, rotating and twisting movements are also ubiquitous, especially in certain fields of manual labor like garbage collection and baggage handling. In many cases, these motions can be awkward and strenuous, leading to work-related injuries as well as back pain, according to Young. Back pain, in turn, is directly correlated with the strength of compressive forces and shear forces that are applied to the spine.</p><p>In designing their exosuit, the researchers sought a way to reduce these loads on the spinal joints. Putting it on looks a little like donning a futuristic backpack. A motor is first strapped onto the back of each upper thigh. Each motor is then connected by its own cable to the back of the opposite shoulder, making for two cables that diagonally overlap. The exosuit provides assistance by applying tension to the cables when it detects the wearer rising from a bent posture.</p><p>&ldquo;It&#39;s definitely a different sensation than a sort of standard exoskeleton. It&#39;s not your standard design,&rdquo; said Young.&nbsp;</p><p>Because the diagonal cables have a component of motion that is horizontal, they exert a pull on the torso that can aid in twisting it from side to side. In tests, the researchers showed that when a wearer of the ABX swung a weight from the ground to one side, the exosuit reduced their back muscle activations by an average of 16%, as measured by electromyography (EMG) sensors. The exosuit also provided a 37% reduction in back muscle exertion when a wearer lifted weights symmetrically, straight off the ground &ndash; an assistance level comparable to more rigid designs.&nbsp;</p><p>&ldquo;People definitely felt like the technology is assisting them, which is great. And we did see the concurrent EMG reduction,&rdquo; said Young. &ldquo;I think it&rsquo;s a great first step.&rdquo;</p><p>In a sense, wearing the exosuit is almost like strapping two additional muscles onto the body &ndash; unconventional muscles, which run directly from back to leg. Interestingly, it is the positioning of these muscles rather than their brute strength that makes them functional, said Young.</p><p>The motors pull the cables with much less power than the muscles in the body. However, the cables are positioned much further away from the joints. Through this positioning, the cables obtain greater leverage and mechanical advantage, allowing the wearer to reduce their overall muscular output and hence the load that they place on their spine. (Spinal loading was not directly measured in the study.)</p>
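<p>The leverage argument reduces to torque-balance arithmetic: torque is force times moment arm, so a cable routed farther from the spine can supply the same extension torque with far less force than muscles acting on a short internal moment arm. The numbers in the Python sketch below are invented for illustration and are not taken from the paper.</p><pre>
# Illustrative moment-arm arithmetic (assumed values, not measured ones).
back_muscle_arm = 0.05   # m, a typical erector spinae moment arm (assumed)
cable_arm = 0.20         # m, cable routed over the back surface (assumed)
required_torque = 60.0   # N*m of assistive extension torque (assumed)

muscle_force = required_torque / back_muscle_arm   # 1200 N
cable_force = required_torque / cable_arm          # 300 N

print(f"muscle force for {required_torque} N*m: {muscle_force:.0f} N")
print(f"cable force for  {required_torque} N*m: {cable_force:.0f} N")
# Every newton delivered through the cable is a newton the back muscles --
# and therefore the compressed spine -- no longer have to supply.
</pre>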
<p>Aside from its overall performance, it is the flexible, asymmetric nature of the suit that really makes it unique, Young said. There are currently no other active exosuits that provide assistance for twisting and rotating through a comparable range of motion. While other exosuits also use cables, none have arranged them along diagonal lines.</p><p>Young is currently seeking collaborations with industry partners to further develop the exosuit. In future work, he sees its control system as one area to improve. Currently, when a person raises their torso from a lowered position, the cables simply pull with constant tension. But it should be possible to make the system detect different actions of the wearer and adjust its pull in response.</p><p><strong>References</strong></p><p>J. M. Li, D. D. Molinaro, A. S. King, A. Mazumdar and A. J. Young, &quot;Design and Validation of a Cable-Driven Asymmetric Back Exosuit,&quot; in IEEE Transactions on Robotics, doi: 10.1109/TRO.2021.3112280.</p><p><strong>About Georgia Tech</strong></p><p>The Georgia Institute of Technology, or Georgia Tech, is a top 10 public research university developing leaders who advance technology and improve the human condition. The Institute offers business, computing, design, engineering, liberal arts, and sciences degrees. Its nearly 40,000 students, representing 50 states and 149 countries, study at the main campus in Atlanta, at campuses in France and China, and through distance and online learning. As a leading technological university, Georgia Tech is an engine of economic development for Georgia, the Southeast, and the nation, conducting more than $1 billion in research annually for government, industry, and society.</p>]]></body>  <author>Mordechai Rorvig</author>  <status>1</status>  <created>1634240968</created>  <gmt_created>2021-10-14 19:49:28</gmt_created>  <changed>1634317540</changed>  <gmt_changed>2021-10-15 17:05:40</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[New exosuit invented by Georgia Tech researchers reduces muscular exertion required for rotating and twisting motions.]]></teaser>  <type>news</type>  <sentence><![CDATA[New exosuit invented by Georgia Tech researchers reduces muscular exertion required for rotating and twisting motions.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2021-10-14T00:00:00-04:00</dateline>  <iso_dateline>2021-10-14T00:00:00-04:00</iso_dateline>  <gmt_dateline>2021-10-14 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[mrorvig@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Mordechai Rorvig<br />Senior Science Writer<br />Georgia Institute of Technology</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>651722</item>      </media>  <hg_media>          <item>          <nid>651722</nid>          <type>image</type>          <title><![CDATA[Aaron Young 001]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[BexoStill_padded.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/BexoStill_padded.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/BexoStill_padded.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/BexoStill_padded.jpg?itok=QDVMnjHj]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1634240470</created>   
       <gmt_created>2021-10-14 19:41:10</gmt_created>          <changed>1634317475</changed>          <gmt_changed>2021-10-15 17:04:35</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39451"><![CDATA[Electronics and Nanotechnology]]></term>          <term tid="39461"><![CDATA[Manufacturing, Trade, and Logistics]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="651600">  <title><![CDATA[Biomolecular Engineer Wins Grant to Make Microorganism-Inspired Machines]]></title>  <uid>35899</uid>  <body><![CDATA[<p>What do the cylinder in an internal combustion engine and the four-millimeter-long creature, <em>Spirostomum ambiguum</em>, have in common? Surprisingly, quite a bit. Both are similarly shaped. Both shrink to a fraction of their size in an instant. And both release about the same amount of power output per cubic centimeter in volume. But for all we know about the engine, we know relatively little about the living organism.</p><p>Saad Bhamla, a professor in the School of Chemical and Biomolecular Engineering at Georgia Tech, recently received an <a href="https://reporter.nih.gov/search/Oknss65S00GZ6zWXumkEyw/project-details/10273361">Outstanding Investigator Award</a> from the National Institute of General Medical Sciences, part of the National Institutes of Health, to continue&nbsp;studying&nbsp;<em>Spirostomum</em> and attempt to build machines based on similar principles. The grant will provide his research group with $1.98 million in funding over five years.&nbsp;</p><p>For Bhamla, the comparison between the organism and the engine is more than just an analogy. He is now working to build something directly akin to a micro-engine, with pistons and cylinders made out of synthetic cells similar to <em>Spirostomum</em>.&nbsp;</p><p>&ldquo;That&#39;s basically the stuff of my dreams,&rdquo; Bhamla said.&nbsp;</p><p>Once built, he believes that these molecular engines might prove far more efficient than other miniaturized power sources. The chief difficulty will be making a synthetic cell that functions like <em>Spirostomum</em>, Bhamla said. Today, most synthetic cells do very different things, like&nbsp;producing lab-grown meat.&nbsp;</p><p>&ldquo;We still think of them as basically bags of fluid,&rdquo; said Bhamla. &ldquo;They don&#39;t move, they just hang around in test tubes.&rdquo;</p><p>Over the last few years, Bhamla and colleagues have learned more about how <em>Spirostomum</em> works. 
Its capabilities come from its use of an unconventional fuel, calcium, rather than adenosine triphosphate (ATP), the molecule that powers most human cells.</p><p>In a <a href="https://www.biorxiv.org/content/10.1101/854836v1.full">preprint</a> from 2019, Bhamla and Xinjing Xu, then an undergraduate student at Georgia Tech, figured out exactly what makes the organism contract. They found that when calcium binds to <em>Spirostomum&rsquo;s</em> skeletal mesh, it forces each cell of the skeleton to coil tight.</p><p>One of Bhamla&rsquo;s current doctoral students, Xiangting Lei, is already examining how to replicate this mechanism in a synthetic cell. She is investigating how to give the cell external triggers so that engineers can make it contract whenever they want. Bhamla plans to use the funds from the grant to hire several more graduate students to study these systems.</p><p>The goal is to create modern versions of what were historically known as mechanochemical&nbsp;machines. A rich literature on these chemically powered machines had been created in the sixties, only to be forgotten, Bhamla said. It seemed to be a classic case of science getting ahead of itself.&nbsp;</p><p>&ldquo;They didn&#39;t have the right optical tools and soft materials to do this,&rdquo; said Bhamla. &ldquo;This is a great time to revisit [the research] because I think this time, we might be able to have much more success.&rdquo;&nbsp;</p><p><strong>About Georgia Tech</strong></p><p>The Georgia Institute of Technology, or Georgia Tech, is a top 10 public research university developing leaders who advance technology and improve the human condition. The Institute offers business, computing, design, engineering, liberal arts, and sciences degrees. Its nearly 40,000 students, representing 50 states and 149 countries, study at the main campus in Atlanta, at campuses in France and China, and through distance and online learning. 
As a leading technological university, Georgia Tech is an engine of economic development for Georgia, the Southeast, and the nation, conducting more than $1 billion in research annually for government, industry, and society.</p>]]></body>  <author>Mordechai Rorvig</author>  <status>1</status>  <created>1634046551</created>  <gmt_created>2021-10-12 13:49:11</gmt_created>  <changed>1634133951</changed>  <gmt_changed>2021-10-13 14:05:51</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[To make a micro-robot that moves, look to what nature does, first.]]></teaser>  <type>news</type>  <sentence><![CDATA[To make a micro-robot that moves, look to what nature does, first.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2021-10-12T00:00:00-04:00</dateline>  <iso_dateline>2021-10-12T00:00:00-04:00</iso_dateline>  <gmt_dateline>2021-10-12 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[mrorvig@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Mordechai Rorvig<br />Senior Science Writer<br />Georgia Institute of Technology</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>651598</item>      </media>  <hg_media>          <item>          <nid>651598</nid>          <type>image</type>          <title><![CDATA[Saad Bhamla 001]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[DSC_3036.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/DSC_3036.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/DSC_3036.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/DSC_3036.jpg?itok=46Y2srfx]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1634045637</created>          <gmt_created>2021-10-12 13:33:57</gmt_created>          <changed>1634045637</changed>          <gmt_changed>2021-10-12 13:33:57</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="141"><![CDATA[Chemistry and Chemical Engineering]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="149"><![CDATA[Nanotechnology and Nanoscience]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="141"><![CDATA[Chemistry and Chemical Engineering]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="149"><![CDATA[Nanotechnology and Nanoscience]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39451"><![CDATA[Electronics and Nanotechnology]]></term>        
  <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="650598">  <title><![CDATA[A Glimpse into the Peach Orchard of the Future]]></title>  <uid>35832</uid>  <body><![CDATA[<p>Peaches, not surprisingly, pack a punch for Georgia&#39;s economy.</p><p>Over 130 million pounds of peaches are produced in Georgia per year, and the Southern staple has a total farm gate value of over $71 million, according to recent <a href="https://extension.uga.edu/topic-areas/fruit-vegetable-ornamentals-production/peaches.html">estimates</a>.</p><p>But cultivating peaches is a complex and manually-intensive process that has put a strain on many farms stretched for time and workers. To solve this problem, the Georgia Tech Research Institute (GTRI) has developed an intelligent robot that is designed to handle the human-based tasks of thinning and pruning peach trees, which could result in significant cost savings for peach farms in Georgia. &nbsp;&nbsp;</p><p>&quot;Most folks are familiar with the harvesting of fruit and picking it up at the market,&quot; said Ai-Ping Hu, a GTRI senior research engineer who is leading the robot design project. &quot;But there&#39;s actually a lot more stuff that gets done before that point in the cultivation cycle.&quot;</p><p>By using a LIDAR remote sensing system &ndash; which determines distances by targeting an object with a laser and measuring the amount of time it takes for the laser beam to reflect back &ndash; and a highly-specialized GPS technology that measures locations as specific as a fraction of an inch, the robot is able to self-navigate through peach orchards while steering clear of obstacles. Once at a peach tree, the robot uses an embedded 3D camera to determine which peaches need to be removed<s>,</s> and removes the peaches using a claw-like device, known as an end effector, that is connected to the end of its arm.&nbsp; &nbsp;</p><p>The robot specifically addresses two key components of the peach cultivation cycle: tree pruning and tree thinning.&nbsp;</p><p>Tree pruning refers to the selective removal of branches prior to the spring growing season, which typically occurs from mid-May to early August, and serves many purposes &ndash; including exposing more interior surface areas of the fruit trees to sunlight and removing undesired older growth to enable new growth to better thrive.</p><p>Tree thinning, meanwhile, is when small or undeveloped peaches, known as peachlets, are removed from peach trees to allow for bigger and better peaches to grow, Hu explained.&nbsp;</p><p>&quot;If you just let all the peaches grow to maturity, then what you&#39;ll end up getting is a tree of really small peaches,&quot; Hu said. &quot;What you want to do is have relatively few peaches, but you want the ones that remain to be nice and big and sweet &ndash; ones you can actually sell.&quot;</p><p>But so far, there are no robots on the market that have been able to fully replace humans in the peach cultivation due to peach orchards&#39; unstructured environments, which includes unpredictable weather, uneven terrain, and trees&#39; different shapes and sizes, Hu noted.</p><p>&quot;In an orchard, no two trees are ever the same,&quot; Hu added. 
<p>&quot;In an orchard, no two trees are ever the same,&quot; Hu added. &quot;You could have a sunny day or a really cloudy day &ndash; that&#39;s going to affect the way the technology on the robot can operate.&quot;</p><p>&quot;There&#39;s no robot in the world right now that can harvest or thin peaches as well as people can,&quot; Hu said. &quot;The technology&#39;s not quite there yet.&quot;</p><p>So far, efforts to automate the harvesting of peaches and other specialty crops have not been as successful as advancements in commodity crop automation, where machines can harvest hundreds of acres of a crop at a time. Commodity crops include items such as corn, wheat, and soybeans.&nbsp;</p><p>&quot;Specialty crops are still very reliant on manual labor,&quot; Hu said. &quot;That&#39;s really different from something like wheat, where one person driving a combine can harvest hundreds of acres. Whereas for [peach harvesting], because everything is so individualized, so unique, it&#39;s really been difficult to automate.&quot;</p><p>To address these unique issues, GTRI is exploring ways to incorporate artificial intelligence and deep learning training methods to improve the robot&#39;s image classification abilities and overall performance. GTRI has also partnered with Dario Chavez, an associate professor in the Department of Horticulture at the University of Georgia Griffin Campus in Griffin, Ga., to further explore the intelligent automation of peach farming.</p><p>Gary McMurray, a GTRI principal research engineer and division chief of GTRI&#39;s Intelligent Sustainable Technologies Division, said the novel robot stands to transform the fruit cultivation process for many farms that have struggled to grow trees that are strong enough to withstand unpredictable environmental conditions.&nbsp;</p><p>&quot;This is something that directly affects the yield of the trees,&quot; McMurray said. 
&quot;It&#39;s money in the pocket of the growers.&quot;</p><p>&nbsp;</p><p><em>Writer: Anna Akins </em></p><p><em>Photo Credit: Ai-Ping Hu </em></p>]]></body>  <author>Michelle Gowdy</author>  <status>1</status>  <created>1631196223</created>  <gmt_created>2021-09-09 14:03:43</gmt_created>  <changed>1631721495</changed>  <gmt_changed>2021-09-15 15:58:15</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The Georgia Tech Research Institute (GTRI) is developing a robot that utilizes deep learning to automate certain aspects of the peach cultivation process, which could be a boon for many Georgia peach farms grappling with a shortage of workers.]]></teaser>  <type>news</type>  <sentence><![CDATA[The Georgia Tech Research Institute (GTRI) is developing a robot that utilizes deep learning to automate certain aspects of the peach cultivation process, which could be a boon for many Georgia peach farms grappling with a shortage of workers.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2021-09-14T00:00:00-04:00</dateline>  <iso_dateline>2021-09-14T00:00:00-04:00</iso_dateline>  <gmt_dateline>2021-09-14 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[michelle.gowdy@gtri.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>(Interim) Director of Communications</p><p>Michelle Gowdy</p><p>Michelle.Gowdy@gtri.gatech.edu</p><p>404-407-8060</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>650597</item>          <item>650594</item>      </media>  <hg_media>          <item>          <nid>650597</nid>          <type>image</type>          <title><![CDATA[GTRI's Peachy Robot Utilizes Deep Learning to Automate Certain Aspects of the Peach Cultivation Process]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Peach robot pic 1.JPG]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Peach%20robot%20pic%201.JPG]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Peach%20robot%20pic%201.JPG]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Peach%2520robot%2520pic%25201.JPG?itok=W-Io5ucQ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1631195426</created>          <gmt_created>2021-09-09 13:50:26</gmt_created>          <changed>1631195426</changed>          <gmt_changed>2021-09-09 13:50:26</gmt_changed>      </item>          <item>          <nid>650594</nid>          <type>image</type>          <title><![CDATA[GTRI's Peachy Robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Peach robot pic 5 .jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Peach%20robot%20pic%205%20.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Peach%20robot%20pic%205%20.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Peach%2520robot%2520pic%25205%2520.jpg?itok=ov0aatJj]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1631191929</created>          
<gmt_created>2021-09-09 12:52:09</gmt_created>          <changed>1631191929</changed>          <gmt_changed>2021-09-09 12:52:09</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1276"><![CDATA[Georgia Tech Research Institute (GTRI)]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="1317"><![CDATA[News Briefs]]></group>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="42901"><![CDATA[Community]]></category>          <category tid="131"><![CDATA[Economic Development and Policy]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="42901"><![CDATA[Community]]></term>          <term tid="131"><![CDATA[Economic Development and Policy]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="365"><![CDATA[Research]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="166902"><![CDATA[science and technology]]></keyword>          <keyword tid="290"><![CDATA[Economy]]></keyword>          <keyword tid="171151"><![CDATA[State of Georgia]]></keyword>          <keyword tid="109581"><![CDATA[deep learning]]></keyword>          <keyword tid="6503"><![CDATA[automation]]></keyword>          <keyword tid="188822"><![CDATA[peach robot]]></keyword>          <keyword tid="188823"><![CDATA[peach farm]]></keyword>          <keyword tid="111431"><![CDATA[lidar]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="166890"><![CDATA[sustainability]]></keyword>          <keyword tid="1033"><![CDATA[Economic Impact]]></keyword>          <keyword tid="23681"><![CDATA[Food Processing Technology]]></keyword>          <keyword tid="188834"><![CDATA[agritech]]></keyword>          <keyword tid="669"><![CDATA[agriculture]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="650214">  <title><![CDATA[The Mechanics of Pellet-Carrying Honey Bees]]></title>  <uid>27560</uid>  <body><![CDATA[<p>New research led by Georgia Tech&rsquo;s College of Engineering finds that honey bees have developed a way to transform pollen particles into a viscoelastic pellet, allowing them to transport pollen efficiently, quickly, and reliably to their hive.&nbsp;The study also suggests the insects remove pollen from their bodies at speeds 2-10 times slower than their typical grooming speeds.</p><p>To collect and transport pollen, honey bees mix pollen particles with regurgitated nectar and form it into a pellet, which clings to each of their hind legs. 
The honey bees then deposit the pellets into a cell within the hive by carefully scraping them off using their other legs.&nbsp;</p><p>The study, from the lab of&nbsp;<a href="https://www.me.gatech.edu/" rel="noreferrer" target="_blank">George W. Woodruff School of Mechanical Engineering</a>&nbsp;Professor&nbsp;<a href="https://www.me.gatech.edu/faculty/hu" rel="noreferrer" target="_blank">David Hu</a>, sought to better understand the mechanics of this process, which could inspire new ways to manufacture and manipulate soft materials. Hu holds a joint appointment in the&nbsp;<a href="https://biosciences.gatech.edu/" rel="noreferrer" target="_blank">School of Biological Sciences</a>.</p><p>The paper, &ldquo;<a href="https://royalsocietypublishing.org/doi/abs/10.1098/rsif.2021.0549?af=R" rel="noreferrer" target="_blank">Biomechanics of Pollen Removal By the Honey Bee</a>,&rdquo; is published in the Journal of the Royal Society Interface.&nbsp;<br /><br />&ldquo;We measured the viscoelastic material properties of a pollen pellet,&rdquo; said Marguerite Matherne, a recent Georgia Tech mechanical engineering Ph.D. graduate who now teaches at Northeastern University. &ldquo;We found that the pellets have a really long relaxation time, which means they remain mostly in a solid form during the transport process. This is good because it keeps the pellet from melting or falling apart from vibration during flight.&rdquo;</p><p>Matherne and the Georgia Tech research team also tried to replicate how honey bees remove the pellets from their hind legs in the lab. They built a device that scraped adhered pollen pellets from bee legs. The invention produced two discoveries. The first was that the honey bees were much more efficient in removing the pellet than the scraping device they built (the device left much more pollen residue on the leg). They also found that slower removal speeds reduce the force and work required to remove pellets under shear stress.&nbsp;</p><p>&ldquo;If you remove it slowly, you can avoid applying the excessive force required to remove it quickly,&rdquo; said Hu, Matherne&rsquo;s former Georgia Tech advisor. &ldquo;Removing a pollen pellet is like the opposite of ripping off a Band-Aid.&rdquo;<br /><br />Matherne said that there are two key components to the efficiency of the honey bees transporting these pellets. First, the pellets are gooey, allowing them to stick to the hind legs. But, she said, the bees also have a special structure on their legs called the corbicula. It&rsquo;s fringed with long, curved hairs that become embedded in the pellet, allowing for adhesion.</p><p>In addition, honey bees can collect pollen particles in various shapes and sizes, while also developing a way to transport them. This is different from other species of bees, which only collect and carry specific types of pollen that are similar in size. Those species also use different transport techniques.</p>
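<p>The &ldquo;really long relaxation time&rdquo; finding quoted above can be illustrated with the simplest viscoelastic model, a Maxwell element, in which stress under a fixed strain decays as &sigma;(t) = &sigma;&#8320;e^(&minus;t/&tau;). If the relaxation time &tau; is much longer than a foraging flight, the pellet barely relaxes and behaves like a solid in transit. The parameters below are invented for illustration, not measured values from the study.</p><pre>
# Toy Maxwell-model relaxation: fraction of stress remaining after time t.
import math

def stress_remaining(t_s: float, tau_s: float) -> float:
    """sigma(t)/sigma0 = exp(-t/tau) for a Maxwell element."""
    return math.exp(-t_s / tau_s)

flight = 20 * 60                  # a 20-minute foraging trip, in seconds
for tau in (60, 3600, 86400):     # short vs. long relaxation times (assumed)
    print(f"tau = {tau:>6} s -> {stress_remaining(flight, tau):.0%} retained")
</pre>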
<p>&ldquo;Honey bees collect from flowers miles and miles away,&rdquo; said Hu. &ldquo;The pollen can change in size by a factor of 10. They must collect all these individual particles and bring it back to one place. And they must do a dozen foraging trips each day, all while keeping their bodies clean. They solve it all by this special method they created to exploit the pellet&rsquo;s soft material properties.&rdquo;</p><p>The research team believes further studies could lead to new developments in medical patches or fastener applications for soft materials.</p><p>&ldquo;It&rsquo;s kind of like smart gooey Velcro for soft materials,&rdquo; said Hu. &ldquo;It could be a fastener and it knows when you&rsquo;re trying to remove it so that you don&rsquo;t have to use an excessive amount of force.&rdquo;</p><p>Matherne suggests that it&rsquo;s also important to understand the pollinating process since 35% of the world&rsquo;s crop production depends on pollinators.</p><p>&ldquo;Honey bees are really important pollinators,&rdquo; said Matherne. &ldquo;If we want to create a world where we can keep up our pollinators, I think it&rsquo;s important to understand exactly what they&rsquo;re doing.&rdquo;</p><p>CITATION: Matherne, M., et al., &quot;Biomechanics of pollen pellet removal by the honey bee.&quot; (Journal of the Royal Society Interface)&nbsp;<a href="https://doi.org/10.1098/rsif.2021.0549" rel="noreferrer" target="_blank">https://doi.org/10.1098/rsif.2021.0549</a></p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1630340222</created>  <gmt_created>2021-08-30 16:17:02</gmt_created>  <changed>1630378025</changed>  <gmt_changed>2021-08-31 02:47:05</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Honey bees have developed a way to transform pollen particles into a viscoelastic pellet.]]></teaser>  <type>news</type>  <sentence><![CDATA[Honey bees have developed a way to transform pollen particles into a viscoelastic pellet.]]></sentence>  <summary><![CDATA[<p>New research led by Georgia Tech&rsquo;s College of Engineering finds that honey bees have developed a way to transform pollen particles into a viscoelastic pellet, allowing them to transport pollen efficiently, quickly, and reliably to their hive.&nbsp;The study also suggests the insects remove pollen from their bodies at speeds 2-10 times slower than their typical grooming speeds.</p>]]></summary>  <dateline>2021-08-30T00:00:00-04:00</dateline>  <iso_dateline>2021-08-30T00:00:00-04:00</iso_dateline>  <gmt_dateline>2021-08-30 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Understanding how honey bees transport pollen pellets to their hive may inspire new ways to manufacture and manipulate soft materials]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[candler.hobbs@coe.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Candler Hobbs<br />College of Engineering<br />candler.hobbs@coe.gatech.edu</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>650215</item>      </media>  <hg_media>          <item>          <nid>650215</nid>          <type>image</type>          <title><![CDATA[Honey Bee Pollen Pellet]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[1024px-Godvor.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/1024px-Godvor.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/1024px-Godvor.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/1024px-Godvor.jpeg?itok=8p9VpnjC]]></image_740>            <image_mime>image/jpeg</image_mime>            
<image_alt><![CDATA[Honey bee on flower]]></image_alt>                    <created>1630340340</created>          <gmt_created>2021-08-30 16:19:00</gmt_created>          <changed>1630340340</changed>          <gmt_changed>2021-08-30 16:19:00</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.news.gatech.edu/news/2017/03/28/hair-spacing-keeps-honeybees-clean-during-pollination]]></url>        <title><![CDATA[Hair Spacing Keeps Honeybees Clean During Pollination]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1237"><![CDATA[College of Engineering]]></group>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="1275"><![CDATA[School of Biological Sciences]]></group>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="8862"><![CDATA[Student Research]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="8862"><![CDATA[Student Research]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="167936"><![CDATA[Soft materials]]></keyword>          <keyword tid="215"><![CDATA[manufacturing]]></keyword>          <keyword tid="20121"><![CDATA[biologically inspired design]]></keyword>          <keyword tid="166882"><![CDATA[School of Biological Sciences]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39471"><![CDATA[Materials]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="650136">  <title><![CDATA[Shreyes Melkote Appointed Novelis Innovation Hub Executive Director at Georgia Tech ]]></title>  <uid>34602</uid>  <body><![CDATA[<p>Georgia Institute of Technology and Novelis, Inc., the world leader in aluminum rolling and recycling, announced that Shreyes Melkote will serve as the new executive director of the Novelis Innovation Hub at Georgia Tech.</p><p>As Melkote assumes his appointment, Georgia Tech commends George W. 
Woodruff School of Mechanical Engineering Regents Professor Surya Kalidindi&rsquo;s service as the inaugural interim executive director during the Novelis Innovation Hub&rsquo;s first two years.</p><p>Since its establishment in 2019, the Novelis Innovation Hub has set a bold vision to foster world-class partnerships and has collaborated with the Institute on battery research, electronics, robotics, high-throughput research, and additive manufacturing.</p><p><strong>Advancing Mobility and Sustainability Goals</strong></p><p>With additional investment and a permanent leadership appointment to guide the Innovation Hub, Novelis hopes to further advance its position in the aluminum industry through innovation in new technology and application domains, including sustainable mobility, electronics, advanced manufacturing, and supply chains.</p><p>&ldquo;Sustainability is an important element of what Novelis wants to accomplish,&rdquo; said Melkote, noting Novelis&rsquo;s target to reduce its carbon footprint by 30% by 2026 and to be net carbon neutral by 2050. &ldquo;Georgia Tech is focused on a lot of basic science, technologies, and business practices relevant to enabling a more sustainable enterprise.&rdquo;</p><p>Melkote is uniquely qualified for the role, having led the Georgia Tech-Boeing Strategic University Partnership for the last eight years while serving as associate director of the <a href="http://research.gatech.edu/manufacturing">Georgia Tech Manufacturing Institute (GTMI)</a>. He facilitated the establishment of the Boeing Manufacturing Development Center, an on-campus lab where students and faculty regularly collaborate with a resident Boeing engineer.</p><p>&ldquo;I see this as an opportunity to leverage my experience and knowledge from the Boeing partnership and to expand it. Novelis is engaged in the entire lifecycle of innovation, from early-stage basic research to applied research and commercialization that will impact society at large,&rdquo; said Melkote, who also holds the Morris M. Bryan, Jr. Professorship in Mechanical Engineering at Georgia Tech. He will work closely with Dr. Raj Gopalaswamy, Novelis&rsquo;s global technology director for new domains, who will lead Novelis&rsquo;s engagement with Georgia Tech.</p><p>&ldquo;To keep advancing the aluminum industry toward the circular economy, we must increase the pace of innovation and develop new solutions that demonstrate aluminum&rsquo;s superior sustainability benefits,&rdquo; said Gopalaswamy. &ldquo;Through research partnerships with world-leading institutions like Georgia Tech, we can fulfill the growing needs for aluminum applications that help our customers meet their sustainability goals faster and more efficiently.&rdquo;</p><p>Melkote agreed, adding, &ldquo;What&rsquo;s exciting is that Novelis wants to look at the cutting edge of research and see how they can leverage that knowledge to innovate and develop new products.&rdquo;</p><p>&ldquo;We&rsquo;re thrilled to have Professor Melkote take on this leadership position in our growing collaboration with Novelis,&rdquo; said Julia Kubanek, vice president for Interdisciplinary Research at Georgia Tech. 
&ldquo;He brings substantial experience to this new role, having built Georgia Tech&rsquo;s partnership with Boeing and served as associate director of the Georgia Tech Manufacturing Institute for several years.&rdquo;</p><p>Kubanek added that Melkote is well positioned to help Novelis broaden its relationship with Georgia Tech faculty and students, while engaging in key research areas to accelerate Novelis&rsquo;s product innovation. Additionally, the Innovation Hub intends not only to fund research, but also to establish a Scholars Program that funds research fellowships for Georgia Tech graduate and undergraduate students.</p><p>&ldquo;Novelis&rsquo;s philanthropic commitment allows us to innovate on the educational front, where we can make investments that benefit both Georgia Tech and our educational mission,&rdquo; said Melkote. &ldquo;In doing so, we help train the next generation of engineers who will go on to work for companies like Novelis that are committed to sustainability.&rdquo;</p><p><em>***</em></p><p><strong>About Georgia Tech </strong></p><p>The Georgia Institute of Technology, or Georgia Tech, is a top 10 public research university developing leaders who advance technology and improve the human condition. The Institute offers business, computing, design, engineering, liberal arts, and sciences degrees. Its nearly 40,000 students, representing 50 states and 149 countries, study at the main campus in Atlanta, at campuses in France and China, and through distance and online learning. As a leading technological university, Georgia Tech is an engine of economic development for Georgia, the Southeast, and the nation, conducting more than $1 billion in research annually for government, industry, and society.</p>]]></body>  <author>Georgia Parmelee</author>  <status>1</status>  <created>1630001655</created>  <gmt_created>2021-08-26 18:14:15</gmt_created>  <changed>1630001655</changed>  <gmt_changed>2021-08-26 18:14:15</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Melkote to help Novelis achieve sustainability, mobility, and future workforce goals ]]></teaser>  <type>news</type>  <sentence><![CDATA[Melkote to help Novelis achieve sustainability, mobility, and future workforce goals ]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2021-08-26T00:00:00-04:00</dateline>  <iso_dateline>2021-08-26T00:00:00-04:00</iso_dateline>  <gmt_dateline>2021-08-26 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[asargent7@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Writer</strong>: Anne Wainscott-Sargent</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>650134</item>      </media>  <hg_media>          <item>          <nid>650134</nid>          <type>image</type>          <title><![CDATA[Melkote headshot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Screen Shot 2021-08-26 at 2.07.22 PM.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Screen%20Shot%202021-08-26%20at%202.07.22%20PM.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Screen%20Shot%202021-08-26%20at%202.07.22%20PM.png]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Screen%2520Shot%25202021-08-26%2520at%25202.07.22%2520PM.png?itok=cga756UW]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Shreyes Melkote headshot]]></image_alt>                    <created>1630001280</created>          <gmt_created>2021-08-26 18:08:00</gmt_created>          <changed>1630001280</changed>          <gmt_changed>2021-08-26 18:08:00</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="155831"><![CDATA[Georgia Tech Manufacturing Institute (GTMI)]]></group>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="144"><![CDATA[Energy]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="144"><![CDATA[Energy]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="186857"><![CDATA[go-gtmi]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39451"><![CDATA[Electronics and Nanotechnology]]></term>          <term tid="39531"><![CDATA[Energy and Sustainable Infrastructure]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="649344">  <title><![CDATA[Rivera-Hernández Wins NASA Grant to Aid Current Mars Rover Missions — and Find ‘Martian Lakes’ for Future Rovers and Crews]]></title>  <uid>34434</uid>  <body><![CDATA[<p>There&rsquo;s a good reason why the&nbsp;<a href="https://mars.nasa.gov/mars2020/">Mars 2020 Mission Perseverance Rover</a>&nbsp;and its mini-copter counterpart&nbsp;<a href="https://mars.nasa.gov/technology/helicopter/">Ingenuity</a>&nbsp;are currently busy exploring the edges of the Jezero Crater on the Red Planet. Water once flowed freely there, as it did eons ago at similar sites on Earth &mdash; and perhaps with it, water-deposited evidence of life deep beneath Jezero&rsquo;s rust-colored boulders and sand.</p><p>Those so-called terrestrial analog sites on Earth helped NASA choose Jezero for the mission. &ldquo;Ancient lake beds are a major target for Mars exploration, as they provide evidence for sustained liquid water in Mars&rsquo; past &mdash; and lake muds commonly preserve biosignatures on Earth,&rdquo; says&nbsp;<a href="https://eas.gatech.edu/people/rivera-hernandez-dr-frances">Frances Rivera-Hern&aacute;ndez</a>, assistant professor in the&nbsp;<a href="https://eas.gatech.edu/">School of Earth and Atmospheric Sciences</a>. 
&ldquo;Thus, if life ever persisted on early Mars, its past presence may be preserved in ancient lake beds.&rdquo;</p><p>Rivera-Hern&aacute;ndez, who joined Georgia Tech in January, will soon get a chance to study another analog site in the Antarctic, thanks to a four-year $700,000 NASA grant awarded to her research proposal, &ldquo;Paleolake deposits in Miers Valley, Antarctica: An analog depositional record for Martian lakes through late Noachian to early Hesperian climatic transitions.&rdquo;</p><p>Just like the drilling and sampling now going on at Jezero Crater on Mars, Rivera-Hern&aacute;ndez&rsquo;s work may help NASA choose future Mars destinations for both robotic rover and crewed missions. That&rsquo;s because Rivera-Hern&aacute;ndez is also a collaborating scientist on NASA&rsquo;s&nbsp;<a href="https://mars.nasa.gov/msl/home/">Curiosity Rover</a>&nbsp;mission. &ldquo;Lessons learned through the Antarctic project will help inform my work on the mission, as we have been characterizing lake bed deposits with the Rover,&rdquo; she says. Since landing on Mars in 2012, Curiosity has traveled nearly 26 km (16.14 miles) within Gale Crater, another probable dry lake.</p><p>&ldquo;I was ecstatic to hear that my grant was funded, and excited to be heading to Antarctica for field work,&rdquo; says Rivera-Hern&aacute;ndez, who will serve as the study&rsquo;s principal investigator. Her co-investigator is Tyler Mackey, an assistant professor at the University of New Mexico. The grant will also provide funding for two graduate students, one from each institution. Field work is planned to start in January 2024.</p><p>&ldquo;Before the field season, we will be performing remote sensing observations of our field site and performing lab-based analyses on modern lake samples to plan for the field work studying ancient lake beds,&rdquo; Rivera-Hern&aacute;ndez says.&nbsp;</p><p><a href="https://planetas.eas.gatech.edu/">Her lab team</a>&nbsp;will study the deposits of a large Antarctic lake that persisted through climate changes 10,000 to 20,000 years ago to better recognize similar changes in ancient lake beds on Mars, like those being explored by Curiosity and Perseverance.&nbsp;</p><p>&ldquo;Currently, liquid water is not stable on the surface of Mars, but we have abundant geologic evidence for the presence of lakes on early Mars, suggesting that Mars&rsquo; climate was different in the past and that it changed through time,&rdquo; Rivera-Hern&aacute;ndez says. &ldquo;But we still do not have a good understanding of whether this climatic transition was abrupt or gradual, or if Mars was significantly warmer when the lakes were present.&rdquo;</p><p>That&rsquo;s an unknown because lakes can form in a variety of climates, she adds. Examples are found in polar regions on Earth, where liquid water exists in lakes with permanent ice covers. &ldquo;However, when ice is present in a lake, there are processes that are unique, and sometimes these produce deposits that may be recorded in lake beds. 
Thus, past climate may be inferred from lake beds if these unique deposits are recognized and distinguished from other deposit types.&rdquo;</p><p>Rivera-Hern&aacute;ndez&rsquo;s project will also help scientists recognize these unique deposits in ancient lake beds on Mars by studying the deposits of that ancient Antarctic lake, which experienced periods with and without an ice cover due to those climatic changes on Earth.</p>]]></body>  <author>Renay San Miguel</author>  <status>1</status>  <created>1628522168</created>  <gmt_created>2021-08-09 15:16:08</gmt_created>  <changed>1628794546</changed>  <gmt_changed>2021-08-12 18:55:46</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Frances Rivera-Hernández and her team will soon head to Antarctica to study an ancient lake bed that may aid in the search for past life on Mars, plus clues to climatic changes]]></teaser>  <type>news</type>  <sentence><![CDATA[Frances Rivera-Hernández and her team will soon head to Antarctica to study an ancient lake bed that may aid in the search for past life on Mars, plus clues to climatic changes]]></sentence>  <summary><![CDATA[<p>School of Earth and Atmospheric Sciences assistant professor Frances Rivera-Hern&aacute;ndez will receive $700,000 over the next four years to study an ancient lake bed in Antarctica &mdash; with the hope&nbsp;of using samples and data to&nbsp;help NASA determine future landing sites for Mars missions.&nbsp;</p>]]></summary>  <dateline>2021-08-12T00:00:00-04:00</dateline>  <iso_dateline>2021-08-12T00:00:00-04:00</iso_dateline>  <gmt_dateline>2021-08-12 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Frances Rivera-Hernández and her team will soon head to Antarctica to study an ancient lake bed that may aid in the search for past life on Mars, plus clues to climatic changes]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[renay.san@cos.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Renay San Miguel<br />Communications Officer II/Science Writer<br />College of Sciences<br />404-894-5209</p><p>&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>649339</item>          <item>649340</item>          <item>649341</item>          <item>649342</item>      </media>  <hg_media>          <item>          <nid>649339</nid>          <type>image</type>          <title><![CDATA[Frances Rivera-Hernández taking field samples in Antarctica in 2015 (Photo Frances Rivera-Hernandez)]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Rivera-Hernandez in Antarctica 2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Rivera-Hernandez%20in%20Antarctica%202.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Rivera-Hernandez%20in%20Antarctica%202.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Rivera-Hernandez%2520in%2520Antarctica%25202.jpg?itok=qGnRj1an]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1628518718</created>          <gmt_created>2021-08-09 14:18:38</gmt_created>          <changed>1628793789</changed>          <gmt_changed>2021-08-12 18:43:09</gmt_changed>      </item>          <item>          <nid>649340</nid>          <type>image</type>          
<title><![CDATA[Miers Valley in Antarctica (Photo Pierre Roudier/Wikimedia)]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Miers Valley Antarctica Photo Pierre Roudier Wikimedia.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Miers%20Valley%20Antarctica%20Photo%20Pierre%20Roudier%20Wikimedia.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Miers%20Valley%20Antarctica%20Photo%20Pierre%20Roudier%20Wikimedia.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Miers%2520Valley%2520Antarctica%2520Photo%2520Pierre%2520Roudier%2520Wikimedia.jpg?itok=4l7uM4nM]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1628518865</created>          <gmt_created>2021-08-09 14:21:05</gmt_created>          <changed>1628518865</changed>          <gmt_changed>2021-08-09 14:21:05</gmt_changed>      </item>          <item>          <nid>649341</nid>          <type>image</type>          <title><![CDATA[Frances Rivera-Hernández]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Frances Rivera-Hernandez.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Frances%20Rivera-Hernandez.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Frances%20Rivera-Hernandez.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Frances%2520Rivera-Hernandez.png?itok=9kwfW83S]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1628519088</created>          <gmt_created>2021-08-09 14:24:48</gmt_created>          <changed>1628793993</changed>          <gmt_changed>2021-08-12 18:46:33</gmt_changed>      </item>          <item>          <nid>649342</nid>          <type>image</type>          <title><![CDATA[Curiosity Rover "selfie" at Mont Mercou, Mars (Photo NASA)]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Curiosity Rover %22selfie%22 at Mont Mercou, Mars (Photo NASA).png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Curiosity%20Rover%20%2522selfie%2522%20at%20Mont%20Mercou%2C%20Mars%20%28Photo%20NASA%29.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Curiosity%20Rover%20%2522selfie%2522%20at%20Mont%20Mercou%2C%20Mars%20%28Photo%20NASA%29.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Curiosity%2520Rover%2520%252522selfie%252522%2520at%2520Mont%2520Mercou%252C%2520Mars%2520%2528Photo%2520NASA%2529.png?itok=NCkhgP6G]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1628519690</created>          <gmt_created>2021-08-09 14:34:50</gmt_created>          <changed>1628519690</changed>          <gmt_changed>2021-08-09 14:34:50</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://planetas.eas.gatech.edu]]></url>        <title><![CDATA[Georgia Tech Planetary Laboratory Analyzing 
Environments, Terrains, and Analogs]]></title>      </link>          <link>        <url><![CDATA[https://coe.gatech.edu/news/2021/02/space-science-week-tech-progress-and-perseverance]]></url>        <title><![CDATA[Space Science Week at Tech: Progress and Perseverance]]></title>      </link>          <link>        <url><![CDATA[https://www.scientificamerican.com/article/summer-on-mars-nasas-perseverance-rover-is-one-of-three-missions-ready-to-launch/]]></url>        <title><![CDATA[Summer on Mars: NASA’s Perseverance Rover Is One of Three Missions Ready to Launch]]></title>      </link>          <link>        <url><![CDATA[https://scitechdaily.com/clues-to-chilly-ancient-mars-buried-in-rocks-discovered-by-nasas-curiosity-rover/]]></url>        <title><![CDATA[Clues to Chilly Ancient Mars Buried in Rocks Discovered by NASA’s Curiosity Rover]]></title>      </link>          <link>        <url><![CDATA[https://www.space.com/curiosity-rover-nine-years-on-mars]]></url>        <title><![CDATA[9 years on Mars! Curiosity rover marks another anniversary]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="364801"><![CDATA[School of Earth and Atmospheric Sciences (EAS)]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="4896"><![CDATA[College of Sciences]]></keyword>          <keyword tid="166926"><![CDATA[School of Earth and Atmospheric Sciences]]></keyword>          <keyword tid="187439"><![CDATA[Frances Rivera-Hernandez]]></keyword>          <keyword tid="82391"><![CDATA[Antarctica]]></keyword>          <keyword tid="182496"><![CDATA[analog sites]]></keyword>          <keyword tid="188445"><![CDATA[Mars missions]]></keyword>          <keyword tid="80341"><![CDATA[curiosity rover]]></keyword>          <keyword tid="188444"><![CDATA[Miers Valley]]></keyword>          <keyword tid="831"><![CDATA[climate change]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>          <term tid="39541"><![CDATA[Systems]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="649133">  <title><![CDATA[Georgia Tech Joins the U.S. National Science Foundation to Advance AI Research and Education]]></title>  <uid>34602</uid>  <body><![CDATA[<p>For decades, the Georgia Institute of Technology has focused on advancing artificial intelligence through interdisciplinary research and education designed to produce leading-edge technologies. 
Over the next five years, Georgia Tech will make a substantial investment in AI that includes hiring an additional 100 researchers in the field, further solidifying its standing as a leader in the teaching and discovery of machine learning.</p><p>Today, Georgia Tech received two National Science Foundation (NSF) Artificial Intelligence Research Institutes awards, totaling $40 million. A third award for $20 million was granted to the Georgia Research Alliance (GRA), with Georgia Tech serving as one of the leading academic institutions.</p><p>&ldquo;It is essential that we bring together our best minds to ensure that AI delivers on its promise to create a more prosperous, sustainable, safe, and fair future for everyone,&rdquo; said&nbsp;&Aacute;ngel Cabrera, president of Georgia Tech.&nbsp;&ldquo;These NSF awards recognize Georgia Tech&rsquo;s vast expertise in machine learning and AI and will help us further develop our resources and amplify our impact in these crucial fields.&rdquo;</p><p>Chaouki T. Abdallah, executive vice president for Research at Georgia Tech, concurred, citing major efforts under development to help create a more robust and inclusive future of AI, both on campus and beyond.</p><p>&ldquo;We are incredibly grateful to the NSF for their investment and excited for the opportunities made possible because of this research,&rdquo; he said. &ldquo;At Tech, our mission is to advance technology and improve the human condition, catalyzing research that matters. We invested in a unified approach to interdisciplinary research aligned with industry relevance and societal impact, and these awards demonstrate a clear return on that strategy.&rdquo;</p><p>Collectively, NSF made a <a href="https://www.nsf.gov/news/news_summ.jsp?cntn_id=303176">$220 million investment in 11 new NSF-led Artificial Intelligence Research Institutes</a>.</p><p>&ldquo;I am delighted to announce the establishment of new NSF National AI Research Institutes as we look to expand into all 50 states,&rdquo; said National Science Foundation Director Sethuraman Panchanathan. &ldquo;These Institutes are hubs for academia, industry, and government to accelerate discovery and innovation in AI. Inspiring talent and ideas everywhere in this important area will lead to new capabilities that improve our lives, from medicine to entertainment to transportation and cybersecurity, and position us in the vanguard of competitiveness and prosperity.&rdquo;</p><p>Led by NSF, and in partnership with the U.S. Department of Agriculture&rsquo;s National Institute of Food and Agriculture, the U.S. Department of Homeland Security, Google, Amazon, Intel, and Accenture, the National AI Research Institutes will act as connections in a broader nationwide network to pursue transformational advances in a range of economic sectors, and science and engineering fields &mdash; from food system security to next-generation edge networks. 
In addition to Georgia Tech and GRA, the University of California San Diego, Duke University, Iowa State University, North Carolina State University, The Ohio State University, and the University of Washington are the lead universities included in the 11 AI Institutes.</p><p><strong>The AI Institutes at Georgia Tech </strong></p><p>The three newly established Institutes will address societal challenges, including home care for aging adults; energy, logistics, and supply chains; sustainability; the widening gap in job opportunities; and changing needs in workforce development.</p><p><a href="https://www.cc.gatech.edu/news/649114/new-ai-institute-builds-tech-support-aging">NSF AI Institute for Collaborative Assistance and Responsive Interaction for Networked Groups (AI-CARING)</a> will seek to create a vibrant discipline focused on personalized, collaborative AI systems that will improve quality of care for the aging. The systems will learn individual models of human behavior and how they change over time and use that knowledge to better collaborate and communicate in caregiving environments. Led by Sonia Chernova, associate professor of interactive computing at Georgia Tech, the institute will build AI systems that help a growing population of older adults sustain independence, improve quality of life, and increase the effectiveness of care coordination across the care network.</p><p>&ldquo;The AI-CARING Institute builds on our existing strengths in AI and in technology for aging. It will create not only novel solutions, but a new generation of researchers focused on the interaction between the two,&rdquo; said Charles Isbell, dean and John P. Imlay Jr. Chair in the College of Computing. &ldquo;Our aim is to build cutting-edge technologies that improve the lives of everyone, and I can&rsquo;t think of a better example than AI-CARING.&rdquo;</p><p><a href="https://www.isye.gatech.edu/news/team-led-isyes-pascal-van-hentenryck-awarded-20m-nsf-grant-fund-center-study-ai-and">NSF AI Institute for Advances in Optimization (AI4Opt)</a> will revolutionize decision-making on a large scale, fusing AI and mathematical optimization into intelligent systems that will achieve breakthroughs that neither field can achieve independently. Additionally, it will create pathways from high school to undergraduate and graduate education and workforce development training for AI in engineering that will empower a generation of underrepresented students and teachers to join the AI revolution. Led by Pascal Van Hentenryck, A. Russell Chandler III chair and professor in the H. Milton Stewart School of Industrial and Systems Engineering at Georgia Tech, AI4Opt will tackle use cases in energy, resilience and sustainability, supply chains, and circuit design and control.</p><p>&ldquo;AI4Opt, with its focus on AI and optimization, will create new pathways for novel tools that allow better engineering applications to benefit society,&rdquo; said Raheem Beyah, dean of Georgia Tech&rsquo;s College of Engineering and Southern Company Chair. 
&ldquo;This will allow engineers to build higher-quality materials, more efficient renewable resources, new computing systems, and more, while also reinforcing the field as a career path for diverse students. The new institute complements the College&rsquo;s commitment to the integration of AI in engineering disciplines.&rdquo;</p><p><a href="https://www.ic.gatech.edu/news/649137/georgia-tech-will-help-bring-critical-advancements-online-learning-part-multimillion">NSF AI Institute for Adult Learning and Online Education (ALOE)</a> will lead the country and the world in the development of novel AI theories and techniques for enhancing the quality of adult online education, making this mode of learning comparable to that of in-person education in STEM disciplines. Together with partners in the technical college systems and educational technology sector, ALOE will advance online learning using virtual assistants to make education more available, affordable, achievable, and ultimately more equitable. This Institute is led by the GRA, with support from Georgia Tech and the University System of Georgia (USG). Ashok Goel, professor in the School of Interactive Computing at Georgia Tech, will serve as executive director.</p><p>&ldquo;Online education for adults has enormous implications for tomorrow&rsquo;s workforce,&rdquo; said Myk Garn, a GRA senior advisor, assistant vice chancellor for New Models of Learning at the USG, and ALOE&rsquo;s principal investigator. &ldquo;Yet, serious questions remain about the quality of online learning and how best to teach adults online. Artificial intelligence offers a powerful technology for dramatically improving the quality of online learning and adult education.&rdquo;</p><p><strong>The Future of AI at Georgia Tech</strong></p><p>Georgia Tech is poised to strategically reimagine the future of AI. Currently, 66% of Georgia Tech undergraduate computer science students have an academic concentration in Intelligence, focusing on top-to-bottom computational models of intelligence. The College of Computing&rsquo;s recently launched Ph.D. program in machine learning pulls from faculty in all six colleges across the Institute, and many new courses are being developed that teach AI as a tool for science and engineering. Georgia Tech is exploring the potential creation of a school or college of AI within the next five years, further building on its expansive AI and machine learning footprint. The NSF AI Institutes awards will enable all AI-related academic programs to scale and further differentiate Georgia Tech as a leader in AI education.</p><p>Additionally, the awards will expand and complement ongoing AI research efforts at the Georgia Tech Research Institute (GTRI). In the last fiscal year, GTRI received millions of dollars in research awards from the Department of Defense and other sponsors for AI-affiliated research, and currently, many GTRI researchers are focused on AI-affiliated projects.</p><p>&ldquo;As part of Georgia Tech, GTRI will greatly benefit from the advances in AI that will be achieved as a result of these NSF-funded Institutes, helping us further excel in our aim to deliver leading-edge AI research that benefits national security,&rdquo; said Mark Whorton, GTRI&rsquo;s chief technology officer. &ldquo;GTRI is one of the nation&rsquo;s leading institutes of applied research for national security specifically because of our deep engagement and close affiliation with the academic units of Georgia Tech. 
AI is a tool we use in conducting larger research objectives, and we believe strongly that these AI Institutes will enable GTRI to put more research into practice.&rdquo;</p><p>&ldquo;Georgia Tech has for decades now been pursuing new AI technologies, and now leads the way in AI that is responsible to the needs of the humans who use it,&rdquo; Isbell said. &ldquo;We have also worked hard to expand access to AI, especially for underrepresented groups. These Institutes will build on that history, expanding both our ability to create new technologies and to train the next generation of innovators. I look forward to watching them grow and develop.&rdquo;</p><p><strong>About the Georgia Institute of Technology</strong></p><p>The Georgia Institute of Technology, or Georgia Tech, is a top 10 public research university developing leaders who advance technology and improve the human condition. The Institute offers business, computing, design, engineering, liberal arts, and sciences degrees. Its nearly 40,000 students, representing 50 states and 149 countries, study at the main campus in Atlanta, at campuses in France and China, and through distance and online learning.&nbsp;As a leading technological university, Georgia Tech is an engine of economic development for Georgia, the Southeast, and the nation, conducting more than $1 billion in research annually for government, industry, and society.</p><p><strong>About the National Science Foundation </strong></p><p>The U.S. National Science Foundation propels the nation forward by advancing fundamental research in all fields of science and engineering. NSF supports research and people by providing facilities, instruments, and funding to support their ingenuity and sustain the U.S. as a global leader in research and innovation. With a fiscal year 2021 budget of $8.5 billion, NSF funds reach all 50 states through grants to nearly 2,000 colleges, universities, and institutions. Each year, NSF receives more than 40,000 competitive proposals and makes about 11,000 new awards. Those awards include support for cooperative research with industry, Arctic and Antarctic research and operations, and U.S. participation in international scientific efforts.</p><p><strong>About the Georgia Research Alliance</strong> </p><p>The Georgia Research Alliance (GRA) helps Georgia&rsquo;s university scientists do more research and start more companies. By expanding research and entrepreneurship capacity at public and private universities, GRA grows the Georgia economy by driving more investment in the state, developing a high-tech workforce, and strengthening Georgia&rsquo;s reputation for innovation.&nbsp;For 30 years, GRA has worked in partnership with the University System of Georgia and the Georgia Department of Economic Development to create the companies and jobs of Georgia&rsquo;s future. 
Visit <a href="https://gra.org/">GRA.org</a> for more information.</p><p>Contact: Georgia Parmelee | <a href="mailto:georgia.parmelee@gatech.edu">georgia.parmelee@gatech.edu</a> | 404.281.7818</p>]]></body>  <author>Georgia Parmelee</author>  <status>1</status>  <created>1627570839</created>  <gmt_created>2021-07-29 15:00:39</gmt_created>  <changed>1628267020</changed>  <gmt_changed>2021-08-06 16:23:40</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Today, Georgia Tech received two National Science Foundation Artificial Intelligence Research Institutes awards, totaling $40 million.]]></teaser>  <type>news</type>  <sentence><![CDATA[Today, Georgia Tech received two National Science Foundation Artificial Intelligence Research Institutes awards, totaling $40 million.]]></sentence>  <summary><![CDATA[<p>Georgia Tech received two National Science Foundation Artificial Intelligence Research Institutes awards, totaling $40 million. Over the next five years, Georgia Tech will make a substantial investment in AI that includes hiring an additional 100 researchers in the field, further solidifying its standing as a leader in the teaching and discovery of machine learning.</p>]]></summary>  <dateline>2021-07-29T00:00:00-04:00</dateline>  <iso_dateline>2021-07-29T00:00:00-04:00</iso_dateline>  <gmt_dateline>2021-07-29 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[georgia.parmelee@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Georgia Parmelee<br />georgia.parmelee@gatech.edu</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>649130</item>          <item>649128</item>          <item>649129</item>      </media>  <hg_media>          <item>          <nid>649130</nid>          <type>image</type>          <title><![CDATA[AI map]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[AI_map.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/AI_map.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/AI_map.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/AI_map.jpg?itok=nqDc08_p]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[map of AI institutes in US]]></image_alt>                    <created>1627568719</created>          <gmt_created>2021-07-29 14:25:19</gmt_created>          <changed>1627568719</changed>          <gmt_changed>2021-07-29 14:25:19</gmt_changed>      </item>          <item>          <nid>649128</nid>          <type>image</type>          <title><![CDATA[PIs for AI Institues]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[nsf graphic-740px[52].jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/nsf%20graphic-740px%5B52%5D.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/nsf%20graphic-740px%5B52%5D.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/nsf%2520graphic-740px%255B52%255D.jpg?itok=Aub9G9uS]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Pascal Van Hentenryck and Sonia 
Chernova]]></image_alt>                    <created>1627568604</created>          <gmt_created>2021-07-29 14:23:24</gmt_created>          <changed>1627576219</changed>          <gmt_changed>2021-07-29 16:30:19</gmt_changed>      </item>          <item>          <nid>649129</nid>          <type>image</type>          <title><![CDATA[Ashok headshot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ashok headshot.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ashok%20headshot.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ashok%20headshot.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ashok%2520headshot.jpg?itok=1l3enwcG]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ashok Goel headshot]]></image_alt>                    <created>1627568645</created>          <gmt_created>2021-07-29 14:24:05</gmt_created>          <changed>1627572766</changed>          <gmt_changed>2021-07-29 15:32:46</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="1276"><![CDATA[Georgia Tech Research Institute (GTRI)]]></group>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="443951"><![CDATA[School of Psychology]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="142"><![CDATA[City Planning, Transportation, and Urban Growth]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="144"><![CDATA[Energy]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="142"><![CDATA[City Planning, Transportation, and Urban Growth]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="144"><![CDATA[Energy]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>          <keyword tid="187023"><![CDATA[go-data]]></keyword>          <keyword tid="188087"><![CDATA[go-irim]]></keyword>          <keyword tid="188084"><![CDATA[go-ipat]]></keyword>          <keyword tid="173894"><![CDATA[ML@GT]]></keyword>      </keywords>  <core_research_areas>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="39531"><![CDATA[Energy and Sustainable Infrastructure]]></term>          <term tid="39481"><![CDATA[National Security]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  
<userdata><![CDATA[]]></userdata></node><node id="648192">  <title><![CDATA[Subterranean Investigations]]></title>  <uid>34528</uid>  <body><![CDATA[<p><a href="https://www.news.ucsb.edu/2021/020333/subterranean-investigations"><em>By Sonia Fernandez (UC Santa Barbara)</em></a></p><p>We&rsquo;ve seen robots take to the air, dive beneath the waves and perform all sorts of maneuvers on land. Now, researchers at UC Santa Barbara and Georgia Institute of Technology are exploring a new frontier: the ground beneath our feet. Taking their cues from plants and animals that have evolved to navigate subterranean spaces, they&rsquo;ve developed a fast, controllable soft robot that can burrow through sand. The technology not only enables new applications for fast, precise and minimally invasive movement underground, but also lays mechanical foundations for new types of robots.</p><p>&ldquo;The biggest challenges with moving through the ground are simply the forces involved,&rdquo; said Nicholas Naclerio, a graduate student researcher in the lab of UC Santa Barbara mechanical engineering professor <a href="https://me.ucsb.edu/people/elliot-hawkes">Elliot Hawkes</a> and lead author of a paper on the cover of the journal <a href="https://robotics.sciencemag.org/content/6/55/eabe2922">Science Robotics</a>. Whereas air and water offer little resistance to objects moving through them, he explained, the subterranean world is another story.</p><p>&ldquo;If you&rsquo;re trying to move through the ground, you have to push the soil, sand or other medium out of the way,&rdquo; Naclerio said.</p><p>Fortunately, the natural world provides numerous examples of underground navigation in the form of plants and fungi that build underground networks and animals that have mastered the ability to tunnel directly through granular media. Gaining a mechanical understanding of how plants and animals have mastered subterranean navigation opens up many possibilities for science and technology, according to <a href="https://physics.gatech.edu/user/daniel-goldman">Daniel Goldman</a>, Dunn Family Professor of Physics at Georgia Tech.</p><p>&ldquo;Discovery of principles by which diverse organisms successfully swim and dig within granular media can lead to development of new kinds of mechanisms and robots that can take advantage of such principles,&rdquo; he said. &ldquo;And reciprocally, development of a robot with such capabilities can inspire new animal studies as well as point to new phenomena in the physics of granular substrates.&rdquo;</p><p>The researchers had a good head start with a vine-like soft robot designed in the Hawkes Lab that mimics plants and the way they navigate by growing from their tips, while the rest of the body remains stationary. In the subterranean setting, tip extension, according to the researchers, keeps resisting forces low and localized only to the growing end; if the whole body moved as it grew, friction over the entire surface would increase as more of the robot entered the sand until the robot could no longer move.</p><p>Burrowing animals, meanwhile, serve as inspiration for an additional strategy called granular fluidization, which suspends the particles in a fluid-like state and allows the animal to overcome the high level of resistance presented by sand or loose soil. The southern sand octopus, for instance, expels a jet of water into the ground, and uses its arms to pull itself into the temporarily loosened sand. 
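<p>A rough, back-of-the-envelope sketch (not the authors&rsquo; analysis) can make the two force arguments above concrete: skin friction on a body pushed from the rear grows with buried length, tip extension pays only the tip&rsquo;s penetration resistance, and a fluidizing air jet like the octopus&rsquo;s water jet is caricatured here as a constant reduction factor. Every constant is hypothetical:</p>
<pre><code># Illustrative toy model of burrowing forces. Assumptions (all made up):
# granular pressure grows linearly with depth, skin friction acts on the
# whole buried body, and fluidization scales tip resistance by a factor.
import math

RHO_G, G = 1500.0, 9.81   # sand bulk density (kg/m^3), gravity (m/s^2)
DEPTH = 0.1               # burrow depth, m
R = 0.01                  # body radius, m
MU = 0.5                  # effective friction coefficient (hypothetical)
K_TIP = 5.0               # tip penetration-resistance factor (hypothetical)
FLUIDIZED = 0.2           # fraction of tip resistance left with the air jet on

pressure = RHO_G * G * DEPTH          # granular pressure at depth, Pa
tip_force = K_TIP * pressure * math.pi * R**2

def push_whole_body(length):
    """Pushing from the back: tip resistance plus skin friction over the
    entire buried length, so force grows as the robot advances."""
    return tip_force + MU * pressure * 2.0 * math.pi * R * length

def grow_from_tip(air_jet=False):
    """Tip extension: the body stays put, so only the tip resists;
    the jet loosens grains ahead of it and cuts that cost further."""
    return tip_force * (FLUIDIZED if air_jet else 1.0)

for L in (0.1, 0.5, 1.0):  # buried length, m
    print(f"L={L:3.1f} m  push: {push_whole_body(L):5.1f} N  "
          f"tip: {grow_from_tip():4.2f} N  tip+jet: {grow_from_tip(True):4.2f} N")
</code></pre>
<p>With these invented numbers, the pushed body&rsquo;s resistance grows by tens of newtons per meter of buried length while the growing tip&rsquo;s stays fixed at a couple of newtons, which is the intuition behind both design choices.</p>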
<p>That jetting ability made its way onto the researchers&rsquo; robot in the form of a tip-based flow device that shoots air into the region just ahead of the growing end, enabling it to move into that area.</p><p>&ldquo;The biggest challenge we found and what took the longest to solve was that when we switched to horizontal burrowing, our robots would always surface,&rdquo; Naclerio said. Whereas gases or liquids evenly flow over and under a traveling symmetric object, he explained, in fluidized sand the distribution of forces is not as balanced and creates a significant lift force for the horizontally traveling robot. &ldquo;It&rsquo;s much easier to push the sand up and out of the way than it is to compact it down.&rdquo;</p><p>To understand the robot&rsquo;s behavior and the largely unexplored physics of air-aided intrusions, the team took drag and lift measurements for different angles of airflow from the tip of a solid rod pushed horizontally into sand.</p><p>&ldquo;Frictional force response in granular materials greatly differs from that of Newtonian fluids, as intruding into sand can compact and stress large swaths of terrain in the direction of motion due to high friction,&rdquo; said Andras Karsai, a graduate student researcher in Goldman&rsquo;s lab. &ldquo;To mitigate this, a low-density fluid that lifts and pushes grains away from an intruder will often reduce the net frictional stress it has to overcome.&rdquo;</p><p>Unlike with gas or liquid, where a downward fluid jet would create lift for the traveling object, in sand the downward air flow reduced the lift forces and excavated the sand below the robot&rsquo;s growing tip. This, combined with inspiration from the sandfish lizard, whose wedge-shaped head favors downward movement, allowed the researchers to modulate the resisting forces and keep the robot moving horizontally without rising out of the sand.</p><p>A small, exploratory, soft robot such as this has a variety of applications where shallow burrowing through dry granular media is needed, such as soil sampling, underground installation of utilities, and erosion control. Tip extension enables changes in direction, while also allowing the body of the robot to modulate how firmly anchored it is in the medium &mdash; control that could become useful for exploration in low-gravity environments. In fact, the team is working on a project with NASA to develop burrowing for the moon or even more distant bodies, like Enceladus, a moon of Saturn.</p><p>&ldquo;We believe burrowing has the potential to open new avenues and enable new capabilities for extraterrestrial robotics,&rdquo; Hawkes said.</p><p>&nbsp;</p><p><em>Research for this paper was conducted also by Mason Murray-Cooper, Yasemin Ozkan-Aydin and Enes Aydin at Georgia Institute of Technology. </em></p><p><em><strong>Funding: </strong>This work is supported by the NSF (grant nos. 1637446, 1915445, 1915355, and 1935548), the Army Research Office (grant no. GR10005043), the Packard Foundation, and by an Early Career Faculty grant from NASA&rsquo;s Space Technology Research Grants Program. The work of Nicholas D. Naclerio is supported by a NASA Space Technology Research Fellowship. <strong>Competing interests:</strong> Nicholas D. Naclerio and Elliot W. Hawkes are authors of international patent application WO2020060858A1, related to this work. All other authors declare that they have no competing interests. 
https://doi.org/10.1126/scirobotics.abe2922</em></p>]]></body>  <author>jhunt7</author>  <status>1</status>  <created>1623954233</created>  <gmt_created>2021-06-17 18:23:53</gmt_created>  <changed>1624561199</changed>  <gmt_changed>2021-06-24 18:59:59</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Physicists at Georgia Tech and engineers at UC Santa Barbara are exploring the shallow underground world with a burrowing soft robot]]></teaser>  <type>news</type>  <sentence><![CDATA[Physicists at Georgia Tech and engineers at UC Santa Barbara are exploring the shallow underground world with a burrowing soft robot]]></sentence>  <summary><![CDATA[<p>We&rsquo;ve seen robots take to the air, dive beneath the waves, and perform all sorts of maneuvers on land. Now, physicists at Georgia Tech and engineers at UC Santa Barbara are exploring the shallow underground world with a fast, steerable, burrowing soft robot.</p>]]></summary>  <dateline>2021-06-16T00:00:00-04:00</dateline>  <iso_dateline>2021-06-16T00:00:00-04:00</iso_dateline>  <gmt_dateline>2021-06-16 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Physicists at Georgia Tech and engineers at UC Santa Barbara are exploring the shallow underground world with a burrowing soft robot]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jess@cos.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Sonia Fernandez&nbsp; </strong><br />Senior Writer, Science and Engineering&nbsp;<br />Public Affairs and Communications<br />UC Santa Barbara<br />(805) 893-4765<br />sonia.fernandez@ucsb.edu</p><p><strong>Jess Hunt-Ralston</strong><br />Director of Communications<br />College of Sciences<br />Georgia Institute of Technology<br />(404) 385-5207<br />jess@cos.gatech.edu</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>648193</item>          <item>648195</item>          <item>648194</item>      </media>  <hg_media>          <item>          <nid>648193</nid>          <type>image</type>          <title><![CDATA[Researchers have developed a fast, steerable, burrowing soft robot (Photo: UC Santa Barbara)]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[DSC00508 2.JPG]]></image_name>            <image_path><![CDATA[/sites/default/files/images/DSC00508%202.JPG]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/DSC00508%202.JPG]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/DSC00508%25202.JPG?itok=DRigs_69]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1623954918</created>          <gmt_created>2021-06-17 18:35:18</gmt_created>          <changed>1623954918</changed>          <gmt_changed>2021-06-17 18:35:18</gmt_changed>      </item>          <item>          <nid>648195</nid>          <type>image</type>          <title><![CDATA[A small, exploratory, soft robot such as this has a variety of applications where shallow burrowing through dry granular media is needed, such as soil sampling, underground installation of utilities and erosion control (Photo: UC Santa Barbara)]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[IMG_1370.jpeg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/IMG_1370.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/IMG_1370.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/IMG_1370.jpeg?itok=QbOdF7LZ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1623955405</created>          <gmt_created>2021-06-17 18:43:25</gmt_created>          <changed>1623955405</changed>          <gmt_changed>2021-06-17 18:43:25</gmt_changed>      </item>          <item>          <nid>648194</nid>          <type>image</type>          <title><![CDATA[Science Robotics, June 2021 Online Cover: Groundbreaking Soft Robot (Credit: Sicheng Wang)]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Science Robotics cover June 2021.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Science%20Robotics%20cover%20June%202021.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Science%20Robotics%20cover%20June%202021.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Science%2520Robotics%2520cover%2520June%25202021.jpg?itok=YmRlqvpY]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1623955296</created>          <gmt_created>2021-06-17 18:41:36</gmt_created>          <changed>1623955296</changed>          <gmt_changed>2021-06-17 18:41:36</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[https://www.news.ucsb.edu/2021/020333/subterranean-investigations]]></url>        <title><![CDATA[UCSB The Current: Subterranean Investigations]]></title>      </link>          <link>        <url><![CDATA[https://crablab.gatech.edu/]]></url>        <title><![CDATA[Daniel Goldman: CRAB Lab]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="126011"><![CDATA[School of Physics]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="47881"><![CDATA[Dan Goldman]]></keyword>          <keyword tid="12040"><![CDATA[Daniel Goldman]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="186871"><![CDATA[soft robotics]]></keyword>          <keyword tid="188095"><![CDATA[burrowing robot]]></keyword>          <keyword tid="166937"><![CDATA[School of Physics]]></keyword>          <keyword tid="7688"><![CDATA[biomimicry]]></keyword>          <keyword tid="188096"><![CDATA[UC Santa 
Barbara]]></keyword>          <keyword tid="188097"><![CDATA[Elliot Hawkes]]></keyword>          <keyword tid="188098"><![CDATA[Science Robotics]]></keyword>          <keyword tid="188099"><![CDATA[granular substrates]]></keyword>          <keyword tid="188100"><![CDATA[air-aided intrusions]]></keyword>          <keyword tid="188101"><![CDATA[Andras Karsai]]></keyword>          <keyword tid="188102"><![CDATA[soil sampling]]></keyword>          <keyword tid="188103"><![CDATA[underground installation of utilities]]></keyword>          <keyword tid="188104"><![CDATA[erosion control]]></keyword>          <keyword tid="188105"><![CDATA[burrowing for the moon]]></keyword>          <keyword tid="408"><![CDATA[NASA]]></keyword>          <keyword tid="188106"><![CDATA[ARO]]></keyword>          <keyword tid="171847"><![CDATA[Army Research Office]]></keyword>          <keyword tid="363"><![CDATA[NSF]]></keyword>          <keyword tid="187423"><![CDATA[go-bio]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="648161">  <title><![CDATA[If I Had a Hammer: A Simple Tool to Enable Remote Neurological Examinations]]></title>  <uid>27863</uid>  <body><![CDATA[<p>In the early weeks of the COVID-19 pandemic, clinics and patients alike began cancelling all non-urgent appointments and procedures in order to slow the spread of the coronavirus. A boom in telemedicine was born of necessity as healthcare workers, administrators, and scientists creatively advanced technologies to fill a void in care.</p><p>During this time, Georgia Institute of Technology professor Jun Ueda and Ph.D. student Waiman Meinhold, along with their collaborators at NITI-ON Co. and Tohoku University in Japan, began to explore how they might contribute. By employing their previously engineered &ldquo;smart&rdquo; tendon hammer and developing a mobile app to accompany it, Meinhold, Ueda, and their collaborators devised a system that enables the deep tendon reflex exam to be performed remotely, filling a gap in neurological healthcare delivery.</p><p>The deep tendon reflex exam is both a basic and crucial part of neurological assessment and is often the first step in identifying neurological illnesses. The traditional exam consists of two main parts. First, using a silicone hammer, a physician taps on a patient&rsquo;s tendon to trigger a reflex response. Next, the physician grades the reflex on a numerical scale. To characterize the reflex, a trained physician relies primarily on previous experience, visual cues, and the &ldquo;feel&rdquo; of the hammer rebounding in their hand. Until now, the physical act of reflex elicitation has been completely out of reach for telemedicine. Hitting the correct spot on the tendon is essential to eliciting a proper reflex response.</p><p>According to Meinhold and Ueda&rsquo;s research, a patient&rsquo;s caretaker or family member may be able to easily step in to assist with this critical component of the neurological exam. They will simply need to obtain the smart tendon hammer and download the accompanying mobile application for data analysis.</p><p>To make this advance possible, Meinhold and Ueda modified a standard commercially available reflex hammer by furnishing it with a small wireless Inertial Measurement Unit (IMU) capable of measuring and streaming the hammer&rsquo;s acceleration data. In the course of their research, Meinhold and Ueda proved that by taking the hammer&rsquo;s acceleration measurements from on-tendon and off-tendon locations and running them through a classification algorithm, they can reliably distinguish whether or not the hammer has hit the correct spot.</p><p>How would this remote exam work, exactly? Equipped with the smart hammer, the lay person uses the app to select which tendon they will test (bicep, Achilles, patellar, etc.), which calls up the pre-programmed &ldquo;classifier&rdquo; for that particular tendon. These &ldquo;classifiers&rdquo; are basic forms of artificial intelligence that use aggregated acceleration data collected from experiments to categorize each tap into one of two categories: correct or incorrect. The lay person then uses the smart tendon hammer to administer a tap on the patient&rsquo;s tendon. As contact is made, the hammer streams acceleration data via Bluetooth to the app, which interprets the data and gives instant feedback to the user about whether they have tapped the correct location. In addition, colored LEDs on the hammer indicate a tap&rsquo;s success, with a green light for a correct tap and a red light for an incorrect one. The user is prompted to keep tapping until they log several correct taps.</p>
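<p>To make the classification step concrete, here is a minimal sketch of how such a per-tendon tap classifier might be trained and queried. The feature choices and the use of scikit-learn are illustrative assumptions, not the published system:</p><pre>
# Illustrative sketch only: classify on-tendon vs. off-tendon taps from
# the hammer's streamed IMU acceleration data. Features, library choice,
# and function names are assumptions, not the authors' implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression

def tap_features(accel):
    """Summarize one tap's acceleration trace (N samples x 3 axes)."""
    mag = np.linalg.norm(accel, axis=1)      # overall acceleration magnitude
    return np.array([mag.max(),              # peak impact
                     mag.mean(),             # average level
                     mag.std(),              # rebound variability
                     mag.sum()])             # rough impulse

def train_tendon_classifier(taps, labels):
    """taps: recorded traces for one tendon; labels: 1 = on-tendon, 0 = off."""
    X = np.array([tap_features(t) for t in taps])
    return LogisticRegression().fit(X, np.asarray(labels))

def judge_tap(model, accel):
    """True lights the green LED (correct tap); False the red (try again)."""
    return bool(model.predict(tap_features(accel).reshape(1, -1))[0])
</pre>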
<p>Crucially, Meinhold and Ueda showed that lay people can adequately perform tendon tapping. Their research appeared in the peer-reviewed journal <em>Frontiers in Robotics and AI</em> on March 16, 2021. There, moving their smart hammer closer to clinical implementation, Meinhold and Ueda directly compared the manual tapping variability between a novice and a trained clinician. The results were reassuring. The team found that while novices had more variability in their tapping than clinicians, their skill level was adequate: they reliably elicited tendon reflexes. Their research demonstrates that a tool is within reach to allow for remote implementation of the deep tendon reflex exam.</p><p>But could lay users also aid in grading reflexes? The work by Meinhold and Ueda suggests that non-experts may be able to help. To investigate this, they tested a simple training scheme. They provided participants and physicians with a training video on how to grade reflexes, and then assigned unlabeled videos for them to score. They found that while novices were able to grade reflexes with relatively low error rates, expert physicians outperformed them. Physicians excelled at grading from video, making no errors. To access this expert grading, Meinhold and Ueda envision that through the app, lay users could upload videos of the tendon tapping and reflex response. Physicians could then easily grade the patient&rsquo;s reflexes from their office.</p><p>By revolutionizing a traditional neurological assessment procedure, the smart hammer system developed at Georgia Tech is poised to kick-start a new wave in telemedicine.</p><p><em><strong>Text - Catherine Barzler<br />Images &ndash; Christa Ernst</strong></em></p><p><a href="https://www.frontiersin.org/articles/10.3389/frobt.2021.618656/full">A Smart Tendon Hammer System for Remote Neurological Examination</a><br />W. Meinhold, Y. Yamakawa, H. Honda, T. Mori, S. Izumi and Jun Ueda<br />Frontiers in Robotics and AI, #8, 2021<br />DOI=10.3389/frobt.2021.618656</p>]]></body>  <author>Christa Ernst</author>  <status>1</status>  <created>1623860265</created>  <gmt_created>2021-06-16 16:17:45</gmt_created>  <changed>1624279079</changed>  <gmt_changed>2021-06-21 12:37:59</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[By employing their previously engineered “smart” tendon hammer and developing a mobile app to accompany it, Meinhold, Ueda, and their collaborators devised a system that enables the deep tendon reflex exam to be performed remotely...]]></teaser>  <type>news</type>  <sentence><![CDATA[By employing their previously engineered “smart” tendon hammer and developing a mobile app to accompany it, Meinhold, Ueda, and their collaborators devised a system that enables the deep tendon reflex exam to be performed remotely...]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2021-06-16T00:00:00-04:00</dateline>  <iso_dateline>2021-06-16T00:00:00-04:00</iso_dateline>  <gmt_dateline>2021-06-16 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[A Smart Tendon Hammer System for Remote Neurological Examination]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[christa.ernst@research.gatech.edu]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>648159</item>          <item>648160</item>      </media>  <hg_media>          <item>          <nid>648159</nid>          <type>image</type>          <title><![CDATA[Smart Tendon Hammer]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Tendon Hammer for News Item 1280x720.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Tendon%20Hammer%20for%20News%20Item%201280x720.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Tendon%20Hammer%20for%20News%20Item%201280x720.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Tendon%2520Hammer%2520for%2520News%2520Item%25201280x720.png?itok=CZTnRIVF]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[A Smart Tendon Hammer System for Remote Neurological Examination]]></image_alt>                    <created>1623859367</created>          <gmt_created>2021-06-16 16:02:47</gmt_created>          <changed>1635275774</changed>          <gmt_changed>2021-10-26 19:16:14</gmt_changed>      </item>          <item>          <nid>648160</nid>          <type>image</type>          <title><![CDATA[Jun Ueda Smart Hammer]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Jun Ueda George W. 
Woodruff School of Mechanical Engineering  IEN IRIM 6-15-21 Headshot CME.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Jun%20Ueda%20George%20W.%20Woodruff%20School%20of%20Mechanical%20Engineering%20%20IEN%20IRIM%206-15-21%20Headshot%20CME.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Jun%20Ueda%20George%20W.%20Woodruff%20School%20of%20Mechanical%20Engineering%20%20IEN%20IRIM%206-15-21%20Headshot%20CME.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Jun%2520Ueda%2520George%2520W.%2520Woodruff%2520School%2520of%2520Mechanical%2520Engineering%2520%2520IEN%2520IRIM%25206-15-21%2520Headshot%2520CME.png?itok=Dv9LW97M]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Jun Ueda, George W. Woodruff School of Mechanical Engineering Professor]]></image_alt>                    <created>1623859676</created>          <gmt_created>2021-06-16 16:07:56</gmt_created>          <changed>1635275612</changed>          <gmt_changed>2021-10-26 19:13:32</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="198081"><![CDATA[Georgia Electronic Design Center (GEDC)]]></group>          <group id="217141"><![CDATA[Georgia Tech Materials Institute]]></group>          <group id="197261"><![CDATA[Institute for Electronics and Nanotechnology]]></group>          <group id="142761"><![CDATA[IRIM]]></group>          <group id="1271"><![CDATA[NanoTECH]]></group>          <group id="213771"><![CDATA[The Center for MEMS and Microsystems Technologies]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="149"><![CDATA[Nanotechnology and Nanoscience]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="149"><![CDATA[Nanotechnology and Nanoscience]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="188086"><![CDATA[remote diagnostics]]></keyword>          <keyword tid="188087"><![CDATA[go-irim]]></keyword>          <keyword tid="166968"><![CDATA[the Institute for Electronics and Nanotechnology]]></keyword>          <keyword tid="13887"><![CDATA[Jun Ueda]]></keyword>          <keyword tid="541"><![CDATA[Mechanical Engineering]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="187915"><![CDATA[go-researchnews]]></keyword>      </keywords>  <core_research_areas>          <term tid="39451"><![CDATA[Electronics and Nanotechnology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  
<userdata><![CDATA[]]></userdata></node><node id="645616">  <title><![CDATA[Control System Helps Several Drones Team Up to Deliver Heavy Packages ]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Many parcel delivery drones of the future are expected to handle packages weighing five pounds or less, a restriction that would allow small, standardized UAVs to handle a large percentage of the deliveries now done by ground vehicles. But will that relegate heavier packages to slower delivery by conventional trucks and vans?</p><p>A research team at the Georgia Institute of Technology has developed a modular solution for handling larger packages without the need for a complex fleet of drones of varying sizes. By allowing teams of small drones to collaboratively lift objects using an adaptive control algorithm, the strategy could allow a wide range of packages to be delivered using a combination of several standard-sized vehicles.</p><p>Beyond simplifying the drone fleet, the work could provide more robust drone operations and reduce the noise and safety concerns involved in operating large autonomous UAVs in populated areas. In addition to commercial package delivery, the system might also be used by the military to resupply small groups of soldiers in the field.</p><p>&ldquo;A delivery truck could carry a dozen drones in the back, and depending on how heavy a particular package is, it might use as many as six drones to carry the package,&rdquo; said <a href="https://aerospace.gatech.edu/people/jonathan-rogers">Jonathan Rogers</a>, the Lockheed Martin Associate Professor of Avionics Integration in Georgia Tech&rsquo;s<a href="https://aerospace.gatech.edu/"> Daniel Guggenheim School of Aerospace Engineering</a>. &ldquo;That would allow flexibility in the weight of the packages that could be delivered and eliminate the need to build and maintain several different sizes of delivery drones.&rdquo;</p><p>The research was supported, in part, by a National Science Foundation graduate student fellowship and by the Hives independent research and development program of the Georgia Tech Research Institute. A paper on the research has been submitted to the <em>Journal of Aircraft</em>.</p><p>A centralized computer system developed by graduate student Kevin Webb would monitor each of the drones lifting a package, sharing information about their location and the thrust being provided by their motors. The control system would coordinate the issuance of commands for navigation and delivery of the package.</p><p>&ldquo;The idea is to make multi-UAV cooperative flight easy from the user perspective,&rdquo; Rogers said. &ldquo;We take care of the difficult issues using the onboard intelligence, rather than expecting a human to precisely measure the package weight, center of gravity, and drone relative positions. We want to make this easy enough so that a package delivery driver could operate the system consistently.&rdquo;</p><p>The challenges of controlling a group of robots connected together to lift a package is more complex in many ways than controlling a swarm of robots that fly independently.</p><p>&ldquo;Most swarm work involves vehicles that are not connected, but flying in formations,&rdquo; Rogers said. &ldquo;In that case, the individual dynamics of a specific vehicle are not constrained by what the other vehicles are doing. 
For us, the challenge is that the vehicles are being pulled in different directions by what the other vehicles connected to the package are doing.&rdquo;&nbsp;</p><p>The team of drones would autonomously connect to a docking structure attached to a package, using an infrared guidance system that eliminates the need for humans to attach the vehicles. That could come in handy for drones sent to retrieve packages that a customer is returning. By knowing how much thrust they are producing and the altitude they are maintaining, the drone teams could even estimate the weight of the package they&rsquo;re picking up.</p><p>Webb and Rogers have built a demonstration in which four small quadrotor drones work together to lift a box that&rsquo;s 2 feet by 2 feet by 2 feet and weighs 12 pounds. The control algorithm isn&rsquo;t limited to four vehicles and could manage &ldquo;as many vehicles as you could put around the package,&rdquo; Rogers said.</p><p>For the military, the modular cargo system could allow squads of soldiers at remote locations to be resupplied without the cost or risk of operating a large autonomous helicopter. A military UAV package retrieval team could be made up of individual vehicles carried by each soldier.</p><p>&ldquo;That would distribute a big lifting capability in smaller packages, which equates to small drones that could be used to team up,&rdquo; Rogers said. &ldquo;Putting small drones together would allow them to do bigger things than they could do individually.&rdquo;</p><p>Bringing multiple vehicles together creates a more difficult control challenge, but Rogers argues the benefits are worth the complexity. &ldquo;The idea of having multiple machines working together provides better scalability than building a larger device every time you have a larger task,&rdquo; he said. &ldquo;We think this is the right way to fill that gap.&rdquo;</p><p>Using multiple drones to carry a heavy package could also allow more redundancy in the delivery system. Should one of the drones fail, the others should be able to pick up the load &ndash; an issue managed by the central control system. That part of the control strategy hasn&rsquo;t yet been tested, but it is part of Rogers&rsquo; plan for future development of the system.</p><p>More research is also needed on the docking system that connects the drones to packages. The structures will have to be made strong and rigid enough to connect to and lift the packages, while being inexpensive enough to be disposable.</p><p>&ldquo;I think the major technologies are already here, and given an adequate investment, a system could be fielded within five years to deliver packages with multiple drones,&rdquo; Rogers said. 
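<p>That weight estimate falls out of a simple force balance: at a steady hover, the team&rsquo;s total thrust supports the combined weight of the drones and the package. A minimal sketch under that assumption (the masses and thrust values below are invented for illustration; this is not the team&rsquo;s algorithm):</p><pre>
# Package mass from thrust at steady hover: total_thrust equals
# (package mass + sum of drone masses) * g, so the package mass
# falls out by subtraction. All numbers here are illustrative only.
G = 9.81  # m/s^2

def estimate_package_mass(thrusts_n, drone_masses_kg):
    """thrusts_n: per-drone thrust (newtons) while holding altitude."""
    carried_mass = sum(thrusts_n) / G           # everything being lifted
    return carried_mass - sum(drone_masses_kg)  # subtract the vehicles

# Four 1.5 kg quadrotors, each producing about 28.1 N at hover:
print(estimate_package_mass([28.1] * 4, [1.5] * 4))  # ~5.5 kg payload
</pre>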
<p>Webb and Rogers have built a demonstration in which four small quadrotor drones work together to lift a box that&rsquo;s 2 feet by 2 feet by 2 feet and weighs 12 pounds. The control algorithm isn&rsquo;t limited to four vehicles and could manage &ldquo;as many vehicles as you could put around the package,&rdquo; Rogers said.</p><p>For the military, the modular cargo system could allow squads of soldiers at remote locations to be resupplied without the cost or risk of operating a large autonomous helicopter. A military UAV package retrieval team could be made up of individual vehicles carried by each soldier.</p><p>&ldquo;That would distribute a big lifting capability in smaller packages, which equates to small drones that could be used to team up,&rdquo; Rogers said. &ldquo;Putting small drones together would allow them to do bigger things than they could do individually.&rdquo;</p><p>Bringing multiple vehicles together creates a more difficult control challenge, but Rogers argues the benefits are worth the complexity. &ldquo;The idea of having multiple machines working together provides better scalability than building a larger device every time you have a larger task,&rdquo; he said. &ldquo;We think this is the right way to fill that gap.&rdquo;</p><p>Using multiple drones to carry a heavy package could also allow more redundancy in the delivery system. Should one of the drones fail, the others should be able to pick up the load &ndash; an issue managed by the central control system. That part of the control strategy hasn&rsquo;t yet been tested, but it is part of Rogers&rsquo; plan for future development of the system.</p><p>More research is also needed on the docking system that connects the drones to packages. The structures will have to be made strong and rigid enough to connect to and lift the packages, while being inexpensive enough to be disposable.</p><p>&ldquo;I think the major technologies are already here, and given an adequate investment, a system could be fielded within five years to deliver packages with multiple drones,&rdquo; Rogers said. &ldquo;It&rsquo;s not a technical challenge as much as it is a regulatory issue and a question of societal acceptance.&rdquo;</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contacts</strong>: John Toon (404-894-6986) (jtoon@gatech.edu) or Anne Wainscott-Sargent (404-435-5784) (asargent7@gatech.edu).</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1616434842</created>  <gmt_created>2021-03-22 17:40:42</gmt_created>  <changed>1616434922</changed>  <gmt_changed>2021-03-22 17:42:02</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have developed a control system that will enable teams of drones to carry heavy packages.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have developed a control system that will enable teams of drones to carry heavy packages.]]></sentence>  <summary><![CDATA[<p>A research team at the Georgia Institute of Technology has developed a modular solution for drone delivery of larger packages without the need for a complex fleet of drones of varying sizes. By allowing teams of small drones to collaboratively lift objects using an adaptive control algorithm, the strategy could allow a wide range of packages to be delivered using a combination of several standard-sized vehicles.</p>]]></summary>  <dateline>2021-03-22T00:00:00-04:00</dateline>  <iso_dateline>2021-03-22T00:00:00-04:00</iso_dateline>  <gmt_dateline>2021-03-22 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>645610</item>          <item>645611</item>          <item>645612</item>          <item>645613</item>      </media>  <hg_media>          <item>          <nid>645610</nid>          <type>image</type>          <title><![CDATA[Four drones team up to lift a package]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[drones3.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/drones3.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/drones3.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/drones3.jpg?itok=vFrUGP8b]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Four drones attached to a package]]></image_alt>                    <created>1616433879</created>          <gmt_created>2021-03-22 17:24:39</gmt_created>          <changed>1616433879</changed>          <gmt_changed>2021-03-22 17:24:39</gmt_changed>      </item>          <item>          <nid>645611</nid>          <type>image</type>          <title><![CDATA[Drones collaborate to lift package]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[drone-flying.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/drone-flying.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/drone-flying.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/drone-flying.jpg?itok=dywFy2Ly]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Four drones lift a 12-pound package]]></image_alt>                    <created>1616433982</created>          <gmt_created>2021-03-22 17:26:22</gmt_created>          <changed>1616433982</changed>          <gmt_changed>2021-03-22 17:26:22</gmt_changed>      </item>          <item>          <nid>645612</nid>          <type>image</type>          <title><![CDATA[Adjusting drone control system]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[drones2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/drones2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/drones2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/drones2.jpg?itok=fbzAz2YQ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Researcher adjusting control system]]></image_alt>                    <created>1616434064</created>          <gmt_created>2021-03-22 17:27:44</gmt_created>          <changed>1616434064</changed>          <gmt_changed>2021-03-22 17:27:44</gmt_changed>      </item>          <item>          <nid>645613</nid>          <type>image</type>          <title><![CDATA[Monitoring the algorithm controlling the drones]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[drones4.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/drones4.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/drones4.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/drones4.jpg?itok=Mrqfnho2]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Monitoring the control system]]></image_alt>                    <created>1616434165</created>          <gmt_created>2021-03-22 17:29:25</gmt_created>          <changed>1616434165</changed>          <gmt_changed>2021-03-22 17:29:25</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="1500"><![CDATA[UAV]]></keyword>          <keyword tid="187353"><![CDATA[drone]]></keyword>          <keyword tid="172051"><![CDATA[control system]]></keyword>          <keyword tid="187354"><![CDATA[parcel delivery]]></keyword>          <keyword tid="187355"><![CDATA[package delivery]]></keyword>          <keyword tid="7264"><![CDATA[autonomous]]></keyword>      </keywords>  
<core_research_areas>          <term tid="39481"><![CDATA[National Security]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="645131">  <title><![CDATA[Georgia Tech Receives $2.2M in Toyota Research Institute Robotics Funding]]></title>  <uid>35692</uid>  <body><![CDATA[<p>The Georgia Institute of Technology is one of 16 academic institutions selected for <a href="https://www.tri.global/">Toyota Research Institute&rsquo;s (TRI)</a> collaborative research program.</p><p>Founded in 2015 and now in its second wave of investment with top universities, TRI will invest more than $75 million over the next five years. The university partners will focus on breakthroughs around tough technological challenges in key research priority areas of automated driving, robotics, and machine-assisted cognition.</p><p>&ldquo;Georgia Tech is honored to work closely with TRI to advance robotics in key fields. It&rsquo;s an exciting start to what we hope will be a longer-term collaboration,&rdquo; said <a href="https://www.cc.gatech.edu/~seth/">Seth Hutchinson</a>, executive director of Georgia Tech&rsquo;s <a href="http://robotics.gatech.edu/">Institute for Robotics and Intelligent Machines</a> and professor and KUKA Chair for Robotics in the <a href="https://ic.gatech.edu/">School of Interactive Computing</a>.</p><p>&ldquo;This new phase of university research is about pushing even further and doing so with a broader, more diverse set of stakeholders. To get to the best ideas, collaboration is critical. And we sought out universities like Georgia Tech that share our vision of using AI for human amplification and societal good. The funded projects will contribute to two TRI focus areas: automated driving and home robotics,&rdquo; said Eric Krotkov, TRI chief science officer.</p><p>The two Georgia Tech projects total $2.2M over the next three years. Under the agreement, each team will be paired with TRI researchers, who will serve as co-investigators.</p><p><strong>An Outdoor MiniCity to Test Autonomous Driving</strong><br />The first research project aims to make it easier for universities to test autonomous vehicles, building on Georgia Tech&rsquo;s <a href="https://autorally.github.io/media/">AutoRally</a> platform. Georgia Tech researchers use this small-scale autonomous dirt track to test aggressive driving. The car can handle turns and account for on-course obstacles at speeds approaching 20 miles per hour. The software and simulation environment could help make future self-driving cars safer under similar hazardous road conditions. Georgia Tech researchers will build on this platform to create a scale-model MiniCity environment for developing and testing autonomy algorithms.</p><p>Current autonomous vehicle testing is done by industry using full-size vehicles on city streets &ndash; an expensive proposition not viable for the broader academic research community.<br />&ldquo;There&rsquo;s a barrier to entry for the science in the field,&rdquo; said principal investigator <a href="https://rehg.org/contact/">James Rehg</a>, a professor in the <a href="https://www.ic.gatech.edu/">School of Interactive Computing</a>. 
&ldquo;Our platform uses a one-fifth scale vehicle, freeing us to do research at lower cost and without taking any risks &ndash; we can crash our car and it&rsquo;s inexpensive to repair and nobody gets hurt.&rdquo;</p><p>The autonomous cars will navigate the MiniCity and avoid hazards while obeying speed and traffic rules. Sensors will enable the cars to detect obstacles and make decisions on how fast to drive or how to steer. &ldquo;We are addressing the issue of reproducibility of autonomous driving in a test environment,&rdquo; Rehg said.</p><p>Massachusetts Institute of Technology (MIT), one of TRI&rsquo;s three original funded universities, is leading the research project. MIT operates an indoor autonomous driving track that simulates paved city streets. With Georgia Tech&rsquo;s outdoor track, researchers can then see how autonomous cars perform over gravel, dirt, and other more realistic driving conditions.</p><p>According to Rehg, autonomy testing presents unique challenges. &ldquo;There&rsquo;s a reason you get a driver&rsquo;s test &mdash; you have to understand the variety of situations that can arise in driving and the rules, and you must understand how the context can change and make all the right decisions for safety.&rdquo;</p><p>With the MiniCity, Rehg and fellow investigator Evangelos Theodorou, an associate professor in the Daniel Guggenheim School of Aerospace Engineering, hope to develop a standardized testbed and protocol for testing and then invite academic teams to compete and measure the driving performance of their vehicles.</p><p><strong>Human-assist Robots to Help People Age in Place</strong><br />Georgia Tech&rsquo;s other TRI research project involves robotics that can assist older adults. It reflects Toyota and TRI&rsquo;s priority to help older adults age in place.</p><p>&ldquo;It&rsquo;s really a powerful thing to have independence and be able to do things for yourself,&rdquo; said the project&rsquo;s principal investigator, <a href="https://petitinstitute.gatech.edu/charles-kemp">Charlie Kemp</a>, associate professor in the <a href="https://www.bme.gatech.edu/">Wallace H. Coulter Department of Biomedical Engineering</a> and adjunct associate professor in the <a href="https://ic.gatech.edu/">School of Interactive Computing</a>. Kemp is also a co-founder and the chief technology officer of Hello Robot Inc., a company that has commercialized robotic assistance technologies initially developed in his lab.</p><p>Looking at the aging issue, Kemp and co-PI Hutchinson will examine how to take advantage of complementary characteristics that can lead to better physical collaboration between an individual and a robot.</p><p>&ldquo;We are asking, &lsquo;How can an individual and a particular robot best work together?&rsquo; &lsquo;How do we individualize the robot to the person to give them a better quality of life?&rsquo;&rdquo; Kemp said.</p><p>They plan to take a modeling approach, initially using physics simulations and later conducting studies with young able-bodied participants, healthy older adults, and older adults with impairments.</p><p>The researchers will use sensing technology &ndash; including pressure sensors on beds that pinpoint a person&rsquo;s body position and movement, as well as capacitive sensors that help the robot better perceive a person&rsquo;s body position up close. Such information can help with activities like dressing.</p>
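<p>As a concrete illustration of the bed-sensing idea, the simplest way a pressure mat can pinpoint a body is the weighted centroid, or center of pressure, of its readings. A minimal sketch (the grid size and readings are invented; this is not the project&rsquo;s code):</p><pre>
# Center of pressure on a bed-mounted sensor grid. The mat dimensions and
# the fake readings are illustrative assumptions, not the project hardware.
import numpy as np

def center_of_pressure(grid):
    """grid: 2D array of pressure readings; returns (row, col) centroid."""
    total = grid.sum()
    if total == 0:
        return None                    # nobody on the mat
    rows, cols = np.indices(grid.shape)
    return (float((rows * grid).sum() / total),
            float((cols * grid).sum() / total))

mat = np.zeros((8, 4))
mat[2:5, 1:3] = [[1, 2], [3, 4], [2, 1]]   # fake torso-shaped load
print(center_of_pressure(mat))             # approximately (3.0, 1.5)
</pre>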
<p>&ldquo;It&rsquo;s a very intimate interaction between the robot and the human,&rdquo; Hutchinson said.</p><p>Both investigators share TRI&rsquo;s view that robotics that can assist older adults with daily living could make a major impact on the well-being of an increasingly graying population. In fact, during the next three decades, the global population over the age of 65 is projected to more than double. Japan, headquarters for Toyota, has the highest proportion of older citizens of any country in the world, with one in four people over 65.</p><p>&ldquo;We have talked about robots helping older adults for decades and we&rsquo;re still not there,&rdquo; said Kemp. &ldquo;There&rsquo;s a real opportunity to help people. As I get older, I&rsquo;d love for this technology to be there for me and for my loved ones. While we still have a long way to go, the research can get us closer,&rdquo; he added.</p><p>Hutchinson acknowledged that it will take time before people see robotic assistive technologies in hospitals or people&rsquo;s homes, but the potential is there.</p><p>&ldquo;What is most exciting about the TRI project is it has the potential to show up in people&rsquo;s homes because TRI is invested in getting it there. And that means our research could really make an impact on a broad scale instead of only touching research journals or elite practitioners in the field,&rdquo; he said.</p><p><strong>Robotics Success Takes a Village</strong><br />The investigators agree that Georgia Tech&rsquo;s multidisciplinary focus within robotics is a strength that will serve them well in their work with TRI, and especially in the future when autonomy goes mainstream.</p><p>&ldquo;If you think about what it&rsquo;s going to take for autonomous vehicles to really exist in the world on a large scale and deliver passengers in high volumes, it&rsquo;s going to require all those things &ndash; engineering, science policy, law, and ethics &ndash; all those disciplines coming together,&rdquo; said Rehg.<br />Kemp agreed, noting that since founding his <a href="https://sites.gatech.edu/hrl/">Healthcare Robotics Lab</a> in 2007, he&rsquo;s attracted students from across engineering disciplines &mdash; from mechanical and computing to electrical, aerospace, and biomedical.<br />&ldquo;It&rsquo;s definitely something that&rsquo;s distinctive about Georgia Tech &mdash; it&rsquo;s a real strength,&rdquo; he said.</p><p><em>Charlie Kemp owns equity in Hello Robot and is an inventor of Georgia Tech intellectual property (IP) licensed by Hello Robot. Consequently, he benefits from increases in the value of Hello Robot and receives royalties via Georgia Tech for sales made by Hello Robot. 
The terms of this arrangement have been reviewed and approved by Georgia Tech in accordance with its conflict-of-interest policies.</em></p>]]></body>  <author>Anne Sargent</author>  <status>1</status>  <created>1615239114</created>  <gmt_created>2021-03-08 21:31:54</gmt_created>  <changed>1615424878</changed>  <gmt_changed>2021-03-11 01:07:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers will collaborate with TRI on two research projects: the first to advance autonomous vehicle testing and the second,  to improve the way robots assist older adults with daily living tasks.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers will collaborate with TRI on two research projects: the first to advance autonomous vehicle testing and the second,  to improve the way robots assist older adults with daily living tasks.]]></sentence>  <summary><![CDATA[<p>Georgia Tech researchers will create an outdoor minicity to test autonomous driving in an urban area, while another team will focus on home robotics to help aging populations and robots better collaborate.</p>]]></summary>  <dateline>2021-03-08T00:00:00-05:00</dateline>  <iso_dateline>2021-03-08T00:00:00-05:00</iso_dateline>  <gmt_dateline>2021-03-08 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Research Projects to Advance Autonomous Driving Testbed, Human-Robot Collaboration]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[asargent7@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Anne Wainscott-Sargent</p><p>Research News</p><p>(404-435-5784)&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>645119</item>          <item>645222</item>      </media>  <hg_media>          <item>          <nid>645119</nid>          <type>image</type>          <title><![CDATA[AutoRally ]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[thumbnail_test-track.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/thumbnail_test-track.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/thumbnail_test-track.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/thumbnail_test-track.jpg?itok=iHqEmedQ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1615236952</created>          <gmt_created>2021-03-08 20:55:52</gmt_created>          <changed>1615240710</changed>          <gmt_changed>2021-03-08 21:58:30</gmt_changed>      </item>          <item>          <nid>645222</nid>          <type>image</type>          <title><![CDATA[Stretch with Professor Charlie Kemp]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[original_with_post_processing_20210310_1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/original_with_post_processing_20210310_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/original_with_post_processing_20210310_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/original_with_post_processing_20210310_1.jpg?itok=A66z4ZZp]]></image_740>            
<image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Professor Charlie Kemp with his robot, Stretch.]]></image_alt>                    <created>1615424723</created>          <gmt_created>2021-03-11 01:05:23</gmt_created>          <changed>1615424723</changed>          <gmt_changed>2021-03-11 01:05:23</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="187238"><![CDATA[Toyota Research Institute]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="174666"><![CDATA[autonomous driving]]></keyword>          <keyword tid="187244"><![CDATA[human-robot collaboration]]></keyword>          <keyword tid="15273"><![CDATA[aging in place]]></keyword>          <keyword tid="5525"><![CDATA[assistive technologies]]></keyword>          <keyword tid="14786"><![CDATA[James Rehg]]></keyword>          <keyword tid="169760"><![CDATA[Seth Hutchinson]]></keyword>          <keyword tid="79401"><![CDATA[Charles Kemp]]></keyword>          <keyword tid="126571"><![CDATA[go-PetitInstitute]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="39451"><![CDATA[Electronics and Nanotechnology]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39511"><![CDATA[Public Service, Leadership, and Policy]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="644073">  <title><![CDATA[Collective Worm and Robot “Blobs” Protect Individuals, Swarm Together]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Individually, California blackworms live an unremarkable life eating microorganisms in ponds and serving as tropical fish food for aquarium enthusiasts. But together, tens, hundreds, or thousands of the centimeter-long creatures can collaborate to form a &ldquo;worm blob,&rdquo; a shape-shifting living liquid that collectively protects its members from drying out and helps them escape threats such as excessive heat.</p><p>While other organisms form collective flocks, schools, or swarms for such purposes as mating, predation, and protection, the Lumbriculus variegatus worms are unusual in their ability to braid themselves together to accomplish tasks that unconnected individuals cannot. A new study reported by researchers at the Georgia Institute of Technology describes how the worms self-organize to act as entangled &ldquo;active matter,&rdquo; creating surprising collective behaviors whose principles have been applied to help blobs of simple robots evolve their own locomotion.</p><p>The research, supported by the National Science Foundation and the Army Research Office, was reported Feb. 5 in the journal <em>Proceedings of the National Academy of Sciences</em>. 
Findings from the work could help developers of swarm robots understand how emergent behavior of entangled active matter can produce unexpected, complex, and potentially useful mechanically driven behaviors.</p><p><strong>Collective Behavior in Worms</strong></p><p>The spark for the research came several years ago in California, where <a href="https://www.chbe.gatech.edu/people/saad-bhamla">Saad Bhamla</a> was intrigued by blobs of the worms he saw in a backyard pond.</p><p>&ldquo;We were curious about why these worms would form these living blobs,&rdquo; said Bhamla, an assistant professor in Georgia Tech&rsquo;s <a href="https://www.chbe.gatech.edu/">School of Chemical and Biomolecular Engineering</a>. &ldquo;We have now shown through mathematical models and biological experiments that forming the blobs confers a kind of collective decision-making that enables worms in a larger blob to survive longer against desiccation. We also showed that they can move together, a collective behavior that&rsquo;s not done by any other organisms we know of at the macro scale.&rdquo;</p><p>Such collective behavior in living systems is of interest to researchers exploring ways to apply the principles of living systems to human-designed systems such as swarm robots, in which individuals must also work together to create complex behaviors.</p><p>&ldquo;The worm blob collective turns out to have capabilities that are more than what the individuals have, a wonderful example of biological emergence,&rdquo; said <a href="https://physics.gatech.edu/user/daniel-goldman">Daniel Goldman</a>, a Dunn Family Professor in Georgia Tech&rsquo;s <a href="https://physics.gatech.edu/">School of Physics</a>, who studies the physics of living systems.</p><p><strong>Why the Worms Form Blobs</strong></p><p>The worm blob system was studied extensively by Yasemin Ozkan-Aydin, a research associate in Goldman&rsquo;s lab. Using bundles of worms she originally ordered from a California aquarium supply company &ndash; and now raises in Georgia Tech labs &ndash; Ozkan-Aydin put the worms through several experiments. Those included development of a &ldquo;worm gymnasium&rdquo; that allowed her to measure the strength of individual worms, knowledge important to understanding how small numbers of the creatures can move an entire blob.</p><p>She started by taking the aquatic worms out of the water and watching their behavior. First, they individually began searching for water. When that search failed, they formed a ball-shaped blob in which individuals took turns on the outer surface exposed to the air where evaporation was taking place &ndash; behavior she theorized would reduce the effect of evaporation on the collective. By studying the blobs, she learned that worms in a blob could survive out of water 10 times longer than individual worms could.</p><p>&ldquo;They would certainly want to reduce desiccation, but the way in which they would do this is not obvious and points to a kind of collective intelligence in the system,&rdquo; said Goldman. &ldquo;They are not just surface-minimizing machines. They are looking to exploit good conditions and resources.&rdquo;</p><p><strong>Using Blobs to Escape Threats</strong></p><p>Ozkan-Aydin also studied how worm blobs responded to both temperature gradients and intense light. The worms need a specific range of temperatures to survive and dislike intense light. 
When a blob was placed on a heated plate, it slowly moved away from the hotter portion of the plate to the cooler portion; under intense light, the worms formed tightly entangled blobs. The worms appeared to divide responsibilities for the movement, with some individuals pulling the blob while others helped lift the aggregation to reduce friction.</p><p>As with evaporation, the collective activity improves the chances of survival for the entire group, which can range from 10 worms up to as many as 50,000.</p><p>&ldquo;For an individual worm going from hot to cold, survival depends on chance,&rdquo; said Bhamla. &ldquo;When they move as a blob, they move more slowly because they have to coordinate the mechanics. But if they move as a blob, 95% of them get to the cold side, so being part of the blob confers many survival advantages.&rdquo;</p><p><strong>A Worm Gymnasium</strong></p><p>The researchers noted that only two or three &ldquo;puller&rdquo; worms were needed to drag a 15-worm blob. That led them to wonder just how strong the creatures were, so Ozkan-Aydin created a series of poles and cantilevers in which she could measure the forces exerted by individual worms. This &ldquo;worm gymnasium&rdquo; allowed her to appreciate how the pullers managed to do their jobs.</p><p>&ldquo;When the worms are happy and cool, they stretch out and grab onto one of the poles with their heads and they pull onto it,&rdquo; Bhamla said. &ldquo;When they are pulling, you can see the deflection of the cantilever to which their tails were attached. Yasemin was able to use known weights to calibrate the forces the worms create. The force measurement shows the individual worms are packing a lot of power.&rdquo;</p><p>Some worms were stronger than others, and as the temperature increased, their willingness to work out at the gym declined.</p><p><strong>Applying Worm Principles to Robots</strong></p><p>Ozkan-Aydin also applied the principles observed in the worms to small robotic blobs composed of &ldquo;smart active particles,&rdquo; six 3D-printed robots with two arms and two sensors allowing them to sense light. She added a mesh enclosure and pins to the arms, which allowed these &ldquo;smarticles&rdquo; to be entangled like the worms, and tested a variety of gaits and movements that could be programmed into them.</p><p>&ldquo;Depending on the intensity, the robots try to move away from the light,&rdquo; Ozkan-Aydin said. &ldquo;They generate emergent behavior that is similar to what we saw in the worms.&rdquo;</p><p>She noted that there was no communication among the robots. &ldquo;Each robot is doing its own thing in a decentralized way,&rdquo; she said. &ldquo;Using just the mechanical interaction and the attraction each robot had for light intensity, we could control the robot blob.&rdquo;</p><p>By measuring the energy consumption of an individual robot when it performed different gaits (wiggle and crawl), she determined that the wiggle gait uses less power than the crawl gait. The researchers anticipate that by exploiting gait differentiation, future entangled robotic swarms could improve their energy efficiency.</p>
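<p>Because each robot responds only to its own light readings, the decentralized rule is simple enough to caricature in a few lines. A toy sketch (the Smarticle class, threshold, and gait names are invented for illustration, not the actual smarticle firmware):</p><pre>
# Toy version of the decentralized light-avoidance described above: each
# robot reads its own two photosensors and picks a gait locally, with no
# communication. Class, threshold, and gait names are illustrative only.
import random

LIGHT_THRESHOLD = 0.6  # normalized intensity above which a robot reacts

class Smarticle:
    def read_light_sensors(self):
        return random.random(), random.random()   # stand-in for hardware

    def act(self):
        left, right = self.read_light_sensors()
        if max(left, right) < LIGHT_THRESHOLD:
            return "wiggle"                        # lower-power default gait
        # flap harder on the brighter side, nudging the body away from it;
        # entanglement with neighbors turns these individual choices into
        # collective motion of the whole blob
        return "crawl-left" if right > left else "crawl-right"

blob = [Smarticle() for _ in range(6)]             # six robots, no messages
print([robot.act() for robot in blob])
</pre>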
<p><strong>Expanding What Robot Swarms Can Do</strong></p><p>The researchers hope to continue their study of the collective dynamics of the worm blobs and apply what they learn to swarm robots, which must work together with little communication to accomplish tasks that they could not do alone. But those systems must be able to work in the real world.</p><p>&ldquo;Often people want to make robot swarms do specific things, but they tend to be operating in pristine environments with simple situations,&rdquo; said Goldman. &ldquo;With these blobs, the whole point is that they work only because of physical interaction among the individuals. That&rsquo;s an interesting factor to bring into robotics.&rdquo;</p><p>Among the challenges ahead is recruiting graduate students willing to work with the worm blobs, which have the consistency of bread dough.</p><p>&ldquo;The worms are very nice to work with,&rdquo; said Ozkan-Aydin. &ldquo;We can play with them and they are very friendly. But it takes a person who is very comfortable working with living systems.&rdquo;</p><p>The project shows how the biological world can provide insights beneficial to the field of robotics, said Kathryn Dickson, program director of the Physiological Mechanisms and Biomechanics Program at the National Science Foundation.</p><p>&ldquo;This discovery shows that observations of animal behavior in natural settings, along with biological experiments and modeling, can offer new insights, and how new knowledge gained from interdisciplinary research can help humans, for example, in the robotic control applications arising from this work,&rdquo; she said.</p><p><em>This research was supported by the National Science Foundation (NSF) under grants CAREER 1941933 and 1817334 and the Army Research Office under grant W911NF-11-1-0514. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the sponsoring agencies.</em></p><p><strong>CITATION</strong>: Yasemin Ozkan-Aydin, Daniel I. Goldman, and M. Saad Bhamla, &ldquo;Collective dynamics in entangled worm and robot blobs&rdquo; (<em>Proceedings of the National Academy of Sciences</em>, 2021). <a href="https://doi.org/10.1073/pnas.2010542118">https://doi.org/10.1073/pnas.2010542118</a></p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (jtoon@gatech.edu)</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1612979447</created>  <gmt_created>2021-02-10 17:50:47</gmt_created>  <changed>1612979549</changed>  <gmt_changed>2021-02-10 17:52:29</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Research into "blobs" formed by worms and robots could help developers of swarm robots better utilize emergent behavior.]]></teaser>  <type>news</type>  <sentence><![CDATA[Research into "blobs" formed by worms and robots could help developers of swarm robots better utilize emergent behavior.]]></sentence>  <summary><![CDATA[<p>Individually, California blackworms live an unremarkable life eating microorganisms in ponds and serving as tropical fish food for aquarium enthusiasts. 
But together, tens, hundreds, or thousands of the centimeter-long creatures can collaborate to form a &ldquo;worm blob,&rdquo; a shape-shifting living liquid that collectively protects its members from drying out and helps them escape threats such as excessive heat.</p>]]></summary>  <dateline>2021-02-10T00:00:00-05:00</dateline>  <iso_dateline>2021-02-10T00:00:00-05:00</iso_dateline>  <gmt_dateline>2021-02-10 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>644063</item>          <item>644064</item>          <item>644067</item>          <item>644069</item>          <item>644066</item>          <item>644071</item>          <item>644070</item>      </media>  <hg_media>          <item>          <nid>644063</nid>          <type>image</type>          <title><![CDATA[Worm blobs create collective behavior]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[worm-blobs_3202.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/worm-blobs_3202.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/worm-blobs_3202.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/worm-blobs_3202.jpg?itok=olEiaef3]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Blobs of California blackworms in bottles]]></image_alt>                    <created>1612977380</created>          <gmt_created>2021-02-10 17:16:20</gmt_created>          <changed>1612977380</changed>          <gmt_changed>2021-02-10 17:16:20</gmt_changed>      </item>          <item>          <nid>644064</nid>          <type>image</type>          <title><![CDATA[Closeup of smart active particle (smarticle)]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[smarticle_2917.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/smarticle_2917.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/smarticle_2917.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/smarticle_2917.jpg?itok=SXUH-1jl]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Closeup of robotic smarticles]]></image_alt>                    <created>1612977504</created>          <gmt_created>2021-02-10 17:18:24</gmt_created>          <changed>1612977504</changed>          <gmt_changed>2021-02-10 17:18:24</gmt_changed>      </item>          <item>          <nid>644067</nid>          <type>image</type>          <title><![CDATA[Group of smart active particles (smarticles)]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[smarticle-blob_2976.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/smarticle-blob_2976.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/smarticle-blob_2976.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/smarticle-blob_2976.jpg?itok=gYKbEx1X]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Group of smart active particles (smarticles)]]></image_alt>                    <created>1612977805</created>          <gmt_created>2021-02-10 17:23:25</gmt_created>          <changed>1612977805</changed>          <gmt_changed>2021-02-10 17:23:25</gmt_changed>      </item>          <item>          <nid>644069</nid>          <type>image</type>          <title><![CDATA[Daniel Goldman and smarticle]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[smarticle-goldman_3146.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/smarticle-goldman_3146.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/smarticle-goldman_3146.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/smarticle-goldman_3146.jpg?itok=-hh56Fj1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Dan Goldman holds smart active particle robot]]></image_alt>                    <created>1612977934</created>          <gmt_created>2021-02-10 17:25:34</gmt_created>          <changed>1612977934</changed>          <gmt_changed>2021-02-10 17:25:34</gmt_changed>      </item>          <item>          <nid>644066</nid>          <type>image</type>          <title><![CDATA[Robot blob and worm blob]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[smarticle-worm-blob_2963.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/smarticle-worm-blob_2963.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/smarticle-worm-blob_2963.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/smarticle-worm-blob_2963.jpg?itok=fSguWfWL]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Robot blob and worm blob compared]]></image_alt>                    <created>1612977688</created>          <gmt_created>2021-02-10 17:21:28</gmt_created>          <changed>1612977688</changed>          <gmt_changed>2021-02-10 17:21:28</gmt_changed>      </item>          <item>          <nid>644071</nid>          <type>image</type>          <title><![CDATA[Smarticles interact to form a robot blob]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[worm-blobs_2906.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/worm-blobs_2906.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/worm-blobs_2906.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/worm-blobs_2906.jpg?itok=kn43XWHy]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Smarticles interact to form a robot blob]]></image_alt>                    <created>1612978286</created>          <gmt_created>2021-02-10 17:31:26</gmt_created>          <changed>1612978286</changed>          <gmt_changed>2021-02-10 17:31:26</gmt_changed>  
    </item>          <item>          <nid>644070</nid>          <type>image</type>          <title><![CDATA[Living liquid of worm blobs]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[worm-blob_2971.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/worm-blob_2971.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/worm-blob_2971.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/worm-blob_2971.jpg?itok=8Z_ExQVx]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Worm blob flows from a hand]]></image_alt>                    <created>1612978167</created>          <gmt_created>2021-02-10 17:29:27</gmt_created>          <changed>1612978167</changed>          <gmt_changed>2021-02-10 17:29:27</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="126011"><![CDATA[School of Physics]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="186986"><![CDATA[worm blob]]></keyword>          <keyword tid="182389"><![CDATA[smarticle]]></keyword>          <keyword tid="186987"><![CDATA[robot blob]]></keyword>          <keyword tid="181005"><![CDATA[collective behavior]]></keyword>          <keyword tid="175602"><![CDATA[living systems]]></keyword>          <keyword tid="186555"><![CDATA[active matter]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71911"><![CDATA[Earth and Environment]]></topic>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="642447">  <title><![CDATA[Spontaneous Robot Dances Highlight a New Kind of Order in Active Matter]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Predicting when and how collections of particles, robots, or animals become orderly remains a challenge across science and engineering.</p><p>In the 19th century, scientists and engineers developed the discipline of statistical mechanics, which predicts how groups of simple particles transition between order and disorder, as when a collection of randomly colliding atoms freezes to form a uniform crystal lattice.</p><p>More challenging to predict are the collective behaviors that can be achieved when the particles become more complicated, such that they can move under their own power. 
This type of system &mdash; observed in bird flocks, bacterial colonies, and robot swarms &mdash; goes by the name &ldquo;active matter.&rdquo;</p><p>As reported in the January 1, 2021 issue of the journal <em>Science</em>, a team of physicists and engineers has proposed a new principle by which active matter systems can spontaneously order, without need for higher level instructions or even programmed interaction among the agents. And they have demonstrated this principle in a variety of systems, including groups of periodically shape-changing robots called &quot;smarticles&quot; &mdash; smart, active particles.</p><p>The theory, developed by Postdoctoral Researcher Pavel Chvykov at the Massachusetts Institute of Technology while a student of Prof. Jeremy England, who is now a researcher in the <a href="http://www.physics.gatech.edu">School of Physics</a> at the Georgia Institute of Technology, posits that certain types of active matter with sufficiently messy dynamics will spontaneously find what the researchers refer to as &quot;low rattling&quot; states.</p><p>&ldquo;Rattling is when matter takes energy flowing into it and turns it into random motion,&rdquo; England said. &ldquo;Rattling can be greater either when the motion is more violent, or more random. Conversely, low rattling is either very slight or highly organized &mdash; or both. So, the idea is that if your matter and energy source allow for the possibility of a low rattling state, the system will randomly rearrange until it finds that state and then gets stuck there. If you supply energy through forces with a particular pattern, this means the selected state will discover a way for the matter to move that finely matches that pattern.&rdquo;</p><p>To develop their theory, England and Chvykov took inspiration from a phenomenon &mdash; dubbed thermophoresis &mdash; discovered by the Swiss physicist Charles Soret in the late 19th century. In his experiments, Soret found that subjecting an initially uniform salt solution in a tube to a difference in temperature would spontaneously lead to an increase in salt concentration in the colder region &mdash; which corresponds to an increase in order of the solution.&nbsp;</p><p>Chvykov and England developed numerous mathematical models to demonstrate the low rattling principle, but it wasn&#39;t until they connected with <a href="https://physics.gatech.edu/user/daniel-goldman">Daniel Goldman</a>, Dunn Family Professor of Physics at the Georgia Institute of Technology, that they were able to test their predictions.&nbsp;</p><p>Said Goldman, &quot;A few years back, I saw England give a seminar and thought that some of our smarticle robots might prove valuable to test this theory.&quot; Working with Chvykov, who visited Goldman&#39;s lab, Ph.D. students William Savoie and Akash Vardhan used three flapping smarticles enclosed in a ring to compare experiments to theory. The students observed that instead of displaying complicated dynamics and exploring the container completely, the robots would spontaneously self-organize into a few dances &mdash; for example, one dance consists of three robots slapping each other&#39;s arms in sequence. These dances could persist for hundreds of flaps, but suddenly lose stability and be replaced by a dance of a different pattern.</p>
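<p>England&rsquo;s description suggests a simple intuition: configurations that turn incoming energy into a lot of random motion are quickly shuffled onward, while quiet, organized configurations are sticky. The toy simulation below is an illustrative sketch of that intuition only, not the model from the <em>Science</em> paper; the states and their rattling values are invented. Even though every rearrangement is blind, occupancy piles up in the low-rattling states:</p><pre><code># Toy "low rattling" demo: the more a configuration rattles, the sooner
# the drive kicks the system out of it. (Illustrative sketch only; the
# rattling values below are made up, not measured quantities.)
import random

random.seed(0)

N_STATES = 10
rattling = [random.uniform(0.05, 1.0) for _ in range(N_STATES)]

state, visits = 0, [0] * N_STATES
for _ in range(200_000):
    visits[state] += 1
    # Escape probability per step grows with the current state's rattling.
    if random.random() < rattling[state]:
        state = random.randrange(N_STATES)  # blind random rearrangement

total = sum(visits)
for i in sorted(range(N_STATES), key=lambda j: rattling[j]):
    print(f"state {i}: rattling={rattling[i]:.2f}  occupancy={visits[i] / total:.3f}")
# The lowest-rattling states dominate the long-run occupancy even though
# every rearrangement step is completely random.
</code></pre><p>In this caricature, a state&rsquo;s steady-state occupancy is inversely proportional to its rattling, which is the sense in which blind dynamics &ldquo;finds&rdquo; low-rattling states and gets stuck there.</p>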
<p>After first demonstrating that these simple dances were indeed low rattling states, Chvykov worked with engineers at Northwestern University, Prof. Todd Murphey and Ph.D. student Thomas Berrueta, who developed more refined and better controlled smarticles. The improved smarticles allowed the researchers to test the limits of the theory, including how the types and number of dances varied for different arm flapping patterns, as well as how these dances could be controlled. &quot;By controlling sequences of low rattling states, we were able to make the system reach configurations that do useful work,&quot; Berrueta said. The Northwestern University researchers say that these findings may have broad practical implications for micro-robotic swarms, active matter, and metamaterials.</p><p>As England noted: &ldquo;For robot swarms, it&rsquo;s about getting many adaptive and smart group behaviors that you can design to be realized in a single swarm, even though the individual robots are relatively cheap and computationally simple. For living cells and novel materials, it might be about understanding what the &lsquo;swarm&rsquo; of atoms or proteins can get you, as far as new material or computational properties.&rdquo;</p><p>The study&rsquo;s Georgia Tech-based team includes Jeremy L. England, a Physics of Living Systems scientist affiliated with the School of Physics; Dunn Family Professor Daniel Goldman; Professor Kurt Wiesenfeld; and graduate students Akash Vardhan (Quantitative Biosciences) and William Savoie (School of Physics). They join Pavel Chvykov (Massachusetts Institute of Technology), along with Professor Todd D. Murphey and graduate students Thomas A. Berrueta and Alexander Samland of Northwestern University.</p><p>This material is based on work supported by the Army Research Office under awards ARO W911NF-18-1-0101, ARO MURI Award W911NF-19-1-0233, and ARO W911NF-13-1-0347; by the National Science Foundation under grants PoLS-0957659, PHY-1205878, DMR-1551095, and NSF CBET-1637764; by the James S. McDonnell Foundation Scholar Grant 220020476; and by the Georgia Institute of Technology Dunn Family Professorship. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the sponsoring agencies.</p><p><strong>CITATION</strong>: Chvykov &amp; Berrueta, et al., &ldquo;Low rattling: A predictive principle for self-organization in active collectives,&rdquo; (Science 2021).&nbsp;<a href="https://science.sciencemag.org/content/371/6524/90/tab-pdf">https://science.sciencemag.org/content/371/6524/90/tab-pdf</a></p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1609442532</created>  <gmt_created>2020-12-31 19:22:12</gmt_created>  <changed>1609442995</changed>  <gmt_changed>2020-12-31 19:29:55</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have proposed a new principle by which active matter systems can spontaneously order, without need for higher level instructions.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have proposed a new principle by which active matter systems can spontaneously order, without need for higher level instructions.]]></sentence>  <summary><![CDATA[<p>Researchers have proposed a new principle by which active matter systems can spontaneously order, without need for higher level instructions or even programmed interaction among the agents.
And they have demonstrated this principle in a variety of systems, including groups of periodically shape-changing robots called &quot;smarticles.&quot;</p>]]></summary>  <dateline>2020-12-31T00:00:00-05:00</dateline>  <iso_dateline>2020-12-31T00:00:00-05:00</iso_dateline>  <gmt_dateline>2020-12-31 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>642445</item>          <item>642446</item>      </media>  <hg_media>          <item>          <nid>642445</nid>          <type>image</type>          <title><![CDATA[Swarm of smarticles]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[angle1.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/angle1.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/angle1.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/angle1.png?itok=lY9EpBgf]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Smarticles in a ring]]></image_alt>                    <created>1609442091</created>          <gmt_created>2020-12-31 19:14:51</gmt_created>          <changed>1609442091</changed>          <gmt_changed>2020-12-31 19:14:51</gmt_changed>      </item>          <item>          <nid>642446</nid>          <type>image</type>          <title><![CDATA[Possible smarticle shapes]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[composite2.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/composite2.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/composite2.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/composite2.png?itok=NdOMxD2s]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Composite smarticle shapes]]></image_alt>                    <created>1609442220</created>          <gmt_created>2020-12-31 19:17:00</gmt_created>          <changed>1609442220</changed>          <gmt_changed>2020-12-31 19:17:00</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="126011"><![CDATA[School of Physics]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="182389"><![CDATA[smarticle]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="186555"><![CDATA[active matter]]></keyword>        
  <keyword tid="186556"><![CDATA[order]]></keyword>          <keyword tid="47881"><![CDATA[Dan Goldman]]></keyword>      </keywords>  <core_research_areas>          <term tid="39531"><![CDATA[Energy and Sustainable Infrastructure]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="639322">  <title><![CDATA[Extending Origami Into Untethered Robots and Morphing Devices]]></title>  <uid>27303</uid>  <body><![CDATA[<p>A team of researchers from The Ohio State University and the Georgia Institute of Technology has extended the possibility of origami, the ancient art of paper folding, for modern engineering applications such as untethered robotics and morphing devices.&nbsp;</p><p>The researchers demonstrated for the first time a multifunctional, magnetically responsive origami system, possessing distributed, untethered control capabilities. The untethered magnetic actuation separates the power source and controller out of the system, allowing scalable applications.</p><p>Researchers foresee that this actuation solution can be applied locally and remotely on complex origami assemblies. The actuation strategy enables a myriad of new applications, ranging from morphing robotics and satellites to biomedical devices.</p><p>&ldquo;By distributively integrating the programmed magnetic soft materials into the bi-stable origami assembly, the magnetic actuation provides independent control of the folding and unfolding of each unit cell with instantaneous shape locking, which enables various robotic motion for functions such as tunable physical properties and configurable electronics for digital computing,&rdquo; said principal investigator Ruike (Renee) Zhao, an assistant professor in the Department of Mechanical and Aerospace Engineering at Ohio State.</p><p>The research, &quot;Untethered control of functional origami microrobots with distributed actuation,&quot; was reported Sept. 14 in the journal <em>Proceedings of the National Academy of Sciences</em>. The work was sponsored by the National Science Foundation (NSF).</p><p>Researchers have explored for decades how to leverage origami folding techniques in advanced engineering applications, such as morphing structures and devices. However, most actuation methods require physical bonds to external stimuli and lead to excessive wiring to provide the driving force for origami folding.</p><p>The new, untethered system is free from those rigid and often relatively bulky power sources, allowing faster speed and distributed actuation of the multifunctional structure.</p><p>To demonstrate this, researchers constructed a system of magnetic-responsive materials in a cylindrical origami pattern that consists of identical triangular panels known as a Kresling pattern. 
<p>&ldquo;The Kresling pattern offers a very rich design space, which was crucial in coupling its mechanical response with magnetically responsive materials to achieve on-demand, untethered actuation, including our multifunctional origami for digital computing,&rdquo; said <a href="https://cee.gatech.edu/people/Faculty/6709/overview">Glaucio Paulino</a>, professor and Raymond Allen Jones Chair in the Georgia Tech <a href="http://www.cee.gatech.edu">School of Civil and Environmental Engineering</a>.</p><p>By controlling the magnetic field, researchers were able to control the direction, intensity, and speed of the material&rsquo;s folding and deployment. In the tests, researchers achieved untethered actuation as fast as one tenth of a second with instantaneous shape locking.</p><p>Next, researchers attached a magnetized plate to each of the Kresling unit cells. This allowed them to utilize a two-dimensional magnetic field to actuate the unit cells simultaneously or independently by using different magnetic torques of the plates and distinct geometric-mechanical properties of each unit cell.</p><p>&ldquo;The multi-unit Kresling assembly is an origami robot in which the bi-stable folding and unfolding create robotic motion. It can passively sense and actively respond to the external environment. By integrating electronic circuits into the origami robot, it further enables intelligent autonomous robots with integrated actuation, sensing, and decision making,&rdquo; Zhao said. &ldquo;For example, the external pressure or forces that act on the robot will trigger the passive folding of the robot, indicating the presence of an obstacle. The robot can then actively unfold itself and decide the next move.&rdquo;</p><p>The untethered magnetic control pushes the boundary of the application of origami systems, which could lead to next-generation biomimetic soft robots and robotic systems for advanced engineering applications.</p><p>&ldquo;We anticipate that the reported magnetic origami system is applicable beyond the bounds of this work, including future origami-inspired robots, morphing mechanisms, biomedical devices, and outer space structures,&rdquo; Paulino said.&nbsp;</p><p>This research was supported by Prof. Zhao&rsquo;s two recent NSF Awards from the Mechanics of Materials and Structures program (NSF Award #1943070, #1939543) and Ohio State&rsquo;s Institute of Material Research. The authors at Georgia Tech acknowledge NSF (Award #1538830) and the Raymond Allen Jones Chair. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.</p><p>- Written by The Ohio State University</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (jtoon@gatech.edu).</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1600695020</created>  <gmt_created>2020-09-21 13:30:20</gmt_created>  <changed>1600695721</changed>  <gmt_changed>2020-09-21 13:42:01</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have extended the possibility of origami for modern engineering applications such as untethered robotics and morphing devices.
]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have extended the possibility of origami for modern engineering applications such as untethered robotics and morphing devices. ]]></sentence>  <summary><![CDATA[<p>A team of researchers from The Ohio State University and the Georgia Institute of Technology has extended the possibility of origami, the ancient art of paper folding, for modern engineering applications such as untethered robotics and morphing devices.&nbsp;</p>]]></summary>  <dateline>2020-09-21T00:00:00-04:00</dateline>  <iso_dateline>2020-09-21T00:00:00-04:00</iso_dateline>  <gmt_dateline>2020-09-21 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>639320</item>          <item>639321</item>      </media>  <hg_media>          <item>          <nid>639320</nid>          <type>image</type>          <title><![CDATA[Extending Origami]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[origami-robot2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/origami-robot2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/origami-robot2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/origami-robot2.jpg?itok=n69YdQJ2]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Origami-based robots]]></image_alt>                    <created>1600694572</created>          <gmt_created>2020-09-21 13:22:52</gmt_created>          <changed>1600694572</changed>          <gmt_changed>2020-09-21 13:22:52</gmt_changed>      </item>          <item>          <nid>639321</nid>          <type>image</type>          <title><![CDATA[Extending Origami - 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[origami-robot.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/origami-robot.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/origami-robot.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/origami-robot.jpg?itok=Gw-M2QlJ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Origami robot]]></image_alt>                    <created>1600694640</created>          <gmt_created>2020-09-21 13:24:00</gmt_created>          <changed>1600694640</changed>          <gmt_changed>2020-09-21 13:24:00</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords> 
         <keyword tid="4332"><![CDATA[origami]]></keyword>          <keyword tid="185892"><![CDATA[origami robotics]]></keyword>          <keyword tid="185893"><![CDATA[morphing devices]]></keyword>          <keyword tid="185894"><![CDATA[magnetically responsive]]></keyword>      </keywords>  <core_research_areas>          <term tid="39471"><![CDATA[Materials]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="636291">  <title><![CDATA[‘SlothBot in the Garden’ Demonstrates Hyper-Efficient Conservation Robot]]></title>  <uid>27303</uid>  <body><![CDATA[<p>For the next several months, visitors to the <a href="https://atlantabg.org/">Atlanta Botanical Garden</a> will be able to observe the testing of a new high-tech tool in the battle to save some of the world&rsquo;s most endangered species. SlothBot, a slow-moving and energy-efficient robot that can linger in the trees to monitor animals, plants, and the environment below, will be tested near the Garden&rsquo;s popular Canopy Walk.</p><p>Built by robotics engineers at the Georgia Institute of Technology to take advantage of the low-energy lifestyle of real sloths, SlothBot demonstrates how being slow can be ideal for certain applications. Powered by solar panels and using innovative power management technology, SlothBot moves along a cable strung between two large trees as it monitors temperature, weather, carbon dioxide levels, and other information in the Garden&rsquo;s 30-acre midtown Atlanta forest.</p><p>&ldquo;SlothBot embraces slowness as a design principle,&rdquo; said <a href="https://www.ece.gatech.edu/faculty-staff-directory/magnus-egerstedt-0">Magnus Egerstedt</a>, professor and Steve W. Chaddick School Chair in the Georgia Tech <a href="http://www.ece.gatech.edu">School of Electrical and Computer Engineering</a>. &ldquo;That&rsquo;s not how robots are typically designed today, but being slow and hyper-energy efficient will allow SlothBot to linger in the environment to observe things we can only see by being present continuously for months, or even years.&rdquo;</p><p>About three feet long, SlothBot&rsquo;s whimsical 3D-printed shell helps protect its motors, gearing, batteries, and sensing equipment from the weather. The robot is programmed to move only when necessary, and will locate sunlight when its batteries need recharging. At the Atlanta Botanical Garden, SlothBot will operate on a single 100-foot cable, but in larger environmental applications, it will be able to switch from cable to cable to cover more territory.</p><p>&ldquo;The most exciting goal we&rsquo;ll demonstrate with SlothBot is the union of robotics and technology with conservation,&rdquo; said <a href="https://atlantabg.org/article/emily-e-d-coffey-ph-d/">Emily Coffey</a>, vice president for conservation and research at the Garden. 
<p>&ldquo;The most exciting goal we&rsquo;ll demonstrate with SlothBot is the union of robotics and technology with conservation,&rdquo; said <a href="https://atlantabg.org/article/emily-e-d-coffey-ph-d/">Emily Coffey</a>, vice president for conservation and research at the Garden. &ldquo;We do conservation research on imperiled plants and ecosystems around the world, and SlothBot will help us find new and exciting ways to advance our research and conservation goals.&rdquo;</p><p>Supported by the National Science Foundation and the Office of Naval Research, SlothBot could help scientists better understand the abiotic factors affecting critical ecosystems, providing a new tool for developing information needed to protect rare species and endangered ecosystems.</p><p>&ldquo;SlothBot could do some of our research remotely and help us understand what&rsquo;s happening with pollinators, interactions between plants and animals, and other phenomena that are difficult to observe otherwise,&rdquo; Coffey added. &ldquo;With the rapid loss of biodiversity and with more than a quarter of the world&rsquo;s plants potentially heading toward extinction, SlothBot offers us another way to work toward conserving those species.&rdquo;</p><p>Inspiration for the robot came from a visit Egerstedt made to a vineyard in Costa Rica where he saw two-toed sloths creeping along overhead wires in their search for food in the tree canopy. &ldquo;It turns out that they were strategically slow, which is what we need if we want to deploy robots for long periods of time,&rdquo; he said.</p><p>A few other robotic systems have already demonstrated the value of slowness. Among the best known are the Mars Exploration Rovers that gathered information on the red planet for more than a dozen years. &ldquo;Speed wasn&rsquo;t really all that important to the Mars Rovers,&rdquo; Egerstedt noted. &ldquo;But they learned a lot during their leisurely exploration of the planet.&rdquo;</p><p>Beyond conservation, SlothBot could have applications for precision agriculture, where the robot&rsquo;s camera and other sensors traveling on overhead wires could provide early detection of crop diseases, measure humidity, and watch for insect infestation. After testing in the Atlanta Botanical Garden, the researchers hope to move SlothBot to South America to observe orchid pollination or the lives of endangered frogs.</p><p>The research team, which includes Ph.D. students Gennaro Notomista and Yousef Emam, undergraduate student Amy Yao, and postdoctoral researcher Sean Wilson, considered multiple locomotion techniques for the SlothBot. Wheeled robots are common, but in the natural world they can easily be defeated by obstacles like rocks or mud. Flying robots require too much energy to linger for long. That&rsquo;s why Egerstedt&rsquo;s observation of the wire-crawling sloths was so important.</p><p>&ldquo;It&rsquo;s really fascinating to think about robots becoming part of the environment, a member of an ecosystem,&rdquo; he said. &ldquo;While we&rsquo;re not building an anatomical replica of the living sloth, we believe our robot can be integrated to be part of the ecosystem it&rsquo;s observing like a real sloth.&rdquo;</p><p>The SlothBot launched in the Atlanta Botanical Garden is the second version of a system originally reported in May 2019 at the International Conference on Robotics and Automation. That robot was a much smaller laboratory prototype.</p><p>Beyond their conservation goals, the researchers hope SlothBot will provide a new way to stimulate interest in conservation from the Garden&rsquo;s visitors. &ldquo;This will help us tell the story of the merger between technology and conservation,&rdquo; Coffey said.
&ldquo;It&rsquo;s a unique way to engage the public and bring forward a new way to tell our story.&rdquo;</p><p>And that should be especially interesting to children visiting the Garden.</p><p>&ldquo;This new way of thinking about robots should trigger curiosity among the kids who will walk by it,&rdquo; said Egerstedt. &ldquo;Thanks to SlothBot, I&rsquo;m hoping we will get an entirely new generation interested in what robotics can do to make the world better.&rdquo;</p><p><em>This research was sponsored by the U.S. Office of Naval Research through Grant N00014-15-2115 and by the National Science Foundation through Grant 1531195. The content is solely the responsibility of the authors and does not necessarily represent the official views of the sponsoring agencies.</em></p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contacts</strong>: John Toon, Georgia Tech (404-894-6986) (jtoon@gatech.edu); Danny Flanders, Atlanta Botanical Garden (404-591-1550) (dflanders@atlantabg.org).</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1592360291</created>  <gmt_created>2020-06-17 02:18:11</gmt_created>  <changed>1592360376</changed>  <gmt_changed>2020-06-17 02:19:36</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Visitors to the Atlanta Botanical Garden can observe the testing of SlothBot, a new high-tech tool in the battle to save some of the world’s most endangered species.]]></teaser>  <type>news</type>  <sentence><![CDATA[Visitors to the Atlanta Botanical Garden can observe the testing of SlothBot, a new high-tech tool in the battle to save some of the world’s most endangered species.]]></sentence>  <summary><![CDATA[<p>For the next several months, visitors to the Atlanta Botanical Garden will be able to observe the testing of a new high-tech tool in the battle to save some of the world&rsquo;s most endangered species. 
SlothBot, a slow-moving and energy-efficient robot that can linger in the trees to monitor animals, plants, and the environment below, will be tested near the Garden&rsquo;s popular Canopy Walk.</p>]]></summary>  <dateline>2020-06-16T00:00:00-04:00</dateline>  <iso_dateline>2020-06-16T00:00:00-04:00</iso_dateline>  <gmt_dateline>2020-06-16 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>636285</item>          <item>636284</item>          <item>636283</item>          <item>636287</item>          <item>636288</item>          <item>636289</item>      </media>  <hg_media>          <item>          <nid>636285</nid>          <type>image</type>          <title><![CDATA[SlothBot operating in Atlanta Botanical Garden - 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[slothbot-16.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/slothbot-16.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/slothbot-16.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/slothbot-16.jpg?itok=uHP47Pbr]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[SlothBot at Atlanta Botanical Garden]]></image_alt>                    <created>1592358753</created>          <gmt_created>2020-06-17 01:52:33</gmt_created>          <changed>1592358753</changed>          <gmt_changed>2020-06-17 01:52:33</gmt_changed>      </item>          <item>          <nid>636284</nid>          <type>image</type>          <title><![CDATA[SlothBot research team at Atlanta Botanical Garden]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[slothbot-08.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/slothbot-08.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/slothbot-08.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/slothbot-08.jpg?itok=_j1By8pJ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[SlothBot research team]]></image_alt>                    <created>1592358513</created>          <gmt_created>2020-06-17 01:48:33</gmt_created>          <changed>1592358803</changed>          <gmt_changed>2020-06-17 01:53:23</gmt_changed>      </item>          <item>          <nid>636283</nid>          <type>image</type>          <title><![CDATA[SlothBot operating in Atlanta Botanical Garden]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[slothbot-18.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/slothbot-18.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/slothbot-18.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/slothbot-18.jpg?itok=4Oenlq6R]]></image_740>            
<image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[SlothBot at Atlanta Botanical Garden]]></image_alt>                    <created>1592358388</created>          <gmt_created>2020-06-17 01:46:28</gmt_created>          <changed>1592358388</changed>          <gmt_changed>2020-06-17 01:46:28</gmt_changed>      </item>          <item>          <nid>636287</nid>          <type>image</type>          <title><![CDATA[Georgia Tech - Atlanta Botanical Garden Collaboration]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[slothbot-11.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/slothbot-11_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/slothbot-11_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/slothbot-11_0.jpg?itok=dBoHSmcM]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Magnus Egerstedt and Emily Coffey]]></image_alt>                    <created>1592359063</created>          <gmt_created>2020-06-17 01:57:43</gmt_created>          <changed>1592359149</changed>          <gmt_changed>2020-06-17 01:59:09</gmt_changed>      </item>          <item>          <nid>636288</nid>          <type>image</type>          <title><![CDATA[Magnus Egerstedt and SlothBot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[slothbot-14.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/slothbot-14.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/slothbot-14.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/slothbot-14.jpg?itok=Idwf3TAe]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Magnus Egerstedt with SlothBot]]></image_alt>                    <created>1592359280</created>          <gmt_created>2020-06-17 02:01:20</gmt_created>          <changed>1592359280</changed>          <gmt_changed>2020-06-17 02:01:20</gmt_changed>      </item>          <item>          <nid>636289</nid>          <type>image</type>          <title><![CDATA[SlothBot in the Lab]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[slothbot_3044.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/slothbot_3044.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/slothbot_3044.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/slothbot_3044.jpg?itok=53UIzS3V]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[SlothBot researchers in the lab]]></image_alt>                    <created>1592359383</created>          <gmt_created>2020-06-17 02:03:03</gmt_created>          <changed>1592359383</changed>          <gmt_changed>2020-06-17 02:03:03</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category 
tid="145"><![CDATA[Engineering]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39531"><![CDATA[Energy and Sustainable Infrastructure]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71911"><![CDATA[Earth and Environment]]></topic>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="635512">  <title><![CDATA[People Think Robots Are Pretty Incompetent and Not Funny, New Study Says]]></title>  <uid>31759</uid>  <body><![CDATA[<p>Dang robots are crummy at so many jobs, and they tell lousy jokes to boot. In two new studies, these were common biases human participants held toward&nbsp;robots.</p><p>The studies were originally intended to test for gender bias, that is, if people thought a robot believed to be female may be less competent at some jobs than a robot believed to be male and vice versa. The studies&#39; titles even included the words &quot;gender,&quot; &quot;stereotypes,&quot; and &quot;preference,&quot; but researchers at the Georgia Institute of Technology discovered no significant sexism against the machines.</p><p>&ldquo;This did surprise us. There was only a very slight difference in a couple of jobs but not significant. There was, for example, a small preference for a male robot over a female robot as a package deliverer,&rdquo; said Ayanna Howard, the principal investigator in both studies. Howard is a&nbsp;<a href="https://www.ic.gatech.edu/people/ayanna-howard" target="_blank">professor in and the chair of Georgia Tech&rsquo;s School of Interactive Computing</a>.</p><p>Although robots are not sentient, as people increasingly interface with them, we begin to humanize the machines. Howard studies what goes right as we integrate robots into society and what goes wrong, and much of both has to do with how the humans feel around robots.</p><h3><strong>I hate robots</strong></h3><p>&ldquo;Surveillance robots are not socially engaging, but when we see them, we still may act like we would when we see a police officer, maybe not jaywalking and being very conscientious of our behavior,&rdquo; said Howard, who is also&nbsp;<a href="https://www.ece.gatech.edu/faculty-staff-directory/ayanna-maccalla-howard" target="_blank">Linda J. and Mark C. Smith Chair and Professor in Bioengineering in Georgia Tech&rsquo;s School of Electrical and Computer Engineering</a>.</p><p>&ldquo;Then there are emotionally engaging robots designed to tap into our feelings and work with our behavior. 
If you look at these examples, they lead us to treat these robots as if they were fellow intelligent beings.&rdquo;</p><p>It&rsquo;s a good thing robots don&rsquo;t have feelings because what study participants lacked in gender bias they more than made up for in judgments against the humanoid robots&#39; competence. That predisposition was so strong that Howard wondered whether it might have overridden any potential gender biases against robots &ndash; after all, social science studies have shown that gender biases are still prevalent with respect to human jobs, even if implicit.</p><p>In questionnaires, humanoid robots introduced themselves via video to randomly recruited online survey respondents, who ranged in age from their twenties to their seventies and were mostly college-educated. The humans ranked robots&rsquo; career competencies compared to human abilities, only trusting the machines to competently perform a handful of simple jobs.&nbsp;</p><h3><strong>Pass the scalpel</strong></h3><p>&ldquo;The results baffled us because the things that people thought robots were less able to do were things that they do well. One was the profession of surgeon. There are&nbsp;<a href="https://www.davincisurgery.com/procedures/gynecology-surgery" target="_blank">Da Vinci robots that are pervasive in surgical suites</a>, but respondents didn&rsquo;t think robots were competent enough,&rdquo; Howard said. &ldquo;Security guard &ndash; people didn&rsquo;t think robots were competent at that, and there are companies that specialize in great robot security.&rdquo;</p><p>Cumulatively, the 200 participants across the two studies thought robots would also fail as nannies, therapists, nurses, and firefighters, and would totally bomb as comedians. But they felt confident bots would make fantastic package deliverers and receptionists, pretty good servers, and solid tour guides.</p><p>The researchers could not say where the competence biases originate. Howard could only speculate that some of the bad rap may have come from media stories of robots doing things like falling into swimming pools or injuring people.</p><h3><strong>It&rsquo;s a boy</strong>&nbsp;</h3><p>Despite the lack of gender bias, participants readily assigned genders to the humanoid robots. For example, people accepted gender prompts by robots introducing themselves in videos.</p><p>If a robot said, &ldquo;Hi, my name is James,&rdquo; in a male-sounding voice, people mostly identified the robot as male. If it said, &ldquo;Hi, my name is Mary,&rdquo; in a female voice, people mostly said it was female.</p><p>Some robots greeted people by saying &ldquo;Hi&rdquo; in a neutral-sounding voice, and still, most participants assigned the robot a gender. The most common choice was male, followed by neutral, then by female. For Howard, this was an important takeaway from the study for robot developers.</p><p>&ldquo;Developers should not force gender on robots. People are going to gender according to their own experiences. Give the user that right.
Don&rsquo;t reinforce gender stereotypes,&rdquo; Howard said.</p><h3><strong>Social is good</strong></h3><p>Some in the&nbsp;field advocate for not building robots in humanoid form at all in order to discourage any kind of&nbsp;humanization, but the Georgia Tech team takes a less stringent approach.</p><p>&quot;There is no single one-size-fits-all answer on whether it is appropriate to design robots to look like human beings. It depends on a variety of ethical considerations and other factors, including whether people might trust a robot too much&nbsp;if it has a human-like appearance,&quot; said Jason Borenstein, a co-principal investigator on one of the papers and an ethics&nbsp;<a href="https://spp.gatech.edu/people/person/jason-borenstein" target="_blank">researcher in Georgia Tech&#39;s School of Public Policy</a>.</p><p>&ldquo;Robots can be good for social interaction. They could be very helpful in elder care facilities to keep people company. They might also make better nannies than letting the TV babysit the kids,&rdquo; said Howard, who also defended robots&rsquo; comedic talent, provided they are programmed for that.</p><p>&ldquo;If you ever go to an amusement park, there are animatronics that tell really good jokes.&rdquo;</p><h3><strong>Read the studies</strong></h3><p>The two studies were submitted to conferences that were canceled due to COVID-19.</p><p>&ldquo;Why Should We Gender? The Effect of Robot Gendering and Occupational Stereotypes on Human Trust and Perceived Competency&rdquo; was published in&nbsp;<a href="https://doi.org/10.1145/3319502.3374778" target="_blank"><em>Proceedings of 2020 ACM Conference on Human-Robot Interaction (HRI&rsquo;20)</em></a>, which appeared in March 2020. &ldquo;Robot Gendering: Influences on Trust, Occupational Competency, and Preference of Robot Over Human&rdquo; appeared in&nbsp;<em>CHI 2020 Extended Abstracts&nbsp;</em>(computer-human interaction, DOI: 10.1145/3334480.3382930).</p><p>The research was funded by the National Science Foundation and by the Alfred P. Sloan Foundation.</p><p><em>The papers&rsquo; coauthors were De&rsquo;Aira Bryant, Kantwon Rogers, and Jason Borenstein from Georgia Tech. The National Science Foundation funded via grant 1849101. The Alfred P. Sloan Foundation funded via grant G-2019-11435.
Any findings, conclusions, or recommendations are those of the authors and not necessarily of the sponsors.</em></p><p><strong>Also read: <a href="https://rh.gatech.edu/news/635143/surfaces-grip-gecko-feet-could-be-easily-mass-produced" target="_blank">Surfaces that grip like gecko feet may come to an assembly line near you</a></strong></p><p><strong>Here&#39;s how to&nbsp;<a href="https://rh.gatech.edu/subscribe" target="_blank">subscribe to our free science and technology email&nbsp;newsletter</a></strong></p><p><strong>Writer &amp; media inquiries</strong>: Ben Brumfield (404-272-2780), email:&nbsp;<a href="mailto:ben.brumfield@comm.gatech.edu">ben.brumfield@comm.gatech.edu</a></p><p><strong>Georgia Institute of Technology</strong></p>]]></body>  <author>Ben Brumfield</author>  <status>1</status>  <created>1589911110</created>  <gmt_created>2020-05-19 17:58:30</gmt_created>  <changed>1590671873</changed>  <gmt_changed>2020-05-28 13:17:53</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Good thing humanoid robots don't have feelings because people think they are pretty incompetent.]]></teaser>  <type>news</type>  <sentence><![CDATA[Good thing humanoid robots don't have feelings because people think they are pretty incompetent.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2020-05-19T00:00:00-04:00</dateline>  <iso_dateline>2020-05-19T00:00:00-04:00</iso_dateline>  <gmt_dateline>2020-05-19 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>635511</item>          <item>635506</item>          <item>635507</item>      </media>  <hg_media>          <item>          <nid>635511</nid>          <type>image</type>          <title><![CDATA[Incompetent robots not funny]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[robot head.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/robot%20head.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/robot%20head.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/robot%2520head.jpg?itok=6tJVtPWh]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1589910479</created>          <gmt_created>2020-05-19 17:47:59</gmt_created>          <changed>1589910479</changed>          <gmt_changed>2020-05-19 17:47:59</gmt_changed>      </item>          <item>          <nid>635506</nid>          <type>image</type>          <title><![CDATA[Humanoid robots say hi]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Robot intros.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Robot%20intros.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Robot%20intros.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Robot%2520intros.jpg?itok=d50-9gQk]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    
<created>1589909850</created>          <gmt_created>2020-05-19 17:37:30</gmt_created>          <changed>1589909850</changed>          <gmt_changed>2020-05-19 17:37:30</gmt_changed>      </item>          <item>          <nid>635507</nid>          <type>image</type>          <title><![CDATA[Ayanna Howard with humanoid robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[corobots_robot_howard.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/corobots_robot_howard.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/corobots_robot_howard.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/corobots_robot_howard.jpg?itok=WefkR2nU]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1589910140</created>          <gmt_created>2020-05-19 17:42:20</gmt_created>          <changed>1589910140</changed>          <gmt_changed>2020-05-19 17:42:20</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="151"><![CDATA[Policy, Social Sciences, and Liberal Arts]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="151"><![CDATA[Policy, Social Sciences, and Liberal Arts]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="169956"><![CDATA[robot-human interaction]]></keyword>          <keyword tid="86991"><![CDATA[gender bias]]></keyword>          <keyword tid="184850"><![CDATA[no gender bias]]></keyword>          <keyword tid="184851"><![CDATA[lack of gender bias]]></keyword>          <keyword tid="184849"><![CDATA[competency]]></keyword>          <keyword tid="184852"><![CDATA[stereotype]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>          <topic tid="71901"><![CDATA[Society and Culture]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="635326">  <title><![CDATA[Planetary Exploration Rover Avoids Sand Traps with “Rear Rotator Pedaling”]]></title>  <uid>27303</uid>  <body><![CDATA[<p>The rolling hills of Mars or the moon are a long way from the nearest tow truck. 
That&rsquo;s why the next generation of exploration rovers will need to be good at climbing hills covered with loose material and avoiding entrapment on soft granular surfaces.</p><p>Built with wheeled appendages that can be lifted and wheels able to wiggle,&nbsp;a new robot known as the &ldquo;Mini Rover&rdquo; has been used to develop and test complex locomotion techniques robust enough to help it climb hills covered with such granular material &ndash; and avoid the risk of getting ignominiously stuck on some remote planet or moon.&nbsp;</p><p>Using a complex move the researchers dubbed &ldquo;rear rotator pedaling,&rdquo; the robot can climb a slope by using its unique design to combine paddling, walking, and wheel spinning motions. The rover&rsquo;s behaviors were modeled using a branch of physics known as terradynamics.</p><p>&ldquo;When loose materials flow, that can create problems for robots moving across them,&rdquo; said <a href="https://physics.gatech.edu/user/daniel-goldman">Dan Goldman</a>, the Dunn Family Professor in the <a href="http://www.physics.gatech.edu">School of Physics</a> at the Georgia Institute of Technology. &ldquo;This rover has enough degrees of freedom that it can get out of jams pretty effectively. By avalanching materials from the front wheels, it creates a localized fluid hill for the back wheels that is not as steep as the real slope. The rover is always self-generating and self-organizing a good hill for itself.&rdquo;</p><p>The research was reported on May 13 as the cover article in the journal <em>Science Robotics</em>. The work was supported by the NASA National Robotics Initiative and the Army Research Office.</p><p>A robot built by NASA&rsquo;s Johnson Space Center pioneered the ability to spin its wheels, sweep the surface with those wheels, and lift each of its wheeled appendages where necessary, creating a broad range of potential motions. Using in-house 3D printers, the Georgia Tech researchers collaborated with the Johnson Space Center to re-create those capabilities in a scaled-down vehicle with four wheeled appendages driven by 12 different motors.</p><p>&ldquo;The rover was developed with a modular mechatronic architecture, commercially available components, and a minimal number of parts,&rdquo; said Siddharth Shrivastava, an undergraduate student in Georgia Tech&rsquo;s <a href="http://www.me.gatech.edu">George W. Woodruff School of Mechanical Engineering</a>. &ldquo;This enabled our team to use our robot as a robust laboratory tool and focus our efforts on exploring creative and interesting experiments without worrying about damaging the rover, service downtime, or hitting performance limitations.&rdquo;&nbsp;</p><p>The rover&rsquo;s broad range of movements gave the research team an opportunity to test many variations that were studied using granular drag force measurements and modified Resistive Force Theory. Shrivastava and School of Physics Ph.D. candidate Andras Karsai began with the gaits explored by the NASA RP15 robot, and were able to experiment with locomotion schemes that could not have been tested on a full-size rover.</p>
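<p>Resistive force theory approximates the net force on a body moving through granular material by summing independent, empirically calibrated stresses over small elements of its surface, each depending on the element&rsquo;s depth and orientation. The sketch below shows that style of element-by-element bookkeeping for a flat plate; the linear stress law and its coefficients are placeholder assumptions, not the modified RFT or the measured values used in this study:</p><pre><code># Minimal 2D resistive-force-theory-style drag estimate for a flat plate
# dragged through granular material. The linear stress law and the
# coefficients are placeholders, not values from the paper.
import math

C_NORMAL, C_TANGENT = 0.4, 0.1  # assumed stress coefficients per unit depth

def plate_forces(depth, length, attack_deg, n_elems=50):
    """Sum per-element granular stresses over a tilted plate."""
    beta = math.radians(attack_deg)
    d_ell = length / n_elems
    fx = fz = 0.0
    for k in range(n_elems):
        # Depth of this surface element along the inclined plate.
        z = depth + (k + 0.5) * d_ell * math.sin(beta)
        sigma_n = C_NORMAL * z * math.sin(beta)  # depth- and angle-dependent
        sigma_t = C_TANGENT * z                  # crude frictional term
        fx += (sigma_n * math.sin(beta) + sigma_t * math.cos(beta)) * d_ell
        fz += (sigma_n * math.cos(beta) - sigma_t * math.sin(beta)) * d_ell
    return fx, fz  # horizontal drag and vertical reaction, per unit width

for angle in (10, 30, 60, 90):
    drag, lift = plate_forces(depth=2.0, length=5.0, attack_deg=angle)
    print(f"attack angle {angle:2d} deg: drag={drag:6.2f}, lift={lift:6.2f}")
# Steeper plates plow more material and feel more drag; real RFT swaps the
# toy stress law above for empirically measured stress-versus-angle curves.
</code></pre><p>Sweeping attack angle and depth in a model like this is the cheap analogue of the physical gait and terrain sweeps the team ran in the tilted SCATTER bed described below.</p>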
Karsai and Shrivastava collaborated with Yasemin Ozkan-Aydin, a postdoctoral research fellow in Goldman&rsquo;s lab, to study the rover motion in the SCATTER test facility.&nbsp;</p><p>&ldquo;By creating a small robot with capabilities similar to the RP15 rover, we could test the principles of locomoting with various gaits in a controlled laboratory environment,&rdquo; Karsai said. &ldquo;In our tests, we primarily varied the gait, the locomotion medium, and the slope the robot had to climb. We quickly iterated over many gait strategies and terrain conditions to examine the phenomena that emerged.&rdquo;</p><p>In the paper, the authors describe a gait that allowed the rover to climb a steep slope with the front wheels stirring up the granular material &ndash; poppy seeds for the lab testing &ndash; and pushing them back toward the rear wheels. The rear wheels wiggled from side-to-side, lifting and spinning to create a motion that resembles paddling in water. The material pushed to the back wheels effectively changed the slope the rear wheels had to climb, allowing the rover to make steady progress up a hill that might have stopped a simple wheeled robot.</p><p>The experiments provided a variation on earlier robophysics work in Goldman&rsquo;s group that involved moving with legs or flippers, which had emphasized disturbing the granular surfaces as little as possible to avoid getting the robot stuck.</p><p>&ldquo;In our previous studies of pure legged robots, modeled on animals, we had kind of figured out that the secret was to not make a mess,&rdquo; said Goldman. &ldquo;If you end up making too much of a mess with most robots, you end up just paddling and digging into the granular material. If you want fast locomotion, we found that you should try to keep the material as solid as possible by tweaking the parameters of motion.&rdquo;</p><p>But simple motions had proved problematic for Mars rovers, which got stuck in granular materials. Goldman says the gait discovered by Shrivastava, Karsai and Ozkan-Aydin might be able to help future rovers avoid that fate.</p><p>&ldquo;This combination of lifting and wheeling and paddling, if used properly, provides the ability to maintain some forward progress even if it is slow,&rdquo; Goldman said. &ldquo;Through our laboratory experiments, we have shown principles that could lead to improved robustness in planetary exploration &ndash; and even in challenging surfaces on our own planet.&rdquo;</p><p>The researchers hope next to scale up the unusual gaits to larger robots, and to explore the idea of studying robots and their localized environments together. &ldquo;We&rsquo;d like to think about the locomotor and its environment as a single entity,&rdquo; Goldman said. &ldquo;There are certainly some interesting granular and soft matter physics issues to explore.&rdquo;</p><p>Though the Mini Rover was designed to study lunar and planetary exploration, the lessons learned could also be applicable to terrestrial locomotion &ndash; an area of interest to the Army Research Laboratory, one of the project&rsquo;s sponsors.</p><p>&quot;This basic research is revealing exciting new approaches for locomotion in complex terrain,&quot; said Dr. Samuel Stanton, program manager, Army Research Office, an element of the U.S. Army Combat Capabilities Development Command&#39;s Army Research Laboratory. 
&quot;This could lead to platforms capable of intelligently transitioning between wheeled and legged modes of movement to maintain high operational tempo.&quot;</p><p>Beyond those already mentioned, the researchers worked with Robert Ambrose and William Bluethmann at NASA, and traveled to NASA JSC to study the full-size NASA RP15 rover.</p><p><em>This work was supported by the Army Research Office (W911NF-18-1-0120) and the NASA National Robotics Initiative (NNX15AR21G). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the sponsoring agencies.</em></p><p><strong>CITATION</strong>: Siddharth Shrivastava, Andras Karsai, Yasemin Ozkan-Aydin, Ross Pettinger, William Bluethmann, Robert O. Ambrose, Daniel I. Goldman, &ldquo;Material remodeling on granular terrain yields robustness benefits for a robophysical rover.&rdquo; (Science Robotics, May 2020)</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (jtoon@gatech.edu)</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1589379767</created>  <gmt_created>2020-05-13 14:22:47</gmt_created>  <changed>1589392259</changed>  <gmt_changed>2020-05-13 17:50:59</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Using the Mini Rover, researchers have studied locomotion techniques that could help future rovers work on granular lunar and planetary surfaces.]]></teaser>  <type>news</type>  <sentence><![CDATA[Using the Mini Rover, researchers have studied locomotion techniques that could help future rovers work on granular lunar and planetary surfaces.]]></sentence>  <summary><![CDATA[<p>Built with wheeled appendages that can be lifted and wheels able to wiggle, a new robot known as the &ldquo;Mini Rover&rdquo; has developed and tested complex locomotion techniques robust enough to help it climb hills covered with granular material &ndash; and avoid the risk of getting ignominiously stuck on some remote planet or moon.&nbsp;</p>]]></summary>  <dateline>2020-05-13T00:00:00-04:00</dateline>  <iso_dateline>2020-05-13T00:00:00-04:00</iso_dateline>  <gmt_dateline>2020-05-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>635320</item>          <item>635321</item>          <item>635322</item>          <item>635323</item>          <item>635324</item>      </media>  <hg_media>          <item>          <nid>635320</nid>          <type>image</type>          <title><![CDATA[Mini Rover moving on sand]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[mini-rover-1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/mini-rover-1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/mini-rover-1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/mini-rover-1.jpg?itok=O6UHgYQA]]></image_740>            
<image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mini Rover in sand]]></image_alt>                    <created>1589378228</created>          <gmt_created>2020-05-13 13:57:08</gmt_created>          <changed>1589378228</changed>          <gmt_changed>2020-05-13 13:57:08</gmt_changed>      </item>          <item>          <nid>635321</nid>          <type>image</type>          <title><![CDATA[Mini Rover moving on sand - 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[mini-rover-2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/mini-rover-2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/mini-rover-2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/mini-rover-2.jpg?itok=wS25BYpc]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mini Rover in sand]]></image_alt>                    <created>1589378378</created>          <gmt_created>2020-05-13 13:59:38</gmt_created>          <changed>1589378378</changed>          <gmt_changed>2020-05-13 13:59:38</gmt_changed>      </item>          <item>          <nid>635322</nid>          <type>image</type>          <title><![CDATA[Mini Rover in laboratory track bed]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[mini-rover-5.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/mini-rover-5.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/mini-rover-5.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/mini-rover-5.jpg?itok=taz499Lv]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mini Rover in track bed]]></image_alt>                    <created>1589378574</created>          <gmt_created>2020-05-13 14:02:54</gmt_created>          <changed>1589378574</changed>          <gmt_changed>2020-05-13 14:02:54</gmt_changed>      </item>          <item>          <nid>635323</nid>          <type>image</type>          <title><![CDATA[Mini Rover tested on simulated hill]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[mini-rover-4.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/mini-rover-4.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/mini-rover-4.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/mini-rover-4.jpg?itok=kJb-wYI9]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mini Rover in fluidized bed]]></image_alt>                    <created>1589378747</created>          <gmt_created>2020-05-13 14:05:47</gmt_created>          <changed>1589378747</changed>          <gmt_changed>2020-05-13 14:05:47</gmt_changed>      </item>          <item>          <nid>635324</nid>          <type>image</type>          <title><![CDATA[Close up of Mini Rover appendage]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[mini-rover-3.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/mini-rover-3.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/mini-rover-3.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/mini-rover-3.jpg?itok=vdgE6Cu-]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Appendage for Mini Rover]]></image_alt>                    <created>1589378900</created>          <gmt_created>2020-05-13 14:08:20</gmt_created>          <changed>1589378900</changed>          <gmt_changed>2020-05-13 14:08:20</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="184799"><![CDATA[Mini Rover]]></keyword>          <keyword tid="7057"><![CDATA[Mars]]></keyword>          <keyword tid="184802"><![CDATA[planetary exploration]]></keyword>          <keyword tid="184805"><![CDATA[lunar exploration]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="47881"><![CDATA[Dan Goldman]]></keyword>          <keyword tid="184807"><![CDATA[granular material]]></keyword>          <keyword tid="62221"><![CDATA[terradynamics]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="631809">  <title><![CDATA[Robotic Submarine Snaps First-Ever Images at Foundation of Notorious Antarctic Glacier]]></title>  <uid>31759</uid>  <body><![CDATA[<p>During an unprecedented scientific campaign on an Antarctic glacier notorious for its contributions to sea-level rise, researchers took the first-ever images at the glacier&rsquo;s foundations on the ocean floor. The area is key to Thwaites Glacier&rsquo;s potential to become more dangerous, and in the coming months, the research team hopes to give the world a clearer picture of its condition.</p><p>The images, taken by a robotic underwater vehicle, were part of a broad set of data collected in a variety of experiments by an international team.
The&nbsp;<a href="https://thwaitesglacier.org/" rel="noopener noreferrer" target="_blank">International Thwaites Glacier Collaboration</a>&nbsp;(ITGC)&nbsp;<a href="https://thwaitesglacier.org/news/scientists-drill-first-time-remote-antarctic-glacier" rel="noopener noreferrer" target="_blank">announced the completion of this first-ever major research venture</a>&nbsp;on the glacier coincident with the 200-year anniversary of the discovery of Antarctica in 1820.</p><p>Already, Thwaites accounts for about four percent of global sea-level rise. Researchers have had concerns that a tipping point in the stability at its foundations could result in a run-away collapse of the glacier and boost sea levels by as much as 25 inches. By studying multiple aspects of Thwaites, the ITGC wants to understand more about the likelihood that the glacier the size of Florida may reach such instability in the coming decades.</p><h3><strong>Line of concern</strong></h3><p>The area of concern that the underwater vehicle visited is called the grounding line, and it is important to the stability of Thwaites Glacier&rsquo;s footing. It is the line between where the glacier rests on the ocean bed and where it floats over water. The farther back the grounding line recedes, the faster the ice can flow into the sea, pushing up sea-level.&nbsp;</p><p>&ldquo;Visiting the grounding line is one of the reasons work like this is important because we can drive right up to it and actually measure where it is,&rdquo; said Britney Schmidt, an ITGC co-investigator from the Georgia Institute of Technology. &ldquo;It&#39;s the first time anyone has done that or has ever even seen the grounding zone of a major glacier under the water, and that&rsquo;s the place where the greatest degree of melting and destabilization can occur.&rdquo;</p><p>The underwater robot,&nbsp;<a href="https://schmidt.eas.gatech.edu/icefin/" rel="noopener noreferrer" target="_blank">Icefin, was engineered by Schmidt&rsquo;s Georgia Tech lab</a>. The Georgia Tech team was part of a greater collaboration between researchers from the U.S. and the British Antarctic Survey (BAS), who lived and worked on Thwaites in December and January. A BAS hot water drill melted a hole 590 meters deep (1,935 feet) to access the ocean cavity for Icefin.</p><p>&ldquo;Icefin swam over 15 km (9.3 miles) round trip during five missions.&nbsp;This included two passes up to the grounding zone, including one where we got as close as we physically could to the place where the seafloor meets the ice,&rdquo; said Schmidt, who is&nbsp;<a href="https://schmidt.eas.gatech.edu/" rel="noopener noreferrer" target="_blank">an associate professor in Georgia Tech&rsquo;s School of Earth and Atmospheric Sciences</a>. &ldquo;We saw amazing ice interactions driven by sediments at the line and from the rapid melting from warm ocean water.&rdquo;</p><p><sup><em>[Ready for graduate school?&nbsp;<a href="http://www.gradadmiss.gatech.edu/apply-now" target="_blank">Here&#39;s how to apply to Georgia Tech.</a>]&nbsp;</em></sup></p><h3><strong>Historic research venture</strong></h3><p>In the coming months and years, the ITGC team made up of researchers from multiple universities and research institutions in the U.S. 
and the UK will publish studies with thorough findings based on the unprecedented data collected during the field campaign.</p><p>The array of research the scientists carried out included seismic and radar measurements, as well as the use of hot water drills to make holes between 300 and 700 meters (985 and 2,300 feet) deep down to the ocean and glacier bed below Thwaites&rsquo; ice. Researchers also took cores of sediment from the seafloor and under parts of the glacier grounded on the bed to examine the quality of the foothold that it offers Thwaites.</p><p>&ldquo;We know that warmer ocean waters are eroding many of West Antarctica&rsquo;s glaciers, but we&rsquo;re particularly concerned about Thwaites. This new data will provide a new perspective of the processes taking place, so we can predict future change with more certainty,&rdquo; said Keith Nicholls, an oceanographer from the British Antarctic Survey.</p><p>Nicholls is a co-principal investigator on the project that involved Schmidt along with David Holland of New York University. The research is funded by the National Science Foundation, the UK Natural Environment Research Council, the U.S. Antarctic Program, and the British Antarctic Survey.</p><h3><strong>Antarctica sea-level background</strong></h3><p>Over the past 30 years, the amount of ice flowing to the sea from Thwaites and its neighboring glaciers has nearly doubled.</p><p>&ldquo;While Greenland&#39;s contribution to sea level has already reached an alarming rate, Antarctica is just now picking up its contributions to sea level,&rdquo; Schmidt said. &ldquo;It has the largest body of ice on Earth and will contribute more and more to sea-level rise over the next 100 years and beyond. It&rsquo;s a massive source of uncertainty in the climate system.&rdquo;</p><p><strong>Watch</strong>&nbsp;<a href="https://www.youtube.com/watch?v=f0AWsJ0cmLE" target="_blank">BBC News report on this research</a>.</p><p><strong>External News Coverage:&nbsp;</strong></p><p>BBC News-&nbsp;<a href="https://www.bbc.com/news/science-environment-51097309?ocid=socialflow_twitter">Antarctica melting: Climate change and the journey to the &#39;doomsday glacier&#39;&nbsp;</a></p><p>The Atlantic- <a href="https://www.theatlantic.com/science/archive/2020/01/watch-video-one-worlds-most-important-places/605731/?utm_content=edit-promo&amp;utm_source=twitter&amp;utm_campaign=the-atlantic&amp;utm_medium=social&amp;utm_term=2020-01-30T14%3A00%3A33">The New Video of One of the Scariest Places on Earth</a></p><p>The Washington Post-&nbsp;<a href="https://www.washingtonpost.com/climate-environment/2020/01/30/unprecedented-data-confirm-that-antarcticas-most-dangerous-glacier-is-melting-below/">Unprecedented data confirms that Antarctica&rsquo;s most dangerous glacier is melting from below</a></p><p>BBC Newsround-&nbsp;<a href="https://www.bbc.co.uk/newsround/51268527">Climate change: Scientists concerned about future of Antarctic glacier</a></p><p>Daily Mail Online-&nbsp;<a href="https://www.dailymail.co.uk/sciencetech/article-7938183/Scientists-drilled-Antarcticas-doomsday-Thwaites-glacier.html">Scientists drill into Antarctica&#39;s &#39;doomsday&#39; Thwaites glacier for the first time in a bid to stop dramatic sea level rise as the ice shelf the size of BRITAIN melts at an alarming rate</a></p><p>Yahoo News-&nbsp;<a
href="https://uk.news.yahoo.com/thwaites-glacier-antarctica-185028043.html?guccounter=1&amp;guce_referrer=aHR0cDovL3RyYW5zaXRpb24ubWVsdHdhdGVyLmNvbS9yZWRpcmVjdD91cmw9aHR0cHMlM0ElMkYlMkZ1ay5uZXdzLnlhaG9vLmNvbSUyRnRod2FpdGVzLWdsYWNpZXItYW50YXJjdGljYS0xODUwMjgwNDMuaHRtbCZ0cmFuc2l0aW9uVG9rZW49ZXlKMGVYQWlPaUpLVjFRaUxDSmhiR2NpT2lKSVV6VXhNaUo5LmV5Sm9iM04wYm1GdFpTSTZJblZyTG01bGQzTXVlV0ZvYjI4dVkyOXRJbjAuTkJQT2J3U3VMcFNUNEVUa180ak1yQTI4eUl4QXRiWjJvbUtUS0FhdWk1akJmMFlDbU1nZGZUZGttWHU1UTRWc2lRZXBjWlB5dnRKVWVFeVlpX0dpUVE&amp;guce_referrer_sig=AQAAAIRM-4giOYbmjW1hRxQ4iZ-18X61yqEBJCY4ITCFbBFdWvtWtBSNEfakpuj_hrNCwh3OrXO-FRFuyJabFIBmLQhdjng1A9-dgzaxtFWIJnMz5tZGzEv5kS-aEHKOwZ4vESHlK501McjqvhE70gDBlzsMnwR5R20orgdJK9UMYLqI">Scientists drill into &lsquo;doomsday glacier&rsquo; the size of Britain to see if it&rsquo;s going to collapse</a></p><p>Fox News-&nbsp;<a href="https://www.foxnews.com/science/antarctica-doomsday-glacier-alarming-new-trait">Antarctica&rsquo;s &lsquo;doomsday glacier&rsquo; reveals alarming new trait to scientists</a></p><p>Cosmos Magazine- <a href="https://cosmosmagazine.com/climate/here-s-what-s-below-an-unstable-glacier">Here&#39;s what&#39;s below an unstable glacier</a></p><p>PBS Newshour- <a href="https://www.pbs.org/newshour/show/visiting-the-most-vulnerable-place-on-earth-the-doomsday-glacier">A risky expedition to study the &lsquo;doomsday glacier&rsquo;</a>&nbsp;</p><p>NOVA Next-&nbsp;<a href="https://www.pbs.org/wgbh/nova/article/warm-water-found-beneath-thwaites-glacier-antarctica/">Scientists find warm water beneath Antarctica&rsquo;s most at-risk glacier</a></p><p><strong>More reading:</strong>&nbsp;<a href="https://rh.gatech.edu/news/623053/instability-antarctic-ice-projected-make-sea-level-rise-rapidly" target="_blank">Instability in Antarctic Ice Projected to Make Sea Level Rise Rapidly</a>&nbsp;<strong>and</strong></p><p><a href="https://rh.gatech.edu/news/628264/reframing-antarcticas-meltwater-pond-dangers-ice-shelves-and-sea-level" target="_blank">Reframing Antarctica&rsquo;s Meltwater Pond Dangers to Ice Shelves and Sea Level</a></p><p><em>Any findings, conclusions, or recommendations are those of the authors and not necessarily of the sponsors.</em></p><p><strong>Writer &amp;&nbsp;Media Representative</strong>: Ben Brumfield (404-272-2780), email:&nbsp;<a href="mailto:ben.brumfield@comm.gatech.edu">ben.brumfield@comm.gatech.edu</a></p><p><strong>Georgia Institute of Technology</strong></p>]]></body>  <author>Ben Brumfield</author>  <status>1</status>  <created>1580307226</created>  <gmt_created>2020-01-29 14:13:46</gmt_created>  <changed>1587743986</changed>  <gmt_changed>2020-04-24 15:59:46</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[These are the first-ever images taken at the foundations of the glacier that inspires more fear of sea-level rise than any other - Thwaites Glacier.]]></teaser>  <type>news</type>  <sentence><![CDATA[These are the first-ever images taken at the foundations of the glacier that inspires more fear of sea-level rise than any other - Thwaites Glacier.]]></sentence>  <summary><![CDATA[<p>These are the first-ever images taken at the foundations of the glacier that inspires more fear of sea-level rise than any other - Thwaites Glacier. 
Its&nbsp;grounding line is integral to Thwaites&#39; fate and that of the world&#39;s coastlines, and an underwater vehicle from the Georgia Institute of Technology has made the&nbsp;first-ever visit to it as a part of the historic International Thwaites Glacier Collaboration.</p>]]></summary>  <dateline>2020-01-29T00:00:00-05:00</dateline>  <iso_dateline>2020-01-29T00:00:00-05:00</iso_dateline>  <gmt_dateline>2020-01-29 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>623047</item>          <item>631805</item>          <item>631804</item>          <item>631807</item>          <item>631806</item>          <item>623049</item>          <item>631808</item>      </media>  <hg_media>          <item>          <nid>623047</nid>          <type>image</type>          <title><![CDATA[Thwaites Glacier's outer edge]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ThwaitesGlacier20170530.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ThwaitesGlacier20170530.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ThwaitesGlacier20170530.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ThwaitesGlacier20170530.jpg?itok=fZCaCYI1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1562610337</created>          <gmt_created>2019-07-08 18:25:37</gmt_created>          <changed>1580307455</changed>          <gmt_changed>2020-01-29 14:17:35</gmt_changed>      </item>          <item>          <nid>631805</nid>          <type>image</type>          <title><![CDATA[Britney Schmidt with Icefin after last Thwaites dive]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ddichek-0056.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ddichek-0056.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ddichek-0056.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ddichek-0056.jpg?itok=Yfi7KJw-]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1580304570</created>          <gmt_created>2020-01-29 13:29:30</gmt_created>          <changed>1580304570</changed>          <gmt_changed>2020-01-29 13:29:30</gmt_changed>      </item>          <item>          <nid>631804</nid>          <type>image</type>          <title><![CDATA[Thwaites Glacier grounding line]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Icefin_GZ.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Icefin_GZ.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Icefin_GZ.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Icefin_GZ.jpg?itok=B8-JO78l]]></image_740>            <image_mime>image/jpeg</image_mime>       
     <image_alt><![CDATA[]]></image_alt>                    <created>1580304373</created>          <gmt_created>2020-01-29 13:26:13</gmt_created>          <changed>1580308039</changed>          <gmt_changed>2020-01-29 14:27:19</gmt_changed>      </item>          <item>          <nid>631807</nid>          <type>image</type>          <title><![CDATA[Thwaites Glacier research camp]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ddichek-9579.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ddichek-9579.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ddichek-9579.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ddichek-9579.jpg?itok=_9ssz4N-]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1580305975</created>          <gmt_created>2020-01-29 13:52:55</gmt_created>          <changed>1580305975</changed>          <gmt_changed>2020-01-29 13:52:55</gmt_changed>      </item>          <item>          <nid>631806</nid>          <type>image</type>          <title><![CDATA[Icefin and team on Thwaites]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ddichek-0060.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ddichek-0060.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ddichek-0060.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ddichek-0060.jpg?itok=zCzaKW_C]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1580305341</created>          <gmt_created>2020-01-29 13:42:21</gmt_created>          <changed>1580305341</changed>          <gmt_changed>2020-01-29 13:42:21</gmt_changed>      </item>          <item>          <nid>623049</nid>          <type>image</type>          <title><![CDATA[Glacier grounding line diagram]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Fig-2.-Grounding-line.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Fig-2.-Grounding-line.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Fig-2.-Grounding-line.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Fig-2.-Grounding-line.jpg?itok=FFHzZVJ1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1562610606</created>          <gmt_created>2019-07-08 18:30:06</gmt_created>          <changed>1580308347</changed>          <gmt_changed>2020-01-29 14:32:27</gmt_changed>      </item>          <item>          <nid>631808</nid>          <type>image</type>          <title><![CDATA[Thwaites grounding zone, sediment in the ice]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Icefin_GZ_ice.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Icefin_GZ_ice.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Icefin_GZ_ice.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Icefin_GZ_ice.jpg?itok=Tk6Sozzj]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1580306212</created>          <gmt_created>2020-01-29 13:56:52</gmt_created>          <changed>1580306212</changed>          <gmt_changed>2020-01-29 13:56:52</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="181645"><![CDATA[Thwaites Glacier]]></keyword>          <keyword tid="82391"><![CDATA[Antarctica]]></keyword>          <keyword tid="183751"><![CDATA[grounding line]]></keyword>          <keyword tid="183752"><![CDATA[grounding zone]]></keyword>          <keyword tid="183753"><![CDATA[Instability]]></keyword>          <keyword tid="183754"><![CDATA[autonomous undersea vehicles]]></keyword>          <keyword tid="183755"><![CDATA[Autonomous Underwater Vehicle]]></keyword>          <keyword tid="95691"><![CDATA[auv]]></keyword>          <keyword tid="183756"><![CDATA[Autonomous Underwater Vehicles (Auvs)]]></keyword>          <keyword tid="183757"><![CDATA[Sea-level rise]]></keyword>          <keyword tid="183758"><![CDATA[Sealevel]]></keyword>          <keyword tid="168986"><![CDATA[sea level rise]]></keyword>          <keyword tid="831"><![CDATA[climate change]]></keyword>          <keyword tid="182534"><![CDATA[Global Warming Climate Change]]></keyword>          <keyword tid="182535"><![CDATA[Global Warming Research]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71911"><![CDATA[Earth and Environment]]></topic>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="633018">  <title><![CDATA[Shriners Hospitals for Children and Georgia Tech Announce Research Affiliation ]]></title>  <uid>27303</uid>  <body><![CDATA[<p>You see and want the glass of milk on the table across the room. That&rsquo;s no problem for most of us, who will simply walk to the table, grab the glass, and enjoy the milk. 
Triggering all of that limb movement is a complex set of coordinated neuromuscular commands and actions, which are not so simple for that segment of the population with, say, cerebral palsy or spinal cord injury.</p><p>To help young people struggling with those conditions &ndash; or orthopedic problems like clubfoot, scoliosis, and osteogenesis imperfecta, among other things &ndash; Shriners Hospitals for Children&reg; and the Georgia Institute of Technology have launched an ambitious collaborative research effort that includes the development of devices to facilitate limb movement and function.</p><p>The new research affiliation brings together the clinical, surgical, and scientific expertise of Shriners Hospitals for Children physicians and researchers with Georgia Tech&rsquo;s cutting-edge expertise in biomedical engineering, robotics, and device development. The coordinated effort also will leverage the two organizations&rsquo; proficiency in big data and artificial intelligence tools for personalized medicine, according to Marc Lalande, Ph.D., vice president of research programs for Shriners Hospitals for Children.</p><p>&ldquo;Our joint goals, through genetic and genomic data gathered by Shriners Hospitals for Children, are to improve patient therapeutic responses by optimizing individualized treatment regimens and reducing adverse events,&rdquo; Lalande said.</p><p>Several joint projects already are underway.</p><p><a href="https://bme.gatech.edu/bme/faculty/Jaydev-Desai">Jaydev Desai</a>, professor in the <a href="http://www.bme.gatech.edu">Wallace H. Coulter Department of Biomedical Engineering</a> (BME) at Georgia Tech and Emory University, is working with Scott Kozin, M.D., chief of staff and hand surgeon at Shriners Hospitals for Children-Philadelphia, on a wearable customized robotic exoskeleton with voice recognition for children with cervical spine injury.</p><p>&ldquo;This is a patient-specific system for kids with spinal cord injury,&rdquo; explained Desai, who is director of the <a href="https://medicalrobotics.gatech.edu/">Georgia Center for Medical Robotics</a> and associate director of Georgia Tech&rsquo;s <a href="http://www.robotics.gatech.edu/">Institute for Robotics and Intelligent Machines</a>. &ldquo;The system is designed to translate voice commands into actions, meaning the exoskeleton will conform to the proper shape and posture of the fingers, so to speak, depending on the task. The idea is to enhance the child&rsquo;s ability to perform the activities of daily living.&rdquo;</p><p>Kozin expects his patients with spinal cord injuries will benefit from Georgia Tech&rsquo;s innovative pediatric prosthesis development &ndash; its utility, actuation, and dexterity. &ldquo;Alternative pathways for the recovery of sensation will enhance their function and independence.
We are excited about this new collaboration combining institutions with similar missions and visions devoted to improving the lives of children,&rdquo; said Kozin, who also is collaborating with Georgia Tech&rsquo;s <a href="https://bme.gatech.edu/bme/faculty/Frank-L.-Hammond%20III">Frank Hammond</a> (assistant professor in BME and mechanical engineering) on wearable sensory transfer devices for patients with diminished peripheral sensation or amputations, improving their ability to use intuitively powered prostheses and orthoses.&nbsp;</p><p>Additionally, <a href="http://www.me.gatech.edu/faculty/young">Aaron Young</a>, assistant professor in the <a href="http://www.me.gatech.edu">George W. Woodruff School of Mechanical Engineering</a> at Georgia Tech, is working with David Westberry, M.D., pediatric orthopedic surgeon at Shriners Hospitals for Children-Greenville, on a smart robotic exoskeleton designed to address excessive knee flexion (crouch gait), a condition common in patients with cerebral palsy. The condition can lead to permanent joint deformity if untreated, as well as reduced independence and locomotion capability.</p><p>&ldquo;The device is basically a lightweight, wearable robot designed to assist physical therapists working on pediatric mobility &ndash; the idea is to essentially retrain the child&rsquo;s neuroplasticity,&rdquo; said Young, who is testing the device with Westberry at Shriners Hospitals for Children-Greenville in South Carolina. &ldquo;The exciting thing about Shriners Hospitals for Children-Greenville is that it has an advanced motion analysis center where Shriners&rsquo; physicians and researchers are looking at not just the child&rsquo;s gait, but also at the internal mechanics. It&rsquo;s very rewarding to collaborate with the Shriners team &ndash; they are very quantitative in their approach to treatment.&rdquo;</p><p>That quantitative approach includes the integration of biomedical informatics, data science, and artificial intelligence into the clinical research programs of the Shriners Hospitals for Children network of 14 pediatric motion analysis centers and the healthcare system&rsquo;s newly launched Genomics Institute. As part of this process, researchers are collaborating with <a href="https://www.bme.gatech.edu/bme/faculty/May-Dongmei-Wang">Dongmei Wang</a>, BME professor at Georgia Tech, where she is director of the Biomedical Informatics and Bioimaging Lab.</p><p>&ldquo;This collaboration is extremely important for us because not only have we committed to work on a major national need in youth health, but also because we have been planning to establish a pediatric big data center using advanced IT and AI,&rdquo; said Wang, whose collaborators at Shriners Hospitals for Children include Gerald Harris (Motion Analysis, Shriners Hospitals for Children-Chicago) and Kamran Shazand (Shriners Hospitals for Children Genomics Institute, Tampa, Florida).&nbsp;</p><p>&ldquo;Our lab has piloted multiple pediatric projects,&rdquo; Wang said. &ldquo;But this project represents a quantum leap, taking our work to the next level, in a real-world pediatric care setting. Shriners Hospitals for Children is a perfect fit for us.&rdquo;</p><p>Leanne West, Georgia Tech&rsquo;s chief engineer of <a href="https://ptc.gatech.edu/">pediatric technologies</a>, said she&rsquo;s looking forward to &ldquo;the unique research opportunities this relationship with Shriners Hospitals for Children will provide. 
It will be exciting to see what is possible for us to achieve together.&rdquo;</p><p><strong>About pediatric device research at Georgia Tech</strong><br />Georgia Tech&rsquo;s wide-ranging efforts in pediatric device development bring the institute&rsquo;s engineers and scientists together with clinical experts and researchers to develop innovative technological solutions to problems in the health and care of children. The work provides opportunities for interdisciplinary collaboration in pediatrics, creating breakthrough discoveries and enhancing the lives of children and young adults.</p><p><strong>About Shriners Hospitals for Children&nbsp;</strong><br />Shriners Hospitals for Children is changing lives every day through innovative pediatric specialty care, world-class research, and outstanding medical education. Its healthcare system provides care for children with orthopedic conditions, burns, spinal cord injuries, and cleft lip and palate. All care and services are provided regardless of families&rsquo; ability to pay. Since opening its first location in 1922, the healthcare system has treated more than 1.4 million children. For more information, visit <a href="http://shrinershospitalsforchildren.org">shrinershospitalsforchildren.org</a>.</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (jtoon@gatech.edu).</p><p><strong>Writer</strong>: Jerry Grillo</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1582767846</created>  <gmt_created>2020-02-27 01:44:06</gmt_created>  <changed>1582768027</changed>  <gmt_changed>2020-02-27 01:47:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A new collaborative research effort will help children with cerebral palsy, spinal cord injury and other conditions.]]></teaser>  <type>news</type>  <sentence><![CDATA[A new collaborative research effort will help children with cerebral palsy, spinal cord injury and other conditions.]]></sentence>  <summary><![CDATA[<p>You see and want the glass of milk on the table across the room. That&rsquo;s no problem for most of us, who will simply walk to the table, grab the glass, and enjoy the milk.
Triggering all of that limb movement is a complex set of coordinated neuromuscular commands and actions, which are not so simple for that segment of the population with, say, cerebral palsy or spinal cord injury.</p>]]></summary>  <dateline>2020-02-26T00:00:00-05:00</dateline>  <iso_dateline>2020-02-26T00:00:00-05:00</iso_dateline>  <gmt_dateline>2020-02-26 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>633015</item>          <item>633016</item>          <item>633017</item>      </media>  <hg_media>          <item>          <nid>633015</nid>          <type>image</type>          <title><![CDATA[Pediatric Knee Exoskeleton]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[shriners-004.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/shriners-004.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/shriners-004.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/shriners-004.jpg?itok=ej2soAD7]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Testing pediatric knee exoskeleton]]></image_alt>                    <created>1582766700</created>          <gmt_created>2020-02-27 01:25:00</gmt_created>          <changed>1582766700</changed>          <gmt_changed>2020-02-27 01:25:00</gmt_changed>      </item>          <item>          <nid>633016</nid>          <type>image</type>          <title><![CDATA[Pediatric Knee Exoskeleton2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[shriners-006.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/shriners-006.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/shriners-006.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/shriners-006.jpg?itok=JZDB81Kn]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Pediatric knee exoskeleton]]></image_alt>                    <created>1582766835</created>          <gmt_created>2020-02-27 01:27:15</gmt_created>          <changed>1582766835</changed>          <gmt_changed>2020-02-27 01:27:15</gmt_changed>      </item>          <item>          <nid>633017</nid>          <type>image</type>          <title><![CDATA[Researcher Aaron Young]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[shriners-008.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/shriners-008.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/shriners-008.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/shriners-008.jpg?itok=cpY3v4my]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Researcher Aaron Young with pediatric knee exoskeleton]]></image_alt>                    
<created>1582766961</created>          <gmt_created>2020-02-27 01:29:21</gmt_created>          <changed>1582766961</changed>          <gmt_changed>2020-02-27 01:29:21</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="2585"><![CDATA[pediatric]]></keyword>          <keyword tid="179123"><![CDATA[pediatric technology]]></keyword>          <keyword tid="89521"><![CDATA[Exoskeleton]]></keyword>          <keyword tid="172346"><![CDATA[Pediatric Technology Center]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71891"><![CDATA[Health and Medicine]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="604462">  <title><![CDATA[Robot Designed to Defend Factories Against Cyberthreats]]></title>  <uid>31758</uid>  <body><![CDATA[<p>It&rsquo;s small enough to fit inside a shoebox, yet this robot on four wheels has a big mission: keeping factories and other large facilities safe from hackers.</p><p>Meet the HoneyBot.&nbsp;</p><p>Developed by a team of researchers at the Georgia Institute of Technology, the diminutive device is designed to lure in digital troublemakers who have set their sights on industrial facilities. HoneyBot will then trick the bad actors into giving up valuable information to cybersecurity professionals.</p><p>The decoy robot arrives as more and more devices &ndash; never designed to operate on the Internet &ndash; are coming online in homes and factories alike, opening up a new range of possibilities for hackers looking to wreak havoc in both the digital and physical world.</p><p>&ldquo;Robots do more now than they ever have, and some companies are moving forward with, not just the assembly line robots, but free-standing robots that can actually drive around factory floors,&rdquo; said Raheem Beyah, the Motorola Foundation Professor and interim Steve W. Chaddick School Chair in Georgia Tech&rsquo;s School of Electrical and Computer Engineering. &ldquo;In that type of setting, you can imagine how dangerous this could be if a hacker gains access to those machines. At a minimum, they could cause harm to whatever products are being produced. If it&rsquo;s a large enough robot, it could destroy parts or the assembly line. 
In a worst-case scenario, it could injure or cause death to the humans in the vicinity.&rdquo;</p><p>Internet security professionals long have employed decoy computer systems known as &ldquo;honeypots&rdquo; as a way to throw cyberattackers off the trail. The research team applied the same concept to the HoneyBot, which is partially funded with a grant from the National Science Foundation. Once hackers gain access to the decoy, they leave behind valuable information that can help companies further secure their networks.</p><p>&ldquo;A lot of cyberattacks go unanswered or unpunished because there&rsquo;s this level of anonymity afforded to malicious actors on the internet, and it&rsquo;s hard for companies to say who is responsible,&rdquo; said Celine Irvene, a Georgia Tech graduate student who worked with Beyah to devise the new robot. &ldquo;Honeypots give security professionals the ability to study the attackers, determine what methods they are using, and figure out where they are or potentially even who they are.&rdquo;</p><p>The gadget can be monitored and controlled through the internet. But unlike other remote-controlled robots, the HoneyBot&rsquo;s special ability is tricking its operators into thinking it is performing one task, when in reality it&rsquo;s doing something completely different.</p><p>&ldquo;The idea behind a honeypot is that you don&rsquo;t want the attackers to know they&rsquo;re in a honeypot,&rdquo; Beyah said. &ldquo;If the attacker is smart and is looking out for the potential of a honeypot, maybe they&rsquo;d look at different sensors on the robot, like an accelerometer or speedometer, to verify the robot is doing what it had been instructed. That&rsquo;s where we would be spoofing that information as well. The hacker would see from looking at the sensors that acceleration occurred from point A to point B.&rdquo;</p><p>In a factory setting, such a HoneyBot robot could sit motionless in a corner, springing to life when a hacker gains access &ndash; a visual indicator that a malicious actor is targeting the facility.</p><p>Rather than allowing the hacker to then run amok in the physical world, the robot could be designed to follow certain commands deemed harmless &ndash; such as meandering slowly about or picking up objects &ndash; but stopping short of actually doing anything dangerous.</p><p>So far, their technique seems to be working.</p><p>In experiments designed to test how convincing the false sensor data would be to individuals remotely controlling the device, volunteers in December 2017 used a virtual interface to control the robot through a maze and could not see what was happening in real life. To entice the volunteers to break the rules, the researchers placed forbidden &ldquo;shortcuts&rdquo; at specific spots within the maze that would allow them to finish the maze faster.</p><p>In the real maze back in the lab, no shortcut existed, and if the participants opted to go through it, the robot instead remained still.
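</p><p>The spoofing idea can be sketched in a few lines of Python. The example below is hypothetical (the command names, the &ldquo;safe&rdquo; list, and the telemetry fields are assumptions made for illustration, not the HoneyBot&rsquo;s actual software): harmless commands are executed for real, dangerous ones are only simulated, yet the reported telemetry claims success either way.</p><pre>
# Hypothetical sketch of honeypot sensor spoofing. Not the actual
# HoneyBot software: command names and telemetry fields are invented.
import time

SAFE_COMMANDS = {"idle", "wander", "pick_up"}

class DecoyRobot:
    def __init__(self):
        self.true_position = 0.0      # where the robot really is (m)
        self.reported_position = 0.0  # where the intruder thinks it is

    def handle(self, command: str, distance: float = 0.0) -> dict:
        """Run harmless commands for real; only simulate dangerous ones."""
        if command in SAFE_COMMANDS:
            self.true_position += distance  # actually move
        # Telemetry always claims the move happened, so the intruder
        # cannot distinguish a simulated action from a real one.
        self.reported_position += distance
        return {"odometer": self.reported_position,
                "moving": distance != 0.0,
                "timestamp": time.time()}

bot = DecoyRobot()
bot.handle("ram_conveyor", 2.0)  # dangerous: spoofed, robot stays put
bot.handle("wander", 0.5)        # harmless: really executed
</pre><p>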
Meanwhile, the volunteers &ndash; who have now unwittingly become hackers for the purposes of the experiment &ndash; were fed simulated sensor data indicating they passed through the shortcut and continued along.</p><p>&ldquo;We wanted to make sure they felt that this robot was doing this real thing,&rdquo; Beyah said.</p><p>In surveys after the experiment, participants who actually controlled the device the whole time and those who were being fed simulated data about the fake shortcut both indicated that the data was believable at similar rates.</p><p>&ldquo;This is a good sign because it indicates that we&rsquo;re on the right track,&rdquo; Irvene said.</p><p><em>This material is based upon work supported by the National Science Foundation under Grant No. 1544332. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.</em></p>]]></body>  <author>Josh Brown</author>  <status>1</status>  <created>1522345201</created>  <gmt_created>2018-03-29 17:40:01</gmt_created>  <changed>1578410170</changed>  <gmt_changed>2020-01-07 15:16:10</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[This robot is designed to lure in digital troublemakers who have set their sights on industrial facilities. HoneyBot will then trick the bad actors into giving up valuable information to cybersecurity professionals. ]]></teaser>  <type>news</type>  <sentence><![CDATA[This robot is designed to lure in digital troublemakers who have set their sights on industrial facilities. HoneyBot will then trick the bad actors into giving up valuable information to cybersecurity professionals. ]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2018-03-29T00:00:00-04:00</dateline>  <iso_dateline>2018-03-29T00:00:00-04:00</iso_dateline>  <gmt_dateline>2018-03-29 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[john.toon@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:john.toon@comm.gatech.edu">John Toon</a></p><p>Research News</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>604473</item>          <item>604468</item>      </media>  <hg_media>          <item>          <nid>604473</nid>          <type>image</type>          <title><![CDATA[HoneyBot Robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Screen Shot 2018-03-29 at 2.42.49 PM.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Screen%20Shot%202018-03-29%20at%202.42.49%20PM.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Screen%20Shot%202018-03-29%20at%202.42.49%20PM.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Screen%2520Shot%25202018-03-29%2520at%25202.42.49%2520PM.png?itok=k8nj58Ac]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1522349112</created>          <gmt_created>2018-03-29 18:45:12</gmt_created>          <changed>1522349141</changed>          <gmt_changed>2018-03-29 18:45:41</gmt_changed>      </item>          <item>          <nid>604468</nid>          <type>image</type>          <title><![CDATA[Raheem Beyah and 
Celine Irvene]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Fixer2sm.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Fixer2sm.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Fixer2sm.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Fixer2sm.jpg?itok=-OqRem6l]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1522348166</created>          <gmt_created>2018-03-29 18:29:26</gmt_created>          <changed>1522348166</changed>          <gmt_changed>2018-03-29 18:29:26</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="8862"><![CDATA[Student Research]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="8862"><![CDATA[Student Research]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="1404"><![CDATA[Cybersecurity]]></keyword>          <keyword tid="177568"><![CDATA[cyber physical systems]]></keyword>          <keyword tid="67741"><![CDATA[Raheem Beyah]]></keyword>      </keywords>  <core_research_areas>          <term tid="145171"><![CDATA[Cybersecurity]]></term>          <term tid="39461"><![CDATA[Manufacturing, Trade, and Logistics]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="626381">  <title><![CDATA[Shape-Shifting Robot Built from “Smarticles” Shows New Locomotion Strategy]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Building conventional robots typically requires carefully combining components like motors, batteries, actuators, body segments, legs and wheels. Now, researchers have taken a new approach, building a robot entirely from smaller robots known as &ldquo;smarticles&rdquo; to unlock the principles of a potentially new locomotion technique.</p><p>The 3D-printed smarticles &mdash; short for smart active particles &mdash; can do just one thing: flap their two arms. But when five of these smarticles are confined in a circle, they begin to nudge one another, forming a robophysical system known as a &ldquo;supersmarticle&rdquo; that can move by itself. Adding a light or sound sensor allows the supersmarticle to move in response to the stimulus &mdash; and even be controlled well enough to navigate a maze.</p><p>Though rudimentary now, the notion of making robots from smaller robots &mdash; and taking advantage of the group capabilities that arise by combining individuals &mdash; could provide mechanically based control over very small robots. 
Ultimately, the emergent behavior of the group could provide a new locomotion and control approach for small robots that could potentially change shapes.</p><p>&ldquo;These are very rudimentary robots whose behavior is dominated by mechanics and the laws of physics,&rdquo; said <a href="http://www.physics.gatech.edu/user/daniel-goldman">Dan Goldman</a>, a Dunn Family Professor in the <a href="http://www.physics.gatech.edu">School of Physics</a> at the Georgia Institute of Technology. &ldquo;We are not looking to put sophisticated control, sensing, and computation on them all. As robots become smaller and smaller, we&rsquo;ll have to use mechanics and physics principles to control them because they won&rsquo;t have the level of computation and sensing we would need for conventional control.&rdquo;</p><p>The research, which was supported by the Army Research Office and the National Science Foundation, was reported September 18 in the journal <em>Science Robotics</em>. Researchers from Northwestern University also contributed to the project.</p><p>The foundation for the research came from an unlikely source: a study of construction staples. By pouring these heavy-duty staples into a container with removable sides, former Ph.D. student Nick Gravish &mdash; now a faculty member at the University of California San Diego &mdash; created structures that would stand by themselves after the container&rsquo;s walls were removed.</p><p>Shaking the staple towers eventually caused them to collapse, but the observations led to a realization that simple entangling of mechanical objects could create structures with capabilities well beyond those of the individual components.&nbsp;</p><p>&ldquo;A robot made of other rudimentary robots became the vision,&rdquo; Goldman said. &ldquo;You could imagine making a robot in which you would tweak its geometric parameters a bit and what emerges is qualitatively new behaviors.&rdquo;</p><p>To explore the concept, graduate research assistant Will Savoie used a 3D printer to create battery-powered smarticles, which have motors, simple sensors, and limited computing power. The devices can change their location only when they interact with other devices while enclosed by a ring.</p><p>&ldquo;Even though no individual robot could move on its own, the cloud composed of multiple robots could move as it pushed itself apart and shrink as it pulled itself together,&rdquo; Goldman explained. &ldquo;If you put a ring around the cloud of little robots, they start kicking each other around, and the larger ring &mdash; what we call a supersmarticle &mdash; moves around randomly.&rdquo;</p><p>The researchers noticed that if one small robot stopped moving, perhaps because its battery died, the group of smarticles would begin moving in the direction of that stalled robot. Graduate student Ross Warkentin learned he could control the movement by adding photo sensors to the robots that halt the arm flapping when a strong beam of light hits one of them.</p><p>&ldquo;If you angle the flashlight just right, you can highlight the robot you want to be inactive, and that causes the ring to lurch toward or away from it, even though no robots are programmed to move toward the light,&rdquo; Goldman said. 
&ldquo;That allowed steering of the ensemble in a very rudimentary, stochastic way.&rdquo;</p><p>School of Physics Professor Kurt Wiesenfeld and graduate student Zack Jackson modeled the movement of these smarticles and supersmarticles to understand how the nudges and mass of the ring affected overall movement. Researchers from Northwestern University studied how the interactions between the smarticles provided directional control.</p><p>&ldquo;For many robots, we have electrical current move motors that generate forces on parts that collectively move a robot reliably,&rdquo; said <a href="https://www.mccormick.northwestern.edu/research-faculty/directory/profiles/murphey-todd.html">Todd Murphey</a>, a professor of mechanical engineering who worked with Northwestern graduate students Thomas Berrueta and Ana Pervan. &ldquo;We learned that although individual smarticles interact with each other through a chaos of wiggling impacts that are each unpredictable, the whole robot composed of those smarticles moves predictably and in a way that we can exploit in software.&rdquo;</p><p>In future work, Goldman envisions more complex interactions that utilize the simple sensing and movement capabilities of the smarticles. &ldquo;People have been interested in making a certain kind of swarm robots that are composed of other robots,&rdquo; he said. &ldquo;These structures could be reconfigured on demand to meet specific needs by tweaking their geometry.&rdquo;</p><p>The project is of interest to the U.S. Army because it could lead to new robotic systems capable of changing their shapes, modalities and functions, said Sam Stanton. He is program manager of complex dynamics and systems at the Army Research Office, an element of U.S. Army Combat Capabilities Development Command&rsquo;s Army Research Laboratory.</p><p>&ldquo;Future Army unmanned systems and networks of systems are imagined to be capable of transforming their shape, modality, and function. For example, a robotic swarm may someday be capable of moving to a river and then autonomously forming a structure to span the gap,&rdquo; Stanton said. &ldquo;Dan Goldman&#39;s research is identifying physical principles that may prove essential for engineering emergent behavior in future robot collectives as well as new understanding of fundamental tradeoffs in system performance, responsiveness, uncertainty, resiliency, and adaptivity.&rdquo;</p><p>In addition to those already mentioned, the research also included Georgia Tech graduate student Shengkai Li.</p><p><em>This material is based upon work supported by the Army Research Office under award W911NF-13-1-0347 and by the National Science Foundation under grants PoLS-0957659, PHY-1205878, DMR-1551095.
Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the sponsoring agencies.</em></p><p><strong>CITATION</strong>: William Savoie, et al., &ldquo;A robot made of robots: emergent transport and control of a smarticle ensemble,&rdquo; (Science Robotics 2019).&nbsp;</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Assistance</strong>: John Toon (404-894-6986) (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1568836455</created>  <gmt_created>2019-09-18 19:54:15</gmt_created>  <changed>1568836658</changed>  <gmt_changed>2019-09-18 19:57:38</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have built a robot entirely from smaller robots known as "smarticles."]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have built a robot entirely from smaller robots known as "smarticles."]]></sentence>  <summary><![CDATA[<p>Building conventional robots typically requires carefully combining components like motors, batteries, actuators, body segments, legs and wheels. Now, researchers have taken a new approach, building a robot entirely from smaller robots known as &ldquo;smarticles&rdquo; to unlock the principles of a potentially new locomotion technique.</p>]]></summary>  <dateline>2019-09-18T00:00:00-04:00</dateline>  <iso_dateline>2019-09-18T00:00:00-04:00</iso_dateline>  <gmt_dateline>2019-09-18 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>626375</item>          <item>626376</item>          <item>626377</item>          <item>626378</item>          <item>626379</item>      </media>  <hg_media>          <item>          <nid>626375</nid>          <type>image</type>          <title><![CDATA[Close-up of Smart Active Particle]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[smarticles-003.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/smarticles-003.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/smarticles-003.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/smarticles-003.jpg?itok=aQJcuKjC]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Photo of smart active particle (smarticle)]]></image_alt>                    <created>1568835251</created>          <gmt_created>2019-09-18 19:34:11</gmt_created>          <changed>1568835251</changed>          <gmt_changed>2019-09-18 19:34:11</gmt_changed>      </item>          <item>          <nid>626376</nid>          <type>image</type>          <title><![CDATA[Supersmarticle Based on Five Smarticles]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[smarticles-001.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/smarticles-001.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/smarticles-001.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/smarticles-001.jpg?itok=rGWOJ41q]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Supersmarticle composed of five smarticles]]></image_alt>                    <created>1568835406</created>          <gmt_created>2019-09-18 19:36:46</gmt_created>          <changed>1568835406</changed>          <gmt_changed>2019-09-18 19:36:46</gmt_changed>      </item>          <item>          <nid>626377</nid>          <type>image</type>          <title><![CDATA[Controlling a Supersmarticle]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[smarticles-002.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/smarticles-002.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/smarticles-002.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/smarticles-002.jpg?itok=u9GIrcvl]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Light controlling a supersmarticle]]></image_alt>                    <created>1568835521</created>          <gmt_created>2019-09-18 19:38:41</gmt_created>          <changed>1568835521</changed>          <gmt_changed>2019-09-18 19:38:41</gmt_changed>      </item>          <item>          <nid>626378</nid>          <type>image</type>          <title><![CDATA[Researchers with Smarticles]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[smarticles-004.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/smarticles-004.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/smarticles-004.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/smarticles-004.jpg?itok=YC3oaVwK]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Researchers with smarticles]]></image_alt>                    <created>1568835643</created>          <gmt_created>2019-09-18 19:40:43</gmt_created>          <changed>1568835643</changed>          <gmt_changed>2019-09-18 19:40:43</gmt_changed>      </item>          <item>          <nid>626379</nid>          <type>image</type>          <title><![CDATA[Smarticle student researchers]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[murphey_smarticle.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/murphey_smarticle.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/murphey_smarticle.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/murphey_smarticle.jpg?itok=z0vEGfSh]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Students from Northwestern University]]></image_alt>                    <created>1568835760</created>          
<gmt_created>2019-09-18 19:42:40</gmt_created>          <changed>1568835760</changed>          <gmt_changed>2019-09-18 19:42:40</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="377"><![CDATA[locomotion]]></keyword>          <keyword tid="182389"><![CDATA[smarticle]]></keyword>          <keyword tid="182390"><![CDATA[supersmarticle]]></keyword>          <keyword tid="181004"><![CDATA[emergent behavior]]></keyword>      </keywords>  <core_research_areas>          <term tid="39451"><![CDATA[Electronics and Nanotechnology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="625040">  <title><![CDATA[Scurrying Roaches Help Researchers Steady Staggering Robots]]></title>  <uid>31759</uid>  <body><![CDATA[<p>Ew, a cockroach! But it zips off before the swatter appears. Now, researchers have leveraged the bug&rsquo;s superb scurrying skills to create a cleverly simple method to assess and improve locomotion in robots.</p><p>Normally, tedious modeling of mechanics, electronics, and information science is required to understand how insects&rsquo; or robots&rsquo; moving parts coordinate smoothly to take them places. But&nbsp;<a href="https://www.nature.com/articles/s41467-019-11613-y" rel="noopener noreferrer" target="_blank">in a new study</a>, biomechanics researchers at the Georgia Institute of Technology boiled down the sprints of cockroaches to handy principles and equations they then used to make a test robot amble about better.</p><p>The method told the researchers about how each leg operates on its own, how they all come together as a whole, and the harmony or lack thereof in how they do it. Despite bugs&rsquo; and bots&rsquo; utterly divergent motion dynamics, the new method worked for both and should work for other robots and animals, too.</p><p>The biological robot, the roach, was the far superior runner with neurological signals guiding six impeccably evolved legs. The mechanical robot, a consumer model, had four stubby legs and no nervous system but relied instead for locomotion control on coarse physical forces traveling through its chassis as crude signals to roughly coordinate its clunky gait.</p><p>&ldquo;The robot was much bulkier and could hardly sense its environment. The cockroach had many senses and can adapt better to rough terrain. 
Bumps as high as its hips wouldn&rsquo;t slow it down at all,&rdquo; said&nbsp;<a href="https://www.linkedin.com/in/izaak-neveln-776315b2/" rel="noopener noreferrer" target="_blank">Izaak Neveln</a>, the study&rsquo;s first author, who was a postdoctoral researcher in the&nbsp;<a href="https://sponberg.gatech.edu/" rel="noopener noreferrer" target="_blank">lab of Simon Sponberg at Georgia Tech</a> during the study.</p><h4><strong>Advanced simplicity</strong></h4><p>The method, or &ldquo;measure,&rdquo; as the study calls it, transcended these huge differences, which pervade animal-inspired robotics.</p><p>&ldquo;The measure is general (universal) in the sense that it can be used regardless of whether the signals are neural spiking patterns,&nbsp;<a href="https://www.lexico.com/en/definition/kinematics" rel="noopener noreferrer" target="_blank">kinematics</a>, voltages or forces and does not depend on the particular relationship between the signals,&rdquo; the study&rsquo;s authors wrote.</p><p>No matter how a bug or a bot functions, the measure&rsquo;s mathematical inputs and outputs are always in the same units. The measure will not always eliminate the need for modeling, but it stands to shorten and guide modeling and avert anguishing missteps.</p><p>The authors&nbsp;<a href="https://www.nature.com/articles/s41467-019-11613-y" rel="noopener noreferrer" target="_blank">published the study in the journal&nbsp;<em>Nature Communications</em></a>&nbsp;in August 2019. The research was funded by the National Science Foundation.&nbsp;<a href="https://sponberg.gatech.edu/" rel="noopener noreferrer" target="_blank">Sponberg is an assistant professor</a>&nbsp;in Georgia Tech&rsquo;s School of Physics and in the School of Biological Sciences.</p><h4><strong>Centralization vs. decentralization</strong></h4><p>Often a bot or an animal sends many walking signals through a central system to harmonize locomotion, but not all signals are centralized. Even in humans, though locomotion strongly depends on signals from the central nervous system, some neural signals are confined to regions of the body; they are localized signals.</p><p>Some insects appear to move with little centralization -- such as stick bugs, also known as walking sticks, whose legs prod about nearly independently. Stick bugs are&nbsp;<a href="https://www.youtube.com/watch?v=YxtWFd1yVoc" rel="noopener noreferrer" target="_blank">wonky runners</a>.</p><p>&ldquo;The idea has been that the stick bugs have the more localized control of motion, whereas a&nbsp;<a href="https://youtu.be/1ro6PNqkHEM?t=18" rel="noopener noreferrer" target="_blank">cockroach goes very fast</a>&nbsp;and needs to maintain stability, and its motion control is probably more centralized, more clocklike,&rdquo; Neveln said.</p><p>Strong centralization of signals generally coordinates locomotion better.
Centralized signals&nbsp;could be code traveling through an elaborate robot&rsquo;s wiring, a cockroach&rsquo;s central neurons synching its legs, or the clunky robot&#39;s chassis tilting away from a leg thumping the ground, thus putting weight onto an opposing leg.</p><p>Roboticists need to see through the differences and figure out the interplay of a locomotor&rsquo;s local and central signals.</p><p><sup><strong><em>[Ready for graduate school?&nbsp;<a href="http://www.gradadmiss.gatech.edu/apply-now" target="_blank">Here&#39;s how to apply to Georgia Tech.</a>]&nbsp;</em></strong></sup></p><h4><strong>Cool physics</strong></h4><p>The new &ldquo;measure&rdquo; does this by focusing on an overarching phenomenon in the walking legs, which&nbsp;<a href="https://images.app.goo.gl/BYTfJYwuZw53yJsT7" rel="noopener noreferrer" target="_blank">can be seen as pendula</a>&nbsp;moving back and forth. For great locomotion, they need to synch up in what are called phase-coupled oscillations.</p><p>A fun, easy experiment illustrates this physics principle. If a few, say six, metronomes -- ticking rhythm pendula that piano teachers use -- are swinging out of sync and you place them all on a platform that freely sways along with the metronomes&rsquo; swings,&nbsp;<a href="https://youtu.be/Aaxw4zbULMs?t=5" rel="noopener noreferrer" target="_blank">the swings will sync up in unison</a>.</p><p>The&nbsp;<a href="https://en.wikipedia.org/wiki/Phase_portrait#/media/File:Pendulum_phase_portrait_illustration.svg" rel="noopener noreferrer" target="_blank">phases, or directions, of their oscillations</a>&nbsp;are coupling with each other by centralizing their composite mechanical impulses through the platform. This particular example of phase-coupling is mechanical, but it can also be computational or neurological -- like in the roach.</p><p>Its legs would be analogous to the swinging metronomes, and central neuromuscular activity analogous to the free-swaying platform. In the roach, not all six legs swing in the same direction.</p><p>&ldquo;Their synchronization is not uniform. Three legs are synchronized in phase with each other -- the front and back legs of one side with the middle leg of the other side -- and those three are synchronized out of phase with the other three,&rdquo; Neveln said. &ldquo;It&rsquo;s an alternating tripod gait. One tripod of three legs alternates with the other tripod of three legs.&rdquo;</p><h4><strong>Useless pogoing</strong></h4><p>And just like pendula, each leg&rsquo;s swings can be graphed as a wave. All the legs&rsquo; waves can be averaged into an overall roach scurry wave and then developed into more useful math that relates centralization with decentralization and factors like&nbsp;<a href="https://www.lexico.com/en/definition/entropy" rel="noopener noreferrer" target="_blank">entropy</a>&nbsp;that can throw locomotion control off.</p><p>The resulting principles and math benefited the clunky robot, which has strong decentralized signals in its leg motors that react to leg contact with the ground, and centralized control weaker than that of the stick bug. The researchers graphed out the robot&#39;s&nbsp;movements, too, but those didn&#39;t result in the neatly synced group of waves that the cockroach had produced.</p><p>The researchers then applied the principles and math to the clunky robot, which initially was out of sorts -- bucking or hopping uselessly like a pogo stick.
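</p><p><em>The metronome demonstration above corresponds to a textbook synchronization system, the Kuramoto model (one of this story&rsquo;s keywords). The short Python sketch below illustrates that phase-coupling effect only; it is not the information-based centralization measure the researchers developed, and its frequencies and coupling strength are invented for the example.</em></p><pre>
import math
import random

def kuramoto_sync(n=6, coupling=1.5, steps=4000, dt=0.01, seed=1):
    """Toy Kuramoto model of the metronome experiment: the mean-field
    coupling term plays the role of the freely swaying platform that
    centralizes the metronomes' composite mechanical impulses."""
    rng = random.Random(seed)
    freqs = [1.0 + 0.1 * rng.gauss(0, 1) for _ in range(n)]   # natural tick rates
    phases = [rng.uniform(0, 2 * math.pi) for _ in range(n)]  # start out of sync
    for _ in range(steps):
        # Euler step: each phase is pulled toward the others through the "platform."
        phases = [
            th + dt * (w + coupling * sum(math.sin(o - th) for o in phases) / n)
            for th, w in zip(phases, freqs)
        ]
    # Order parameter r: near 0 means incoherent, near 1 means swinging in unison.
    return abs(sum(complex(math.cos(th), math.sin(th)) for th in phases)) / n

print(f"no platform coupling: r = {kuramoto_sync(coupling=0.0):.2f}")
print(f"platform coupling on: r = {kuramoto_sync(coupling=1.5):.2f}")
</pre><p>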
Then the scientists strengthened centralized control by re-weighting its chassis to make it move more coherently.</p><p>&ldquo;The metronomes on the platform are mechanical coupling, and our robot coordinates control that way,&rdquo; Neveln said. &ldquo;You can change the mechanical coupling of the robot by repositioning its weights. We were able to predict the changes this would make by using the measure we developed from the cockroach.&rdquo;</p><h4><strong>Cockroach surprises</strong></h4><p>The researchers also wired up specific roach muscles and neurons to observe their syncopations with the scurry waves. Seventeen cockroaches took 2,982 strides to inform the principles and math, and the bugs also sprung surprises on the researchers.</p><p>One stuck out: The scientists had thought signaling centralized more when the roach sped up, but instead, both central and local signaling strengthened, perhaps doubling down on the message: Run!</p><p><em>Georgia Tech&rsquo;s Amoolya Tiramulai coauthored the paper. The National Science Foundation funded the research (grant # NSF CAREER MPS/PoLS 1554790). Any findings, conclusions, and recommendations are those of the authors and not necessarily of the NSF.</em></p><p><strong>Writer &amp;&nbsp;Media Representative</strong>: Ben Brumfield (404-660-1408), email:&nbsp;<a href="mailto:ben.brumfield@comm.gatech.edu">ben.brumfield@comm.gatech.edu</a></p><p><strong>Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia &nbsp;30332-0181 &nbsp;USA</strong></p>]]></body>  <author>Ben Brumfield</author>  <status>1</status>  <created>1566506845</created>  <gmt_created>2019-08-22 20:47:25</gmt_created>  <changed>1566831962</changed>  <gmt_changed>2019-08-26 15:06:02</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have leveraged cockroaches' scurrying skills for a cleverly simple method to assess and improve locomotion in robots.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have leveraged cockroaches' scurrying skills for a cleverly simple method to assess and improve locomotion in robots.]]></sentence>  <summary><![CDATA[<p>To walk or run with finesse, roaches and&nbsp;robots coordinate leg movements via signals sent through centralized systems -- but&nbsp;utterly divergent ones. 
Despite their seemingly unbridgeable differences,&nbsp;researchers have devised handy principles and equations from studying roaches&nbsp;to assess how both beasts and bots locomote and to improve robotic gait.</p>]]></summary>  <dateline>2019-08-22T00:00:00-04:00</dateline>  <iso_dateline>2019-08-22T00:00:00-04:00</iso_dateline>  <gmt_dateline>2019-08-22 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>625034</item>          <item>625031</item>          <item>625035</item>      </media>  <hg_media>          <item>          <nid>625034</nid>          <type>image</type>          <title><![CDATA[Off-the-shelf robot with four legs]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Minotaur2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Minotaur2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Minotaur2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Minotaur2.jpg?itok=0mEyOe6D]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1566505226</created>          <gmt_created>2019-08-22 20:20:26</gmt_created>          <changed>1566831950</changed>          <gmt_changed>2019-08-26 15:05:50</gmt_changed>      </item>          <item>          <nid>625031</nid>          <type>image</type>          <title><![CDATA[The swings of cockroach legs as rough sine waves]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Screen Shot 2019-08-09 at 16.07.27.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Screen%20Shot%202019-08-09%20at%2016.07.27.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Screen%20Shot%202019-08-09%20at%2016.07.27.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Screen%2520Shot%25202019-08-09%2520at%252016.07.27.png?itok=_k5AiYxW]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1566505031</created>          <gmt_created>2019-08-22 20:17:11</gmt_created>          <changed>1566505031</changed>          <gmt_changed>2019-08-22 20:17:11</gmt_changed>      </item>          <item>          <nid>625035</nid>          <type>image</type>          <title><![CDATA[Cockroach Blaberus discoidalis]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Naturkundliche_Sammlung_Übermaxx_Überseemuseum_Bremen_0036.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Naturkundliche_Sammlung_U%CC%88bermaxx_U%CC%88berseemuseum_Bremen_0036.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Naturkundliche_Sammlung_U%CC%88bermaxx_U%CC%88berseemuseum_Bremen_0036.jpeg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Naturkundliche_Sammlung_U%25CC%2588bermaxx_U%25CC%2588berseemuseum_Bremen_0036.jpeg?itok=AMNt03Zl]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1566506115</created>          <gmt_created>2019-08-22 20:35:15</gmt_created>          <changed>1566506115</changed>          <gmt_changed>2019-08-22 20:35:15</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="182104"><![CDATA[Roach]]></keyword>          <keyword tid="182105"><![CDATA[Cockroach]]></keyword>          <keyword tid="182106"><![CDATA[phase-coupled oscillations]]></keyword>          <keyword tid="182107"><![CDATA[GAIT]]></keyword>          <keyword tid="7719"><![CDATA[walk]]></keyword>          <keyword tid="4285"><![CDATA[running]]></keyword>          <keyword tid="182108"><![CDATA[running ability]]></keyword>          <keyword tid="182109"><![CDATA[walking ability]]></keyword>          <keyword tid="182110"><![CDATA[Coupled-oscillator network]]></keyword>          <keyword tid="377"><![CDATA[locomotion]]></keyword>          <keyword tid="182111"><![CDATA[locomotor]]></keyword>          <keyword tid="182112"><![CDATA[locomotor instability]]></keyword>          <keyword tid="7738"><![CDATA[central nervous system]]></keyword>          <keyword tid="182113"><![CDATA[Information-based centralization]]></keyword>          <keyword tid="182114"><![CDATA[global control]]></keyword>          <keyword tid="182115"><![CDATA[local control]]></keyword>          <keyword tid="182116"><![CDATA[global signal]]></keyword>          <keyword tid="182117"><![CDATA[local signal]]></keyword>          <keyword tid="7121"><![CDATA[kinematics]]></keyword>          <keyword tid="2552"><![CDATA[robotic]]></keyword>          <keyword tid="182118"><![CDATA[centralization-decentralization axis]]></keyword>          <keyword tid="182119"><![CDATA[Kuramoto]]></keyword>          <keyword tid="182120"><![CDATA[mechanosensory feedback]]></keyword>          <keyword tid="181092"><![CDATA[Inertia]]></keyword>          <keyword tid="171924"><![CDATA[entropy]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  
<userdata><![CDATA[]]></userdata></node><node id="623759">  <title><![CDATA[Hackers Could Use Connected Cars to Gridlock Whole Cities]]></title>  <uid>31759</uid>  <body><![CDATA[<p>In the year 2026, at rush hour, your self-driving car abruptly shuts down right where it blocks traffic. You climb out to see gridlock down every street in view, then a news alert on your watch tells you that hackers have paralyzed all Manhattan traffic by randomly stranding internet-connected cars.</p><p>Flashback to July 2019, the dawn of autonomous vehicles and other connected cars, and physicists at the Georgia Institute of Technology and Multiscale Systems, Inc. have applied physics <a href="https://journals.aps.org/pre/abstract/10.1103/PhysRevE.100.012316" target="_blank"><strong>in a new study</strong></a> to simulate what it would take for future hackers to wreak exactly this widespread havoc by randomly stranding these cars. The researchers want to expand the current discussion on automotive cybersecurity, which mainly focuses on hacks that could <a href="https://money.cnn.com/technology/our-driverless-future/keep-hackers-out-of-your-driverless-car/" target="_blank">crash one car</a> or run over one pedestrian, to include potential mass mayhem.</p><p>They warn that even with increasingly tight cyber defenses, the amount of data breached has soared in the past four years, and that as more objects become hackable, the rising cyber threat turns into a potential physical menace.</p><p>&ldquo;Unlike most of the data breaches we hear about, hacked cars have physical consequences,&rdquo; said Peter Yunker, who co-led the study and is an&nbsp;<a href="https://www.physics.gatech.edu/user/peter-yunker" rel="noopener noreferrer" target="_blank">assistant professor in Georgia Tech&rsquo;s School of Physics</a>.</p><p>It may not be that hard for state, terroristic, or mischievous actors to commandeer parts of the internet of things, <a href="https://www.spectator.co.uk/2018/07/the-dream-of-driverless-cars-is-dying/" target="_blank">including cars</a>.</p><p>&ldquo;With cars, one of the worrying things is that currently there is effectively one central computing system, and a lot runs through it. You don&rsquo;t necessarily have separate systems to run your car and run your satellite radio. If you can get into one, you may be able to get into the other,&rdquo; said Jesse Silverberg of Multiscale Systems, Inc., who co-led the study with Yunker.</p><h4><strong>Freezing traffic solid</strong></h4><p>In simulations of hacking internet-connected cars, the researchers froze traffic in Manhattan nearly solid, and it would not even take that much to wreak havoc. Here are their results, and the numbers are conservative for reasons mentioned below.</p><p>&ldquo;Randomly stalling 20 percent of cars during rush hour would mean total traffic freeze. At 20 percent, the city has been broken up into small islands, where you may be able to inch around a few blocks, but no one would be able to move across town,&rdquo; said David Yanni, a graduate research assistant in Yunker&rsquo;s lab.</p><p>Not all cars on the road would have to be connected, just enough for hackers to stall 20 percent of all cars on the road. For example, if 40 percent of all cars on the road were connected, hacking half would suffice.</p><p>Hacking 10 percent of all cars at rush hour would debilitate traffic enough to prevent emergency vehicles from expediently cutting through traffic that is inching along citywide.
The same thing would happen with a 20 percent hack during intermediate daytime traffic.</p><p>The researchers&rsquo; results appear <a href="https://journals.aps.org/pre/abstract/10.1103/PhysRevE.100.012316" target="_blank">in the journal&nbsp;<em>Physical Review E</em>&nbsp;on July 20, 2019</a>. The study is not embargoed.</p><p><sup><strong><em>[Ready for graduate school?&nbsp;<a href="http://www.gradadmiss.gatech.edu/apply-now" target="_blank">Here&#39;s how to apply to Georgia Tech.</a>]&nbsp;</em></strong></sup></p><h4><strong>It could take less</strong></h4><p>For the city to be safe, hacking damage would have to be below that. In other cities, things could be worse.</p><p>&ldquo;Manhattan has a nice grid, and that makes traffic more efficient. Looking at cities without large grids like Atlanta, Boston, or Los Angeles, and we think hackers could do worse harm because a grid makes you more robust with redundancies to get to the same places down many different routes,&rdquo; Yunker said.</p><p>The researchers left out factors that would likely worsen hacking damage; thus, a real-world hack may require stalling even fewer cars to shut down Manhattan.</p><p>&ldquo;I want to emphasize that we only considered static situations &ndash; if roads are blocked or not blocked. In many cases, blocked roads spill over traffic into other roads, which we also did not include. If we were to factor in these other things, the number of cars you&rsquo;d have to stall would likely drop down significantly,&rdquo; Yunker said.</p><p>The researchers also did not factor in ensuing public panic or car occupants becoming pedestrians who would further block streets or cause accidents. Nor did they consider hacks that would target cars at locations that maximize trouble.</p><p>They also stress that they are not cybersecurity experts, nor are they saying anything about the likelihood of someone carrying out such a hack. They simply want to give security experts a calculable idea of the scale of a hack that would shut a city down.</p><p>The researchers do have some general ideas of how to reduce the potential damage.</p><p>&ldquo;Split up the digital network influencing the cars to make it impossible to access too many cars through one network,&rdquo; said lead author Skanda Vivek, a postdoctoral researcher in Yunker&rsquo;s lab. &ldquo;If you could also make sure that cars next to each other can&rsquo;t be hacked at the same time that would decrease the risk of them blocking off traffic together.&rdquo;</p><h4><strong>Traffic jams as physics</strong></h4><p>Yunker works in soft matter physics, which looks at how constituent parts &ndash; in this case, connected cars &ndash; act as one whole physical phenomenon. The research team analyzed the movements of cars on streets with varying numbers of lanes, including how they get around stalled vehicles, and found they could apply a physics approach to what they observed.</p><p>&ldquo;Whether traffic is halted or not can be explained by classic percolation theory used in many different fields of physics and mathematics,&rdquo; Yunker said.</p><p><a href="https://en.wikipedia.org/wiki/Percolation_theory" rel="noopener noreferrer" target="_blank">Percolation theory</a>&nbsp;is often used in materials science to determine if a desirable quality like a specific rigidity will spread throughout a material to make the final product uniformly stable.
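</p><p><em>To make the percolation analogy concrete, here is a minimal bond-percolation sketch in Python. It is an illustration only, not the study&rsquo;s model: it assumes an idealized square grid built with the networkx library rather than the real Manhattan street data and multi-lane behavior the researchers used, so it will not reproduce their 20 percent figure.</em></p><pre>
import random
import networkx as nx

def cross_town_connected(n=20, stall_fraction=0.2, seed=0):
    """Toy bond-percolation check on an n-by-n street grid. Each edge is
    a street segment; removing an edge models hacked, stalled cars
    blocking that segment entirely. Returns True if opposite corners of
    the grid are still connected by some route."""
    rng = random.Random(seed)
    G = nx.grid_2d_graph(n, n)
    edges = list(G.edges())
    blocked = rng.sample(edges, int(stall_fraction * len(edges)))
    G.remove_edges_from(blocked)
    return nx.has_path(G, (0, 0), (n - 1, n - 1))

# Sweep the stalled fraction: connectivity collapses near a threshold,
# the hallmark of percolation.
for p in (0.1, 0.3, 0.5, 0.7):
    ok = sum(cross_town_connected(stall_fraction=p, seed=s) for s in range(50))
    print(f"{p:.0%} of segments stalled: cross-town route survives in {ok}/50 trials")
</pre><p>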
In the traffic case, stalled cars spread to make formerly flowing streets rigid and stuck.</p><p>The shut streets would be only those in which hacked cars have cut off all lanes or become hindrances that other cars can&rsquo;t maneuver around; streets where hacked cars still allow traffic to flow would not count.</p><p>The researchers chose Manhattan for their simulations because a lot of data was available on that city&rsquo;s traffic patterns.</p><p><strong>Also READ: <a href="http://www.rh.gatech.edu/features/connected-new-world" target="_blank">Georgia Tech&#39;s cybersecurity researchers tackle the&nbsp;internet of things&nbsp;</a></strong></p><p><em>The study was coauthored by Skanda Vivek and David Yanni of Georgia Tech and Jesse Silverberg of Multiscale Systems, Inc. Any findings, conclusions, and recommendations are those of the authors.</em></p><p><strong>Writer &amp;&nbsp;Media Representative</strong>: Ben Brumfield (404-660-1408), email:&nbsp;<a href="mailto:ben.brumfield@comm.gatech.edu">ben.brumfield@comm.gatech.edu</a></p><p><strong>Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia &nbsp;30332-0181 &nbsp;USA</strong></p>]]></body>  <author>Ben Brumfield</author>  <status>1</status>  <created>1564413609</created>  <gmt_created>2019-07-29 15:20:09</gmt_created>  <changed>1564678483</changed>  <gmt_changed>2019-08-01 16:54:43</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Hackers could gridlock whole cities by stalling out a limited percentage of self-driving and other connected vehicles.]]></teaser>  <type>news</type>  <sentence><![CDATA[Hackers could gridlock whole cities by stalling out a limited percentage of self-driving and other connected vehicles.]]></sentence>  <summary><![CDATA[<p>In a future where&nbsp;self-driving and other internet-connected cars share the roads with the rest of us, hackers could not only wreck the occasional vehicle but possibly compound attacks to gridlock whole cities by stalling out a limited percentage of connected cars.
Physicists calculated how many stalled cars would cause how much mayhem.</p>]]></summary>  <dateline>2019-07-29T00:00:00-04:00</dateline>  <iso_dateline>2019-07-29T00:00:00-04:00</iso_dateline>  <gmt_dateline>2019-07-29 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>623747</item>          <item>623752</item>          <item>623754</item>          <item>623760</item>          <item>623757</item>          <item>623758</item>      </media>  <hg_media>          <item>          <nid>623747</nid>          <type>image</type>          <title><![CDATA[Manhattan gridlock]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[New_York_City_Gridlock.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/New_York_City_Gridlock.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/New_York_City_Gridlock.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/New_York_City_Gridlock.jpg?itok=M_EL8Uhl]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1564409967</created>          <gmt_created>2019-07-29 14:19:27</gmt_created>          <changed>1564409967</changed>          <gmt_changed>2019-07-29 14:19:27</gmt_changed>      </item>          <item>          <nid>623752</nid>          <type>image</type>          <title><![CDATA[Gridlock Manhattan]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[New_York_City_Gridlock.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/New_York_City_Gridlock_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/New_York_City_Gridlock_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/New_York_City_Gridlock_0.jpg?itok=NM38RLoT]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1564410856</created>          <gmt_created>2019-07-29 14:34:16</gmt_created>          <changed>1564410856</changed>          <gmt_changed>2019-07-29 14:34:16</gmt_changed>      </item>          <item>          <nid>623754</nid>          <type>image</type>          <title><![CDATA[Stranded connected cars block traffic]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[blocking.scenario.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/blocking.scenario.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/blocking.scenario.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/blocking.scenario.jpg?itok=_3j9Gc3p]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1564411039</created>          <gmt_created>2019-07-29 14:37:19</gmt_created>          <changed>1564411039</changed>          
<gmt_changed>2019-07-29 14:37:19</gmt_changed>      </item>          <item>          <nid>623760</nid>          <type>image</type>          <title><![CDATA[Hacked Manhattan grid maps]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Manhattan.hacked.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Manhattan.hacked.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Manhattan.hacked.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Manhattan.hacked.jpg?itok=YucWlvOQ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1564414826</created>          <gmt_created>2019-07-29 15:40:26</gmt_created>          <changed>1564414826</changed>          <gmt_changed>2019-07-29 15:40:26</gmt_changed>      </item>          <item>          <nid>623757</nid>          <type>image</type>          <title><![CDATA[Gridlock math]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[selfdriving.equation.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/selfdriving.equation.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/selfdriving.equation.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/selfdriving.equation.png?itok=lnXkEajL]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1564412526</created>          <gmt_created>2019-07-29 15:02:06</gmt_created>          <changed>1564412526</changed>          <gmt_changed>2019-07-29 15:02:06</gmt_changed>      </item>          <item>          <nid>623758</nid>          <type>image</type>          <title><![CDATA[Peter Yunker looking at territorial cholera strains]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Yunker.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Yunker.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Yunker.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Yunker.jpg?itok=nJGKLLqU]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1564412886</created>          <gmt_created>2019-07-29 15:08:06</gmt_created>          <changed>1564412886</changed>          <gmt_changed>2019-07-29 15:08:06</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>          <group id="126011"><![CDATA[School of Physics]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="142"><![CDATA[City Planning, Transportation, and Urban Growth]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category 
tid="145"><![CDATA[Engineering]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="151"><![CDATA[Policy, Social Sciences, and Liberal Arts]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="142"><![CDATA[City Planning, Transportation, and Urban Growth]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="151"><![CDATA[Policy, Social Sciences, and Liberal Arts]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="171930"><![CDATA[self-driving]]></keyword>          <keyword tid="169008"><![CDATA[self-driving cars]]></keyword>          <keyword tid="181813"><![CDATA[self-driving car]]></keyword>          <keyword tid="181814"><![CDATA[self-driving simulation]]></keyword>          <keyword tid="98601"><![CDATA[hacking]]></keyword>          <keyword tid="181815"><![CDATA[Hackers]]></keyword>          <keyword tid="181816"><![CDATA[Percolation]]></keyword>          <keyword tid="181817"><![CDATA[percolation threshhold]]></keyword>          <keyword tid="167045"><![CDATA[simulation]]></keyword>          <keyword tid="181818"><![CDATA[cybersceurity]]></keyword>          <keyword tid="2200"><![CDATA[Cyber Attack]]></keyword>          <keyword tid="10840"><![CDATA[cyber attacks]]></keyword>          <keyword tid="181819"><![CDATA[cyber breaches]]></keyword>          <keyword tid="181820"><![CDATA[cyber campaigns]]></keyword>          <keyword tid="960"><![CDATA[physics]]></keyword>          <keyword tid="167858"><![CDATA[soft matter]]></keyword>          <keyword tid="181821"><![CDATA[soft matter physics]]></keyword>      </keywords>  <core_research_areas>          <term tid="145171"><![CDATA[Cybersecurity]]></term>          <term tid="39531"><![CDATA[Energy and Sustainable Infrastructure]]></term>          <term tid="39471"><![CDATA[Materials]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="623453">  <title><![CDATA[Tiny Vibration-Powered Robots Are the Size of the World’s Smallest Ant]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Researchers have created a new type of tiny 3D-printed robot that moves by harnessing vibration from piezoelectric actuators, ultrasound sources or even tiny speakers. Swarms of these &ldquo;micro-bristle-bots&rdquo; might work together to sense environmental changes, move materials &ndash; or perhaps one day repair injuries inside the human body.</p><p>The prototype robots respond to different vibration frequencies depending on their configurations, allowing researchers to control individual bots by adjusting the vibration. 
Approximately two millimeters long &ndash; about the size of the world&rsquo;s smallest ant &ndash; the bots can cover four times their own length in a second despite the physical limitations of their small size.</p><p>&ldquo;We are working to make the technology robust, and we have a lot of potential applications in mind,&rdquo; said <a href="https://www.ece.gatech.edu/faculty-staff-directory/azadeh-ansari">Azadeh Ansari</a>, an assistant professor in the <a href="http://www.ece.gatech.edu">School of Electrical and Computer Engineering</a> at the Georgia Institute of Technology. &ldquo;We are working at the intersection of mechanics, electronics, biology and physics. It&rsquo;s a very rich area and there&rsquo;s a lot of room for multidisciplinary concepts.&rdquo;</p><p>A paper describing the micro-bristle-bots has been accepted for publication in the <em>Journal of Micromechanics and Microengineering</em>. The research was supported by a seed grant from Georgia Tech&rsquo;s Institute for Electronics and Nanotechnology. In addition to Ansari, the research team includes George W. Woodruff School of Mechanical Engineering Associate Professor Jun Ueda and graduate students DeaGyu Kim and Zhijian (Chris) Hao.</p><p>The micro-bristle-bots consist of a piezoelectric actuator glued onto a polymer body that is 3D-printed using two-photon polymerization lithography (TPP). The actuator generates vibration and is powered externally because no batteries are small enough to fit onto the bot. The vibrations can also come from a piezoelectric shaker beneath the surface on which the robots move, from an ultrasound/sonar source, or even from a tiny acoustic speaker.</p><p>The vibrations move the springy legs up and down, propelling the micro-bot forward. Each robot can be designed to respond to different vibration frequencies depending on leg size, diameter, design and overall geometry. The amplitude of the vibrations controls the speed at which the micro-bots move.&nbsp;</p><p>&ldquo;As the micro-bristle-bots move up and down, the vertical motion is translated into a directional movement by optimizing the design of the legs, which look like bristles,&rdquo; explained Ansari. &ldquo;The legs of the micro-robot are designed with specific angles that allow them to bend and move in one direction in resonant response to the vibration.&rdquo;</p><p>The micro-bristle-bots are made in a 3D printer using the TPP process, a technique that polymerizes a monomer resin material. Once the portion of the resin block struck by the ultraviolet light has been chemically developed, the remainder can be washed away, leaving the desired robotic structure.</p><p>&ldquo;It&rsquo;s writing rather than traditional lithography,&rdquo; Ansari explained. &ldquo;You are left with the structure that you write with a laser on the resin material. The process now takes quite a while, so we are looking at ways to scale it up to make hundreds or thousands of micro-bots at a time.&rdquo;</p><p>Some of the robots have four legs, while others have six. First author DeaGyu Kim made hundreds of the tiny structures to determine the ideal configuration.</p><p>The piezoelectric actuators, which use the material lead zirconate titanate (PZT), vibrate when electric voltage is applied to them. 
In reverse, they can also be used to generate a voltage when they are vibrated, a capability the micro-bristle-bots could use to power up onboard sensors when they are actuated by external vibrations.</p><p>Ansari and her team are working to add steering capability to the robots by joining two slightly different micro-bristle-bots together. Because each of the joined micro-bots would respond to different vibration frequencies, the combination could be steered by varying the frequencies and amplitudes. &ldquo;Once you have a fully steerable micro-robot, you can imagine doing a lot of interesting things,&rdquo; she said.</p><p>Other researchers have worked on micro-robots that use magnetic fields to produce movement, Ansari noted. While that is useful for moving entire swarms at once, magnetic forces cannot easily be used to address individual robots within a swarm. The micro-bristle-bots created by Ansari and her team are believed to be the smallest robots powered by vibration.</p><p>The micro-bristle-bots are approximately two millimeters in length, 1.8 millimeters wide and 0.8 millimeters thick, and weigh about five milligrams. The 3D printer can produce smaller robots, but with a reduced mass, the adhesion forces between the tiny devices and a surface can get very large. Sometimes, the micro-bots cannot be separated from the tweezers used to pick them up.</p><p>Ansari and her team have built a &ldquo;playground&rdquo; in which multiple micro-bots can move around as the researchers learn more about what they can do. They are also interested in developing micro-bots that can jump and swim.</p><p>&ldquo;We can look at the collective behavior of ants, for example, and apply what we learn from them to our little robots,&rdquo; she added. &ldquo;These micro-bristle-bots walk nicely in a laboratory environment, but there is a lot more we will have to do before they can go out into the outside world.&rdquo;</p><p><em>The micro-bot fabrication was performed at the Georgia Tech Institute for Electronics and Nanotechnology, a member of the National Nanotechnology Coordinated Infrastructure, which is supported by the National Science Foundation through grant ECCS-1542173.</em></p><p><strong>CITATION</strong>: DeaGyu Kim, Zhijian Hao, Jun Ueda and Azadeh Ansari, &ldquo;A 5mg micro-bristle-bot fabricated by two-photon lithography&rdquo; (<em>Journal of Micromechanics and Microengineering</em>, 2019). <a href="https://doi.org/10.1088/1361-6439/ab309b">https://doi.org/10.1088/1361-6439/ab309b</a></p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (jtoon@gatech.edu).</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1563308690</created>  <gmt_created>2019-07-16 20:24:50</gmt_created>  <changed>1563308866</changed>  <gmt_changed>2019-07-16 20:27:46</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The size of an ant, the micro-bristle-bot moves by harnessing vibration.]]></teaser>  <type>news</type>  <sentence><![CDATA[The size of an ant, the micro-bristle-bot moves by harnessing vibration.]]></sentence>  <summary><![CDATA[<p>Researchers have created a new type of tiny 3D-printed robot that moves by harnessing vibration from piezoelectric actuators, ultrasound sources or even tiny speakers.
Swarms of these &ldquo;micro-bristle-bots&rdquo; might work together to sense environmental changes, move materials &ndash; or perhaps one day repair injuries inside the human body.</p>]]></summary>  <dateline>2019-07-16T00:00:00-04:00</dateline>  <iso_dateline>2019-07-16T00:00:00-04:00</iso_dateline>  <gmt_dateline>2019-07-16 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>623446</item>          <item>623447</item>          <item>623448</item>          <item>623452</item>          <item>623449</item>          <item>623451</item>      </media>  <hg_media>          <item>          <nid>623446</nid>          <type>image</type>          <title><![CDATA[Micro-bristle-bot with penny]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[bristle-bot-011.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/bristle-bot-011.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/bristle-bot-011.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/bristle-bot-011.jpg?itok=gg0_1LmC]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Micro-bristle-bot shown with a penny]]></image_alt>                    <created>1563307246</created>          <gmt_created>2019-07-16 20:00:46</gmt_created>          <changed>1563307246</changed>          <gmt_changed>2019-07-16 20:00:46</gmt_changed>      </item>          <item>          <nid>623447</nid>          <type>image</type>          <title><![CDATA[Micro-bristle-bot close-up]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[bristle-bot-008.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/bristle-bot-008.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/bristle-bot-008.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/bristle-bot-008.jpg?itok=zDaw0H71]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Close-up of micro-bristle bot robot]]></image_alt>                    <created>1563307421</created>          <gmt_created>2019-07-16 20:03:41</gmt_created>          <changed>1563307421</changed>          <gmt_changed>2019-07-16 20:03:41</gmt_changed>      </item>          <item>          <nid>623448</nid>          <type>image</type>          <title><![CDATA[Micro-bristle-bot with penny-vert]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[bristle-bot-012.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/bristle-bot-012.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/bristle-bot-012.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/bristle-bot-012.jpg?itok=evI1d3PO]]></image_740>            <image_mime>image/jpeg</image_mime>         
   <image_alt><![CDATA[Micro-bristle-bot shown with a penny]]></image_alt>                    <created>1563307543</created>          <gmt_created>2019-07-16 20:05:43</gmt_created>          <changed>1563307543</changed>          <gmt_changed>2019-07-16 20:05:43</gmt_changed>      </item>          <item>          <nid>623452</nid>          <type>image</type>          <title><![CDATA[Micro-bristle-bot team]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[bristle-bot-007.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/bristle-bot-007.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/bristle-bot-007.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/bristle-bot-007.jpg?itok=8iVQ-n1i]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Micro-bristle-bot research team]]></image_alt>                    <created>1563308009</created>          <gmt_created>2019-07-16 20:13:29</gmt_created>          <changed>1563308009</changed>          <gmt_changed>2019-07-16 20:13:29</gmt_changed>      </item>          <item>          <nid>623449</nid>          <type>image</type>          <title><![CDATA[Testing a micro-bristle-bot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[bristle-bot-005.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/bristle-bot-005.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/bristle-bot-005.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/bristle-bot-005.jpg?itok=GV5_DJL6]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Testing a micro-bristle-bot]]></image_alt>                    <created>1563307674</created>          <gmt_created>2019-07-16 20:07:54</gmt_created>          <changed>1563307674</changed>          <gmt_changed>2019-07-16 20:07:54</gmt_changed>      </item>          <item>          <nid>623451</nid>          <type>image</type>          <title><![CDATA[Microscope image of micro-bristle-bot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[bristle-bot-009.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/bristle-bot-009.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/bristle-bot-009.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/bristle-bot-009.jpg?itok=pG0MoAyC]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Micro-bristle-bot with penny under microscope]]></image_alt>                    <created>1563307896</created>          <gmt_created>2019-07-16 20:11:36</gmt_created>          <changed>1563307896</changed>          <gmt_changed>2019-07-16 20:11:36</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          
<category tid="145"><![CDATA[Engineering]]></category>          <category tid="149"><![CDATA[Nanotechnology and Nanoscience]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="149"><![CDATA[Nanotechnology and Nanoscience]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="181741"><![CDATA[micro-bristle-bot]]></keyword>          <keyword tid="13895"><![CDATA[Vibration]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="179119"><![CDATA[3D printed]]></keyword>          <keyword tid="7699"><![CDATA[piezoelectric]]></keyword>          <keyword tid="175301"><![CDATA[Azadeh Ansari]]></keyword>      </keywords>  <core_research_areas>          <term tid="39451"><![CDATA[Electronics and Nanotechnology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="622803">  <title><![CDATA[Georgia Tech Names Director for Georgia Tech Research Institute (GTRI)]]></title>  <uid>27303</uid>  <body><![CDATA[<p>The Georgia Institute of Technology has named James J. Hudgens to be the new director of the <a href="http://www.gtri.gatech.edu">Georgia Tech Research Institute</a> (GTRI), Georgia Tech&rsquo;s applied research division. Currently director of the Threat Intelligence Center (TIC) at Sandia National Laboratories in Albuquerque, New Mexico, Hudgens will become a Georgia Tech senior vice president and GTRI&rsquo;s director effective September 2, 2019.</p><p>Hudgens holds a Ph.D. in ceramic engineering from Iowa State University. He has led research and development programs in national security, cybersecurity, quantum information science, and photonic microsystems. He also led programs in data analytics, synthetic aperture radar, and airborne intelligence, surveillance and reconnaissance (ISR) systems before becoming director of the $265 million-per-year TIC, which has a staff of 550 professionals working in six states and 136 different laboratories.&nbsp;</p><p>A senior technology executive with 23 years of experience in national security research, Hudgens has also held positions at optical networking firm Mahi Networks, defense contractor Raytheon Electronic Systems, and semiconductor company Texas Instruments. In 2013, he won the Department of Energy Secretary&rsquo;s Honor Award for Achievement for leading the Copperhead counter-IED program.</p><p>&ldquo;Jim Hudgens has extensive experience building and leading federally sponsored programs that are at the center of GTRI&rsquo;s core research areas,&rdquo; said <a href="http://www.research.gatech.edu/meet-dr-chaouki-t-abdallah">Chaouki Abdallah</a>, Georgia Tech&rsquo;s Executive Vice President for Research. 
&ldquo;His experience developing and managing programs at Sandia National Laboratories and major private-sector defense contractors will support GTRI&rsquo;s continued growth in service to our nation&rsquo;s defense agencies and other important state and federal sponsors.&rdquo;</p><p>GTRI has more than 2,300 employees conducting nearly $500 million worth of research across a broad range of technology areas that focus on solving critical challenges for government and industry sponsors. GTRI is one of the world&rsquo;s leading applied research and development organizations, and is an integral part of Georgia Tech&rsquo;s research program.</p><p>&ldquo;Georgia Tech, through GTRI, is entrusted with a vital role in our national security,&rdquo; Hudgens said. &ldquo;I know firsthand that GTRI and other Georgia Tech researchers are known for the exceptional quality of their work in delivering innovative solutions to the most complex national security challenges.</p><p>&ldquo;It is a great privilege for me to join the combined University System of Georgia and Georgia Tech family to develop a shared vision for how we will build on this reputation to advance one of the nation&rsquo;s leading technological research universities,&rdquo; he added. &ldquo;I thank Georgia Tech President G.P. &ldquo;Bud&rdquo; Peterson, Provost Rafael Bras, and Executive Vice President Abdallah for the honor of becoming part of GTRI&rsquo;s 85-year legacy of service to the state of Georgia and our nation.&rdquo;</p><p>In congratulating Hudgens, Peterson emphasized GTRI&rsquo;s important role in the nation, region, state &ndash; and Georgia Tech itself.</p><p>&ldquo;For decades, the U.S. government and industry have looked to Georgia Tech &ndash; in particular GTRI &ndash; as they seek to find and develop effective, creative solutions in national security and other mission-critical areas,&rdquo; Peterson said. &ldquo;We are pleased to welcome Jim Hudgens to lead one of Georgia Tech&rsquo;s most important missions in support of our nation, region, and state.&rdquo;</p><p>Hudgens&rsquo; selection came after a five-month national search during which he was one of four finalists to make presentations to Georgia Tech faculty and staff.</p><p><a href="http://www.sandia.gov">Sandia National Laboratories</a> is a multi-mission laboratory operated for the U.S. Department of Energy&rsquo;s National Nuclear Security Administration. Sandia has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies, and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California. Sandia is the largest of the country&rsquo;s 17 national laboratories.</p><p>GTRI conducts research through eight laboratories located on Georgia Tech&rsquo;s midtown Atlanta campus, in a research facility near Dobbins Air Reserve Base in Smyrna, Georgia, and in Huntsville, Alabama. GTRI also has more than a dozen locations around the nation where it serves the needs of its research sponsors. 
GTRI&rsquo;s research spans a variety of disciplines, including autonomous systems, cybersecurity, electromagnetics, electronic warfare, modeling and simulation, sensors, systems engineering, test and evaluation, and threat systems.</p><p><strong>Media Relations Assistance</strong>: John Toon (404-894-6986) (jtoon@gatech.edu).</p><p><strong>Writer</strong>: John Toon</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p>&nbsp;</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1561633139</created>  <gmt_created>2019-06-27 10:58:59</gmt_created>  <changed>1561639851</changed>  <gmt_changed>2019-06-27 12:50:51</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The Georgia Institute of Technology has named James J. Hudgens to be the new director of the Georgia Tech Research Institute (GTRI), Georgia Tech’s applied research division. ]]></teaser>  <type>news</type>  <sentence><![CDATA[The Georgia Institute of Technology has named James J. Hudgens to be the new director of the Georgia Tech Research Institute (GTRI), Georgia Tech’s applied research division. ]]></sentence>  <summary><![CDATA[<p>The Georgia Institute of Technology has named James J. Hudgens to be the new director of the Georgia Tech Research Institute (GTRI), Georgia Tech&rsquo;s applied research division. Currently director of the Threat Intelligence Center (TIC) at Sandia National Laboratories in Albuquerque, New Mexico, Hudgens will become a Georgia Tech senior vice president and GTRI&rsquo;s director effective September 2, 2019.</p>]]></summary>  <dateline>2019-06-27T00:00:00-04:00</dateline>  <iso_dateline>2019-06-27T00:00:00-04:00</iso_dateline>  <gmt_dateline>2019-06-27 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>622802</item>          <item>622802</item>      </media>  <hg_media>          <item>          <nid>622802</nid>          <type>image</type>          <title><![CDATA[James J. Hudgens]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[james-hudgens-2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/james-hudgens-2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/james-hudgens-2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/james-hudgens-2.jpg?itok=BBP0oMxg]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[James J. 
Hudgens photo]]></image_alt>                    <created>1561632650</created>          <gmt_created>2019-06-27 10:50:50</gmt_created>          <changed>1561632650</changed>          <gmt_changed>2019-06-27 10:50:50</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="1366"><![CDATA[defense]]></keyword>          <keyword tid="181593"><![CDATA[James Hudgens]]></keyword>          <keyword tid="181594"><![CDATA[Jim Hudgens]]></keyword>          <keyword tid="525"><![CDATA[military]]></keyword>          <keyword tid="167571"><![CDATA[Sandia]]></keyword>      </keywords>  <core_research_areas>          <term tid="145171"><![CDATA[Cybersecurity]]></term>          <term tid="39451"><![CDATA[Electronics and Nanotechnology]]></term>          <term tid="39481"><![CDATA[National Security]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>          <term tid="39541"><![CDATA[Systems]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="622103">  <title><![CDATA[SlothBot Takes a Leisurely Approach to Environmental Monitoring]]></title>  <uid>27303</uid>  <body><![CDATA[<p>For environmental monitoring, precision agriculture, infrastructure maintenance and certain security applications, slow and energy efficient can be better than fast and always needing a recharge. That&rsquo;s where &ldquo;SlothBot&rdquo; comes in.</p><p>Powered by a pair of photovoltaic panels and designed to linger in the forest canopy continuously for months, SlothBot moves only when it must to measure environmental changes &ndash; such as weather and chemical factors &ndash; that can be observed only with a long-term presence. The proof-of-concept hyper-efficient robot, described May 21 at the International Conference on Robotics and Automation (ICRA) in Montreal, may soon be hanging out among treetop cables in the Atlanta Botanical Garden.</p><p>&ldquo;In robotics, it seems we are always pushing for faster, more agile and more extreme robots,&rdquo; said <a href="https://www.ece.gatech.edu/faculty-staff-directory/magnus-egerstedt-0">Magnus Egerstedt</a>, the Steve W. Chaddick School Chair of the School of Electrical and Computer Engineering at the Georgia Institute of Technology and principal investigator for SlothBot. 
&ldquo;But there are many applications where there is no need to be fast. You just have to be out there persistently over long periods of time, observing what&rsquo;s going on.&rdquo;</p><p>Based on what Egerstedt called the &ldquo;theory of slowness,&rdquo; Graduate Research Assistant Gennaro Notomista designed SlothBot together with his colleague, Yousef Emam, using 3D-printed parts for the gearing and wire-switching mechanisms needed to crawl through a network of wires in the trees. The greatest challenge for a wire-crawling robot is switching from one cable to another without falling, Notomista said.</p><p>&ldquo;The challenge is smoothly holding onto one wire while grabbing another,&rdquo; he said. &ldquo;It&rsquo;s a tricky maneuver and you have to do it right to provide a fail-safe transition. Making sure the switches work well over long periods of time is really the biggest challenge.&rdquo;</p><p>Mechanically, SlothBot consists of two bodies connected by an actuated hinge. Each body houses a driving motor connected to a rim on which a tire is mounted. The use of wheels for locomotion is simple, energy efficient and safer than other types of wire-based locomotion, the researchers say.</p><p>SlothBot has so far operated in a network of cables on the Georgia Tech campus. Next, a new 3D-printed shell &ndash; which makes the robot look more like a sloth &ndash; will protect the motors, gears, actuators, cameras, computer and other components from the rain and wind. That will set the stage for longer-term studies in the tree canopy at the Atlanta Botanical Garden, where Egerstedt hopes visitors will see a SlothBot monitoring conditions as early as this fall.</p><p>The name SlothBot is not a coincidence. Real-life sloths are small mammals that live in jungle canopies of South and Central America. Making their living by eating tree leaves, the animals can survive on the daily caloric equivalent of a small potato. With their slow metabolism, sloths rest as much as 22 hours a day and seldom descend from the trees where they can spend their entire lives.</p><p>&ldquo;The life of a sloth is pretty slow-moving and there&rsquo;s not a lot of excitement on a day-to-day level,&rdquo; said Jonathan Pauli, an associate professor in the Department of Forest &amp; Wildlife Ecology at the University of Wisconsin-Madison, who has consulted with the Georgia Tech team on the project. &ldquo;The nice thing about a very slow life history is that you don&rsquo;t really need a lot of energy input. You can have a long duration and persistence in a limited area with very little energy inputs over a long period of time.&rdquo;</p><p>That&rsquo;s exactly what the researchers expect from SlothBot, whose development has been funded by the U.S. Office of Naval Research.</p><p>&ldquo;There is a lot we don&rsquo;t know about what actually happens under dense tree-covered areas,&rdquo; Egerstedt said. &ldquo;Most of the time SlothBot will be just hanging out there, and every now and then it will move into a sunny spot to recharge the battery.&rdquo;</p><p>The researchers also hope to test SlothBot in a cacao plantation in Costa Rica that is already home to real sloths. &ldquo;The cables used to move cacao have become a sloth superhighway because the animals find them useful to move around,&rdquo; Egerstedt said. &ldquo;If all goes well, we will deploy SlothBots along the cables to monitor the sloths.&rdquo;</p><p>Egerstedt is known for algorithms that drive swarms of small wheeled or flying robots. 
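</p><p>The energy arithmetic behind this kind of months-long persistence is simple. The sketch below works out how much motion a small solar budget can sustain; the wattages are illustrative assumptions rather than SlothBot&rsquo;s measured specifications.</p><pre><code># Back-of-the-envelope duty-cycle budget for a "move only when you must"
# robot. All wattages are illustrative assumptions, not SlothBot specs.
HARVEST_W = 0.5    # assumed average photovoltaic input
IDLE_W    = 0.05   # assumed draw while hanging and sensing
MOVE_W    = 5.0    # assumed draw while driving along a cable

def sustainable_move_fraction(harvest=HARVEST_W, idle=IDLE_W, move=MOVE_W):
    """Largest fraction of time the robot can spend moving while its
    average power draw stays within the average harvested power."""
    if harvest <= idle:
        return 0.0
    return (harvest - idle) / (move - idle)

f = sustainable_move_fraction()
print(f"sustainable duty cycle: {f:.1%}, "
      f"about {f * 24 * 60:.0f} minutes of motion per day")
</code></pre><p>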
But during a visit to Costa Rica, he became interested in sloths and began developing what he calls &ldquo;a theory of slowness&rdquo; together with Professor Ron Arkin in Georgia Tech&rsquo;s School of Interactive Computing. The theory leverages the benefits of energy efficiency.</p><p>&ldquo;If you are doing things like environmental monitoring, you want to be out in the forest for months,&rdquo; Egerstedt said. &ldquo;That changes the way you think about control systems at a high level.&rdquo;</p><p>Flying robots are already used for environmental monitoring, but their high energy needs mean they cannot linger for long. Wheeled robots can get by with less energy, but they can get stuck in mud or be hampered by tree roots, and cannot get a big-picture view from the ground.</p><p>&ldquo;The thing that costs energy more than anything else is movement,&rdquo; Egerstedt said. &ldquo;Moving is much more expensive than sensing or thinking. For environmental robots, you should only move when you absolutely have to. We had to think about what that would be like.&rdquo;</p><p>For Pauli, who studies a variety of wildlife, working with Egerstedt to help SlothBot come to life has been gratifying.</p><p>&ldquo;It is great to see a robot inspired by the biology of sloths,&rdquo; he said. &ldquo;It has been fun to share how sloths and other organisms that live in these ecosystems for long periods of time live their lives. It will be interesting to see robots mirroring what we see in natural ecological communities.&rdquo;</p><p><em>This research was sponsored by the U.S. Office of Naval Research through Grant N00014-15-2115. The content is solely the responsibility of the authors and does not necessarily represent the official views of the ONR.</em></p><p><strong>CITATION</strong>: &quot;The SlothBot: A Novel Design for a Wire-Traversing Robot&quot; (<em>IEEE Robotics and Automation Letters</em>, Volume 4, Issue 2, April 2019). <a href="https://ieeexplore.ieee.org/document/8642808">https://ieeexplore.ieee.org/document/8642808</a></p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (jtoon@gatech.edu)</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1559241883</created>  <gmt_created>2019-05-30 18:44:43</gmt_created>  <changed>1559584944</changed>  <gmt_changed>2019-06-03 18:02:24</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Slow and energy-efficient SlothBot will handle environmental monitoring and other tasks.]]></teaser>  <type>news</type>  <sentence><![CDATA[Slow and energy-efficient SlothBot will handle environmental monitoring and other tasks.]]></sentence>  <summary><![CDATA[<p>For environmental monitoring, precision agriculture, infrastructure maintenance and certain security applications, slow and energy efficient can be better than fast and always needing a recharge. 
That&rsquo;s where &ldquo;SlothBot&rdquo; comes in.</p>]]></summary>  <dateline>2019-05-30T00:00:00-04:00</dateline>  <iso_dateline>2019-05-30T00:00:00-04:00</iso_dateline>  <gmt_dateline>2019-05-30 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>622097</item>          <item>622098</item>          <item>622099</item>          <item>622101</item>          <item>622102</item>      </media>  <hg_media>          <item>          <nid>622097</nid>          <type>image</type>          <title><![CDATA[SlothBot on a cable]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[slothbot-005.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/slothbot-005.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/slothbot-005.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/slothbot-005.jpg?itok=4FDTL8g2]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Gennaro Notomista with SlothBot]]></image_alt>                    <created>1559241086</created>          <gmt_created>2019-05-30 18:31:26</gmt_created>          <changed>1559241086</changed>          <gmt_changed>2019-05-30 18:31:26</gmt_changed>      </item>          <item>          <nid>622098</nid>          <type>image</type>          <title><![CDATA[SlothBot on a cable - 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[slothbot-001.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/slothbot-001.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/slothbot-001.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/slothbot-001.jpg?itok=WaEHJnBJ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[SlothBot, robot, cable, monitoring]]></image_alt>                    <created>1559241184</created>          <gmt_created>2019-05-30 18:33:04</gmt_created>          <changed>1559241184</changed>          <gmt_changed>2019-05-30 18:33:04</gmt_changed>      </item>          <item>          <nid>622099</nid>          <type>image</type>          <title><![CDATA[Sloth moving along a cable]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[two-toed.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/two-toed.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/two-toed.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/two-toed.jpg?itok=47RKCQSx]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[SlothBot, robot, cable, monitoring]]></image_alt>                    <created>1559241292</created>          <gmt_created>2019-05-30 18:34:52</gmt_created>          <changed>1559241292</changed>          
<gmt_changed>2019-05-30 18:34:52</gmt_changed>      </item>          <item>          <nid>622101</nid>          <type>image</type>          <title><![CDATA[Components of SlothBot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[slothbot-007.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/slothbot-007.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/slothbot-007.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/slothbot-007.jpg?itok=8W-wfpcl]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Components of SlothBot]]></image_alt>                    <created>1559241403</created>          <gmt_created>2019-05-30 18:36:43</gmt_created>          <changed>1559241403</changed>          <gmt_changed>2019-05-30 18:36:43</gmt_changed>      </item>          <item>          <nid>622102</nid>          <type>image</type>          <title><![CDATA[Components of SlothBot - 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[slothbot-009.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/slothbot-009.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/slothbot-009.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/slothbot-009.jpg?itok=HdfhFEun]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[SlothBot, robot, environmental monitoring]]></image_alt>                    <created>1559241487</created>          <gmt_created>2019-05-30 18:38:07</gmt_created>          <changed>1559241487</changed>          <gmt_changed>2019-05-30 18:38:07</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="181413"><![CDATA[SlothBot]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="103651"><![CDATA[environmental monitoring]]></keyword>          <keyword tid="181414"><![CDATA[energy-efficient]]></keyword>      </keywords>  <core_research_areas>          <term tid="39531"><![CDATA[Energy and Sustainable Infrastructure]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="620260">  <title><![CDATA[Researchers Awarded $6.25 Million to Study Collective Emergent Behavior]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Georgia Tech researchers have 
been awarded $6.25 million from the Department of Defense (DoD) to use collective emergent behavior to achieve task-oriented objectives.&nbsp;</p><p>DoD&rsquo;s Multidisciplinary University Research Initiatives (MURI) Program funds projects that bring researchers together from diverse backgrounds to work on a complex problem. <a href="http://ideas.gatech.edu/">Institute for Data Engineering and Science</a> co-director, Professor <a href="http://people.math.gatech.edu/~randall/">Dana Randall</a>, is the principal investigator and leads a team of six that includes <a href="https://www.physics.gatech.edu/user/daniel-goldman">Daniel Goldman</a>, Dunn Family Professor in the School of Physics. The Formal Foundations of Algorithmic Matter and Emergent Computation team also includes chemical engineering, mechanical engineering, physics, and computational science researchers from other universities.&nbsp;&nbsp;</p><p>The researchers are trying to predict and design emergent behavior within computation by using basic algorithms on simple machines to perform complex tasks. Emergent behavior arises when a microscopic change in a parameter creates a macroscopic change in a system. This collective behavior is easy to find in nature, from a swarm of bees to a colony of ants, but also appears in other scientific disciplines.&nbsp;</p><p>&ldquo;A MURI lets us take a deep dive toward understanding how many computationally limited components at the micro-scale can be programmed to work collectively to produce useful behavior at the macro-scale,&rdquo; said Randall, who is also the ADVANCE Professor of Computing. &ldquo;Our interdisciplinary team combines expertise in many fields, mimicking the research by forming a collaboration that is also greater than the sum of its parts.&rdquo;</p><p>The MURI hybrid approach to algorithmic matter combines traditional logic-based programming with non-traditional computational methods, such as using physical characteristics of the interacting matter to drive a system toward collective behavior. One of the goals is to program based on this predictable emergent behavior. The approach also predicts basic properties of the collective&rsquo;s emergent behavior, like whether it will behave like a gas, fluid, or solid. In this context, emergent behavior turns into emergent collective computation.</p><p>&ldquo;MURI promises basic algorithms that allow very simple machines to work collectively to perform amazingly complex tasks,&rdquo; Massachusetts Institute of Technology (MIT) chemical engineering Professor <a href="https://srg.mit.edu/">Michael Strano</a> said. &ldquo;Our team will examine systems of autonomous cell-like particles that interact and respond to the movement of their neighbors in a programmable way. Theorists will be able to test ideas of emergent computation from these simple devices and learn how to execute tasks from the behavior of relatively simple, autonomous particles.&rdquo;</p><p>Although the behavior has footing in physics, computer science, and swarm robotics, until this research there has been no underlying framework to explain why it occurs. The multidisciplinary approach allows theory and experiment to continuously inform each other and determine the computational capabilities of emergent behavior. The team has an ideal range of expertise in machine learning, control theory, and non-equilibrium physics and algorithms. 
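</p><p>A toy simulation conveys the flavor of such emergence. The sketch below implements a Vicsek-style alignment model &ndash; offered as a generic illustration, not the team&rsquo;s actual system &ndash; in which turning one microscopic knob, the noise level, flips the swarm between disordered motion and a coherent flock.</p><pre><code># Vicsek-style alignment model: a generic illustration of emergent
# collective behavior, not the MURI team's actual system.
import numpy as np

rng = np.random.default_rng(0)
N, L, R, STEPS = 300, 10.0, 1.0, 200  # agents, box size, neighbor radius, steps

def order_parameter(eta):
    """Run the model at noise level eta; return the final polar order
    (1 = perfectly aligned flock, 0 = fully disordered)."""
    pos = rng.uniform(0.0, L, (N, 2))
    theta = rng.uniform(-np.pi, np.pi, N)
    for _ in range(STEPS):
        d = pos[:, None, :] - pos[None, :, :]
        d -= L * np.round(d / L)                    # periodic boundaries
        near = (d ** 2).sum(-1) < R ** 2            # neighbor mask (incl. self)
        theta = np.arctan2((near * np.sin(theta)).sum(1),
                           (near * np.cos(theta)).sum(1))
        theta += rng.uniform(-eta / 2, eta / 2, N)  # the microscopic noise knob
        pos = (pos + 0.1 * np.c_[np.cos(theta), np.sin(theta)]) % L
    return float(np.hypot(np.cos(theta).mean(), np.sin(theta).mean()))

for eta in (0.5, 5.0):
    print(f"noise {eta}: order parameter {order_parameter(eta):.2f}")
</code></pre><p>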
They are also working with experimentalists who build collective systems at granular and microscopic scales.&nbsp;&nbsp;</p><p>&ldquo;An exciting aspect of this collaboration will be our attempts to interface and integrate ideas and tools from robotics, non-equilibrium physics, control theory, and computer science to develop task-capable swarms,&rdquo; Goldman said.</p><p>This MURI project will run for five years and is funded by the Army Research Office. In addition to Randall, Goldman, and Strano, the team also includes Arizona State computational science and engineering Professor Andrea Richa, MIT physics Associate Professor Jeremy England, and Northwestern mechanical engineering Professor Todd Murphey.</p><p>The overarching goal is to find how simplistic the computation can be for this complexity. This could lead to advances in engineered systems achieving specific task-oriented goals.</p><p>&ldquo;The MURI promises nothing short of the transformation of robots,&rdquo; Strano said, &ldquo;from the large, bulky constructions that we think of today, to future clouds or swarms that enable functions that are currently impossible to realize.&rdquo;</p><p><strong>Writer</strong>: Tess Malone</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1554855317</created>  <gmt_created>2019-04-10 00:15:17</gmt_created>  <changed>1554855375</changed>  <gmt_changed>2019-04-10 00:16:15</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have been awarded $6.25 million to use collective emergent behavior.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have been awarded $6.25 million to use collective emergent behavior.]]></sentence>  <summary><![CDATA[<p>Georgia Tech researchers have been awarded $6.25 million from the Department of Defense (DoD) to use collective emergent behavior to achieve task-oriented objectives.&nbsp;</p>]]></summary>  <dateline>2019-04-09T00:00:00-04:00</dateline>  <iso_dateline>2019-04-09T00:00:00-04:00</iso_dateline>  <gmt_dateline>2019-04-09 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[tess.malone@cc.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Tess Malone</p><p>College of Computing</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>620256</item>          <item>620257</item>          <item>620258</item>          <item>620259</item>      </media>  <hg_media>          <item>          <nid>620256</nid>          <type>image</type>          <title><![CDATA[Vibrating robots with magnetic interactions]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[emergent-behavior-003.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/emergent-behavior-003.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/emergent-behavior-003.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/emergent-behavior-003.jpg?itok=7stcjdnU]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Vibrating robots use magnetic interaction]]></image_alt>                    <created>1554854240</created>          <gmt_created>2019-04-09 23:57:20</gmt_created>          <changed>1554854240</changed>          <gmt_changed>2019-04-09 
23:57:20</gmt_changed>      </item>          <item>          <nid>620257</nid>          <type>image</type>          <title><![CDATA[Mimicking ferromagnetic materials]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[emergent-behavior-007.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/emergent-behavior-007.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/emergent-behavior-007.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/emergent-behavior-007.jpg?itok=HINoLXYg]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Collection of vibrating robots]]></image_alt>                    <created>1554854384</created>          <gmt_created>2019-04-09 23:59:44</gmt_created>          <changed>1554854384</changed>          <gmt_changed>2019-04-09 23:59:44</gmt_changed>      </item>          <item>          <nid>620258</nid>          <type>image</type>          <title><![CDATA[Researchers for MURI]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[emergent-behavior-015.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/emergent-behavior-015.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/emergent-behavior-015.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/emergent-behavior-015.jpg?itok=5258iMXB]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[MURI researchers]]></image_alt>                    <created>1554854549</created>          <gmt_created>2019-04-10 00:02:29</gmt_created>          <changed>1554854549</changed>          <gmt_changed>2019-04-10 00:02:29</gmt_changed>      </item>          <item>          <nid>620259</nid>          <type>image</type>          <title><![CDATA[Researchers for MURI-2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[emergent-behavior-016.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/emergent-behavior-016.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/emergent-behavior-016.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/emergent-behavior-016.jpg?itok=ZCqM9J_Y]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[MURI researchers]]></image_alt>                    <created>1554854661</created>          <gmt_created>2019-04-10 00:04:21</gmt_created>          <changed>1554854661</changed>          <gmt_changed>2019-04-10 00:04:21</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="545781"><![CDATA[Institute for Data Engineering and Science]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="143"><![CDATA[Digital Media and Entertainment]]></category>          <category 
tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="143"><![CDATA[Digital Media and Entertainment]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="181004"><![CDATA[emergent behavior]]></keyword>          <keyword tid="181005"><![CDATA[collective behavior]]></keyword>          <keyword tid="24211"><![CDATA[MURI]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="181009"><![CDATA[vibrating robot]]></keyword>          <keyword tid="3167"><![CDATA[algorithm]]></keyword>          <keyword tid="10467"><![CDATA[Dana Randall]]></keyword>          <keyword tid="47881"><![CDATA[Dan Goldman]]></keyword>      </keywords>  <core_research_areas>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="39481"><![CDATA[National Security]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="619317">  <title><![CDATA[Seeing through a Robot’s Eyes Helps Those with Profound Motor Impairments]]></title>  <uid>27303</uid>  <body><![CDATA[<p>An interface system that uses augmented reality technology could help individuals with profound motor impairments operate a humanoid robot to feed themselves and perform routine personal care tasks such as scratching an itch and applying skin lotion. The web-based interface displays a &ldquo;robot&rsquo;s eye view&rdquo; of surroundings to help users interact with the world through the machine.</p><p>The system, described March 15 in the journal <em>PLOS ONE</em>, could help make sophisticated robots more useful to people who do not have experience operating complex robotic systems. Study participants interacted with the robot interface using standard assistive computer access technologies &mdash; such as eye trackers and head trackers &mdash; that they were already using to control their personal computers.</p><p>The paper reported on two studies showing how such &ldquo;robotic body surrogates&rdquo; &ndash; which can perform tasks similar to those of humans &ndash; could improve the quality of life for users. The work could provide a foundation for developing faster and more capable assistive robots.</p><p>&ldquo;Our results suggest that people with profound motor deficits can improve their quality of life using robotic body surrogates,&rdquo; said Phillip Grice, a recent Georgia Institute of Technology Ph.D. graduate who is first author of the paper. &ldquo;We have taken the first step toward making it possible for someone to purchase an appropriate type of robot, have it in their home and derive real benefit from it.&rdquo;</p><p>Grice and Professor <a href="https://www.bme.gatech.edu/bme/faculty/Charlie-Kemp">Charlie Kemp</a> from the <a href="https://www.bme.gatech.edu/">Wallace H. 
Coulter Department of Biomedical Engineering at Georgia Tech and Emory University</a> used a PR2 mobile manipulator manufactured by Willow Garage for the two studies. The wheeled robot has 20 degrees of freedom, with two arms and a &ldquo;head,&rdquo; giving it the ability to manipulate objects such as water bottles, washcloths, hairbrushes and even an electric shaver.</p><p>&ldquo;Our goal is to give people with limited use of their own bodies access to robotic bodies so they can interact with the world in new ways,&rdquo; said Kemp.</p><p>In their first study, Grice and Kemp made the PR2 available across the internet to a group of 15 participants with severe motor impairments. The participants learned to control the robot remotely, using their own assistive equipment to operate a mouse cursor to perform a personal care task. Eighty percent of the participants were able to manipulate the robot to pick up a water bottle and bring it to the mouth of a mannequin.</p><p>&ldquo;Compared to able-bodied persons, the capabilities of the robot are limited,&rdquo; Grice said. &ldquo;But the participants were able to perform tasks effectively and showed improvement on a clinical evaluation that measured their ability to manipulate objects compared to what they would have been able to do without the robot.&rdquo;</p><p>In the second study, the researchers provided the PR2 and interface system to Henry Evans, a California man who has been helping Georgia Tech researchers study and improve assistive robotic systems since 2011. Evans, who has very limited control of his body, tested the robot in his home for seven days and not only completed tasks, but also devised novel uses combining the operation of both robot arms at the same time &ndash; using one arm to control a washcloth and the other to use a brush.</p><p>&ldquo;The system was very liberating to me, in that it enabled me to independently manipulate my environment for the first time since my stroke,&rdquo; said Evans. &ldquo;With respect to other people, I was thrilled to see Phil get overwhelmingly positive results when he objectively tested the system with 15 other people.&rdquo;</p><p>The researchers were pleased that Evans developed new uses for the robot, combining motion of the two arms in ways they had not expected.</p><p>&ldquo;When we gave Henry free access to the robot for a week, he found new opportunities for using it that we had not anticipated,&rdquo; said Grice. &ldquo;This is important because a lot of the assistive technology available today is designed for very specific purposes. What Henry has shown is that this system is powerful in providing assistance and empowering users. The opportunities for this are potentially very broad.&rdquo;</p><p>The interface allowed Evans to care for himself in bed over an extended period of time. &ldquo;The most helpful aspect of the interface system was that I could operate the robot completely independently, with only small head movements using an extremely intuitive graphical user interface,&rdquo; Evans said.</p><p>The web-based interface shows users what the world looks like from cameras located in the robot&rsquo;s head. Clickable controls overlaid on the view allow the users to move the robot around in a home or other environment and control the robot&rsquo;s hands and arms. When users move the robot&rsquo;s head, for instance, the screen displays the mouse cursor as a pair of eyeballs to show where the robot will look when the user clicks. 
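</p><p>The geometry behind that cursor is straightforward. The sketch below maps a click on the camera image to pan/tilt &ldquo;look here&rdquo; angles using a standard pinhole camera model; the intrinsic parameters are illustrative stand-ins, not the PR2&rsquo;s calibrated values.</p><pre><code># Click-to-gaze sketch: map a pixel in the robot's camera image to
# pan/tilt angles. Intrinsics are illustrative, not PR2 calibration.
import math

FX = FY = 525.0          # assumed focal lengths, in pixels
CX, CY = 320.0, 240.0    # assumed principal point (image center)

def click_to_pan_tilt(u, v):
    """Return (pan, tilt) in radians for the viewing ray through pixel (u, v)."""
    x = (u - CX) / FX                 # rightward component of the ray
    y = (v - CY) / FY                 # downward component (image y grows down)
    pan = math.atan2(x, 1.0)          # positive = look right
    tilt = math.atan2(-y, 1.0)        # positive = look up
    return pan, tilt

pan, tilt = click_to_pan_tilt(480, 120)   # a click up and to the right
print(f"pan {math.degrees(pan):+.1f} deg, tilt {math.degrees(tilt):+.1f} deg")
</code></pre><p>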
Clicking on a disc surrounding the robotic hands allows users to select a motion. While driving the robot around a room, lines following the cursor on the interface indicate the direction it will travel.</p><p>Building the interface around the actions of a simple single-button mouse allows people with a range of disabilities to use the interface without lengthy training sessions.</p><p>&ldquo;Having an interface that individuals with a wide range of physical impairments can operate means we can provide access to a broad range of people, a form of universal design,&rdquo; Grice noted. &ldquo;Because of its capability, this is a very complex system, so the challenge we had to overcome was to make it accessible to individuals who have very limited control of their own bodies.&rdquo;</p><p>While the results of the study demonstrated what the researchers had set out to do, Kemp agrees that improvements can be made. The existing system is slow, and mistakes made by users can create significant setbacks. Still, he said, &ldquo;People could use this technology today and really benefit from it.&rdquo;</p><p>The cost and size of the PR2 would need to be significantly reduced for the system to be commercially viable, Evans suggested. Kemp says these studies point the way to a new type of assistive technology.&nbsp;</p><p>&ldquo;It seems plausible to me based on this study that robotic body surrogates could provide significant benefits to users,&rdquo; Kemp added.</p><p><em>This work was supported by the National Institute on Disability, Independent Living, and Rehabilitation Research (NIDILRR), grant 90RE5016-01-00 via RERC TechSAge, National Science Foundation Award IIS-1150157, by a National Science Foundation Graduate Research Fellowship Program Award, and the Residential Care Facilities for the Elderly of Fulton County Scholar Award.&nbsp;</em></p><p><em>Kemp is a cofounder, a board member, an equity holder, and the CTO of Hello Robot Inc., which is developing products related to this research. This research could affect his personal financial status. The terms of this arrangement have been reviewed and approved by Georgia Tech in accordance with its conflict of interest policies.</em></p><p><strong>CITATION</strong>: Phillip M. Grice and Charles C. 
Kemp, &ldquo;In-home and remote use of robotic body surrogates by people with profound motor deficits&rdquo; (<em>PLOS ONE</em>, 2019).&nbsp;<a href="https://doi.org/10.1371/journal.pone.0212904">https://doi.org/10.1371/journal.pone.0212904</a></p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0171&nbsp; USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (jtoon@gatech.edu).</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1552673872</created>  <gmt_created>2019-03-15 18:17:52</gmt_created>  <changed>1553198438</changed>  <gmt_changed>2019-03-21 20:00:38</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[An interface system that uses augmented reality technology could help individuals with profound motor impairments operate a humanoid robot to feed themselves and perform routine personal care tasks.]]></teaser>  <type>news</type>  <sentence><![CDATA[An interface system that uses augmented reality technology could help individuals with profound motor impairments operate a humanoid robot to feed themselves and perform routine personal care tasks.]]></sentence>  <summary><![CDATA[<p>An interface system that uses augmented reality technology could help individuals with profound motor impairments operate a humanoid robot to feed themselves and perform routine personal care tasks such as scratching an itch and applying skin lotion. The web-based interface displays a &ldquo;robot&rsquo;s eye view&rdquo; of surroundings to help users interact with the world through the machine.</p>]]></summary>  <dateline>2019-03-15T00:00:00-04:00</dateline>  <iso_dateline>2019-03-15T00:00:00-04:00</iso_dateline>  <gmt_dateline>2019-03-15 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>619310</item>          <item>619311</item>          <item>619312</item>          <item>619313</item>          <item>619315</item>      </media>  <hg_media>          <item>          <nid>619310</nid>          <type>image</type>          <title><![CDATA[Controlling the PR2 Robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[PR2-controls.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/PR2-controls.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/PR2-controls.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/PR2-controls.png?itok=J2VTbGi_]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[View used to control the PR2 robot]]></image_alt>                    <created>1552672836</created>          <gmt_created>2019-03-15 18:00:36</gmt_created>          <changed>1552672836</changed>          <gmt_changed>2019-03-15 18:00:36</gmt_changed>      </item>          <item>          <nid>619311</nid>          <type>image</type>          <title><![CDATA[Retrieving a cup with the robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[PR2-cup.png]]></image_name>  
          <image_path><![CDATA[/sites/default/files/images/PR2-cup.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/PR2-cup.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/PR2-cup.png?itok=WDBhy1H-]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[PR2 robot retrieves a cup.]]></image_alt>                    <created>1552672973</created>          <gmt_created>2019-03-15 18:02:53</gmt_created>          <changed>1552672973</changed>          <gmt_changed>2019-03-15 18:02:53</gmt_changed>      </item>          <item>          <nid>619312</nid>          <type>image</type>          <title><![CDATA[Henry Evans shaving with the robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[PR2-shaving.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/PR2-shaving.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/PR2-shaving.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/PR2-shaving.png?itok=3ouwR15t]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Shaving with a robot]]></image_alt>                    <created>1552673119</created>          <gmt_created>2019-03-15 18:05:19</gmt_created>          <changed>1552673119</changed>          <gmt_changed>2019-03-15 18:05:19</gmt_changed>      </item>          <item>          <nid>619313</nid>          <type>image</type>          <title><![CDATA[PR2 robot arm]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[PR2-arm.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/PR2-arm.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/PR2-arm.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/PR2-arm.png?itok=Q84dWRYp]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[PR2 robot arm]]></image_alt>                    <created>1552673270</created>          <gmt_created>2019-03-15 18:07:50</gmt_created>          <changed>1552673270</changed>          <gmt_changed>2019-03-15 18:07:50</gmt_changed>      </item>          <item>          <nid>619315</nid>          <type>image</type>          <title><![CDATA[PR2 humanoid robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[PR2-robot.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/PR2-robot.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/PR2-robot.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/PR2-robot.jpg?itok=nZo731_O]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[robot, humanoid robot, interface, augmented reality, PR2]]></image_alt>                    <created>1552673390</created>          <gmt_created>2019-03-15 18:09:50</gmt_created>          <changed>1552673390</changed>          <gmt_changed>2019-03-15 18:09:50</gmt_changed>      
</item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="1597"><![CDATA[Augmented Reality]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="180814"><![CDATA[humanoid robot]]></keyword>          <keyword tid="2815"><![CDATA[interface]]></keyword>          <keyword tid="180816"><![CDATA[PR2 Charlie Kemp]]></keyword>          <keyword tid="180817"><![CDATA[robotic body surrogate]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="618865">  <title><![CDATA[Ultra-Low Power Chips Help Make Small Robots More Capable]]></title>  <uid>27303</uid>  <body><![CDATA[<p>An ultra-low power hybrid chip inspired by the brain could help give palm-sized robots the ability to collaborate and learn from their experiences. Combined with new generations of low-power motors and sensors, the new application-specific integrated circuit (ASIC) &ndash; which operates on milliwatts of power &ndash; could help intelligent swarm robots operate for hours instead of minutes.</p><p>To conserve power, the chips use a hybrid digital-analog time-domain processor in which the pulse-width of signals encodes information. The neural network IC accommodates both model-based programming and collaborative reinforcement learning, potentially providing the small robots larger capabilities for reconnaissance, search-and-rescue and other missions.</p><p>Researchers from the Georgia Institute of Technology demonstrated robotic cars driven by the unique ASICs at the 2019 IEEE International Solid-State Circuits Conference (ISSCC). The research was sponsored by the Defense Advanced Research Projects Agency (DARPA) and the Semiconductor Research Corporation (SRC) through the Center for Brain-inspired Computing Enabling Autonomous Intelligence (CBRIC).</p><p>&ldquo;We are trying to bring intelligence to these very small robots so they can learn about their environment and move around autonomously, without infrastructure,&rdquo; said <a href="https://www.ece.gatech.edu/faculty-staff-directory/arijit-raychowdhury">Arijit Raychowdhury</a>, associate professor in Georgia Tech&rsquo;s <a href="http://www.ece.gatech.edu">School of Electrical and Computer Engineering</a>. &ldquo;To accomplish that, we want to bring low-power circuit concepts to these very small devices so they can make decisions on their own. 
There is a huge demand for very small but capable robots that do not require infrastructure.&rdquo;</p><p>The cars demonstrated by Raychowdhury and graduate students Ningyuan Cao, Muya Chang and Anupam Golder navigate through an arena floored by rubber pads and surrounded by cardboard block walls. As they search for a target, the robots must avoid traffic cones and each other, learning from the environment as they go and continuously communicating with each other.</p><p>The cars use inertial and ultrasound sensors to determine their location and detect objects around them. Information from the sensors goes to the hybrid ASIC, which serves as the &ldquo;brain&rdquo; of the vehicles. Instructions then go to a Raspberry Pi controller, which sends instructions to the electric motors.</p><p>In palm-sized robots, three major systems consume power: the motors and controllers used to drive and steer the wheels, the processor, and the sensing system. In the cars built by Raychowdhury&rsquo;s team, the low-power ASIC means that the motors consume the bulk of the power. &ldquo;We have been able to push the compute power down to a level where the budget is dominated by the needs of the motors,&rdquo; he said.</p><p>The team is working with collaborators on motors that use micro-electromechanical systems (MEMS) technology able to operate with much less power than conventional motors.&nbsp;</p><p>&ldquo;We would want to build a system in which sensing power, communications and computer power, and actuation are at about the same level, on the order of hundreds of milliwatts,&rdquo; said Raychowdhury, who is the ON Semiconductor Associate Professor in the School of Electrical and Computer Engineering. &ldquo;If we can build these palm-sized robots with efficient motors and controllers, we should be able to provide runtimes of several hours on a couple of AA batteries. We now have a good idea what kind of computing platforms we need to deliver this, but we still need the other components to catch up.&rdquo;</p><p>In time-domain computing, information is carried on two different voltages, encoded in the width of the pulses. That gives the circuits the energy-efficiency advantages of analog circuits with the robustness of digital devices.</p><p>&ldquo;The size of the chip is reduced by half, and the power consumption is one-third what a traditional digital chip would need,&rdquo; said Raychowdhury. &ldquo;We used several techniques in both logic and memory designs for reducing power consumption to the milliwatt range while meeting target performance.&rdquo;</p><p>With each pulse-width representing a different value, the system is slower than digital or analog devices, but Raychowdhury says the speed is sufficient for the small robots. (A milliwatt is a thousandth of a watt.)</p><p>&ldquo;For these control systems, we don&rsquo;t need circuits that operate at multiple gigahertz because the devices aren&rsquo;t moving that quickly,&rdquo; he said. &ldquo;We are sacrificing a little performance to get extreme power efficiencies. Even if the compute operates at 10 or 100 megahertz, that will be enough for our target applications.&rdquo;</p>
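<p>As a rough illustration of the pulse-width idea &ndash; a sketch of the general technique rather than the team&rsquo;s actual circuit &ndash; the encoding can be written in a few lines of Python. The 100-nanosecond timing range and the example weights are assumptions chosen for readability:</p><pre>
# Illustrative time-domain (pulse-width) encoding. The 100 ns range and
# the example weights are assumptions, not details of the Georgia Tech ASIC.

def encode(value, t_max_ns=100.0):
    """Map a normalized value in [0, 1] to a pulse width in nanoseconds."""
    return max(0.0, min(value, 1.0)) * t_max_ns  # clamp, then scale to time

def decode(pulse_ns, t_max_ns=100.0):
    """Recover the value from a measured pulse width."""
    return pulse_ns / t_max_ns

# A multiply-accumulate, the core neural-network operation, becomes a
# weighted sum of pulse widths accumulated as time:
weights = [0.2, 0.5, 0.3]
inputs = [0.9, 0.1, 0.4]
acc_ns = sum(w * encode(x) for w, x in zip(weights, inputs))
print(decode(acc_ns))  # 0.35 -- the weighted sum, carried as elapsed time
</pre><p>Because each operand rides on a time interval rather than a precise analog voltage, accumulating values is inexpensive, which is one intuition for the energy advantage the researchers describe.</p>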
<p>The 65-nanometer CMOS chips accommodate both kinds of learning appropriate for a robot. The system can be programmed to follow model-based algorithms, and it can learn from its environment using a reinforcement system that encourages better and better performance over time &ndash; much like a child who learns to walk by bumping into things.&nbsp;</p><p>&ldquo;You start the system out with a predetermined set of weights in the neural network so the robot can start from a good place and not crash immediately or give erroneous information,&rdquo; Raychowdhury said. &ldquo;When you deploy it in a new location, the environment will have some structures that it will recognize and some that the system will have to learn. The system will then make decisions on its own, and it will gauge the effectiveness of each decision to optimize its motion.&rdquo;</p><p>Communication between the robots allows them to collaborate to seek a target.&nbsp;</p><p>&ldquo;In a collaborative environment, the robot not only needs to understand what it is doing, but also what others in the same group are doing,&rdquo; he said. &ldquo;They will be working to maximize the total reward of the group as opposed to the reward of the individual.&rdquo;</p><p>With their ISSCC demonstration providing a proof-of-concept, the team is continuing to optimize designs and is working on a system-on-chip to integrate the computation and control circuitry.</p><p>&ldquo;We want to enable more and more functionality in these small robots,&rdquo; Raychowdhury added. &ldquo;We have shown what is possible, and what we have done will now need to be augmented by other innovations.&rdquo;</p><p>This project was supported by the Semiconductor Research Corporation under grant JUMP CBRIC task ID 2777.006.&nbsp;</p><p><strong>CITATION</strong>: Ningyuan Cao, Muya Chang, Arijit Raychowdhury, &ldquo;A 65 nm 1.1-to-9.1 TOPS/W Hybrid-Digital-Mixed-Signal Computing Platform for Accelerating Model-Based and Model Free Swarm Robotics.&rdquo; (2019 IEEE International Solid-State Circuits Conference).</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (jtoon@gatech.edu)</p><p><strong>Writer</strong>: John Toon</p><p><strong>Want to stay informed about the latest Georgia Tech research? Subscribe to our free monthly e-newsletter at</strong>&nbsp;<a href="http://www.rh.gatech.edu/subscribe">www.rh.gatech.edu/subscribe</a></p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1551839777</created>  <gmt_created>2019-03-06 02:36:17</gmt_created>  <changed>1551839839</changed>  <gmt_changed>2019-03-06 02:37:19</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[An ultra-low power hybrid chip inspired by the brain could help give palm-sized robots the ability to collaborate and learn from their experiences.]]></teaser>  <type>news</type>  <sentence><![CDATA[An ultra-low power hybrid chip inspired by the brain could help give palm-sized robots the ability to collaborate and learn from their experiences.]]></sentence>  <summary><![CDATA[<p>An ultra-low power hybrid chip inspired by the brain could help give palm-sized robots the ability to collaborate and learn from their experiences. 
Combined with new generations of low-power motors and sensors, the new application-specific integrated circuit (ASIC) &ndash; which operates on milliwatts of power &ndash; could help intelligent swarm robots operate for hours instead of minutes.</p>]]></summary>  <dateline>2019-03-05T00:00:00-05:00</dateline>  <iso_dateline>2019-03-05T00:00:00-05:00</iso_dateline>  <gmt_dateline>2019-03-05 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>618859</item>          <item>618862</item>          <item>618860</item>          <item>618861</item>          <item>618863</item>      </media>  <hg_media>          <item>          <nid>618859</nid>          <type>image</type>          <title><![CDATA[Ultra-low power chip runs robotic car]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[low-power-023.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/low-power-023.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/low-power-023.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/low-power-023.jpg?itok=WXxA967Y]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Robotic car with ultra-low power chip]]></image_alt>                    <created>1551838422</created>          <gmt_created>2019-03-06 02:13:42</gmt_created>          <changed>1551838422</changed>          <gmt_changed>2019-03-06 02:13:42</gmt_changed>      </item>          <item>          <nid>618862</nid>          <type>image</type>          <title><![CDATA[Placing robotic car into a test arena 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[low-power-001.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/low-power-001.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/low-power-001.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/low-power-001.jpg?itok=WQC-AIOT]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Placing robotic car into a test arena]]></image_alt>                    <created>1551838865</created>          <gmt_created>2019-03-06 02:21:05</gmt_created>          <changed>1551838865</changed>          <gmt_changed>2019-03-06 02:21:05</gmt_changed>      </item>          <item>          <nid>618860</nid>          <type>image</type>          <title><![CDATA[hybrid chip operates on milliwatts]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[low-power-018.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/low-power-018.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/low-power-018.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/low-power-018.jpg?itok=c81mot0W]]></image_740>            
<image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[hybrid ultra-low power chip]]></image_alt>                    <created>1551838616</created>          <gmt_created>2019-03-06 02:16:56</gmt_created>          <changed>1551838616</changed>          <gmt_changed>2019-03-06 02:16:56</gmt_changed>      </item>          <item>          <nid>618861</nid>          <type>image</type>          <title><![CDATA[Placing robotic car into a test arena]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[low-power-004.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/low-power-004.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/low-power-004.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/low-power-004.jpg?itok=tk8DpuJ0]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Placing robotic car into a test arena]]></image_alt>                    <created>1551838750</created>          <gmt_created>2019-03-06 02:19:10</gmt_created>          <changed>1551838750</changed>          <gmt_changed>2019-03-06 02:19:10</gmt_changed>      </item>          <item>          <nid>618863</nid>          <type>image</type>          <title><![CDATA[Research team for ultra-low power chips]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[low-power-020.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/low-power-020.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/low-power-020.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/low-power-020.jpg?itok=wIjClaCb]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Research team for ultra-low power hybrid chips]]></image_alt>                    <created>1551839033</created>          <gmt_created>2019-03-06 02:23:53</gmt_created>          <changed>1551839033</changed>          <gmt_changed>2019-03-06 02:23:53</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="180716"><![CDATA[robotic car]]></keyword>          <keyword tid="7264"><![CDATA[autonomous]]></keyword>          <keyword tid="180721"><![CDATA[time-domain]]></keyword>          <keyword tid="180718"><![CDATA[ultra-low power]]></keyword>          <keyword tid="3251"><![CDATA[chip]]></keyword>          <keyword tid="180719"><![CDATA[hybrid chip]]></keyword>          <keyword tid="1912"><![CDATA[brain]]></keyword>          <keyword tid="178517"><![CDATA[neural network]]></keyword>          
<keyword tid="4897"><![CDATA[collaborative]]></keyword>          <keyword tid="856"><![CDATA[Intelligence]]></keyword>      </keywords>  <core_research_areas>          <term tid="39451"><![CDATA[Electronics and Nanotechnology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="618439">  <title><![CDATA[When Sand-Slithering Snakes Behave Like Light Waves]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Desert snakes slithering across the sand at night can encounter obstacles such as plants or twigs that alter the direction of their travel. While studying that motion to learn how limbless animals control their bodies in such environments, researchers discovered that snakes colliding with these obstacles mimic aspects of light or subatomic particles when they encounter a diffraction grating.</p><p>The effect of this &ldquo;mechanical diffraction&rdquo; allowed researchers to observe how the snakes&rsquo; trajectories were altered through passive mechanisms governed by the skeletal and muscular dynamics of the animals&rsquo; propagating body waves. The researchers studied live snakes as they slithered through an obstacle made up of six force-sensitive rigid pegs that buckled the animals&rsquo; bodies, changing their paths in predictable ways.</p><p>The results, described February 25 in the journal <em>Proceedings of the National Academy of Sciences</em>, indicate that the Western Shovel-nosed snakes (<em>Chionactis occipitalis</em>) do not deliberately change direction when they encounter obstacles while speeding across the sand. Understanding the movement of these limbless animals could help engineers improve the control of autonomous search and rescue robots designed to operate on sand, grass and other complex environments.&nbsp;</p><p>&ldquo;The idea behind passive dynamics is that there are waveform shape changes being made by the animal that are driven entirely by the passive properties of their bodies,&rdquo; said Perrin Schiebel, a recent Ph.D. graduate of the <a href="http://www.physics.gatech.edu">School of Physics</a> at the Georgia Institute of Technology. &ldquo;Instead of sending a signal to activate a muscle, the interaction of the snakes&rsquo; bodies with the external environment is what causes the shape change. The forces of the obstacles are pushing the snake bodies into a new shape.&rdquo;</p><p>The colorful shovel-nosed snake normally uses a sinusoidal S-shaped wave to move across the deserts of the Southwest United States. Running into rigid pegs in a laboratory environment doesn&rsquo;t lead it to actively change that waveform, which Schiebel and colleagues studied using high-speed video cameras with eight different animals.&nbsp;</p><p>In a study supported by the National Science Foundation, Army Research Office, Defense Advanced Projects Agency, and a National Defense Science and Engineering Graduate Fellowship, the researchers used 253 snake trips to build up a diffraction pattern. Remarkably, the pattern also revealed that the scattering directions were &ldquo;quantized&rdquo; such that the probability of finding a snake behind the array could be represented in a pattern mimicking wave interference. 
<p>&ldquo;One problem with robots moving in the real world is that we don&rsquo;t yet have principles by which we can understand how best to control these robots on granular surfaces like sand, leaf litter, rubble or grass,&rdquo; said <a href="http://www.physics.gatech.edu/user/daniel-goldman">Daniel Goldman</a>, Dunn Family Professor in Georgia Tech&rsquo;s School of Physics and a researcher in the Petit Institute for Bioengineering and Bioscience. &ldquo;The point of this study was to try to understand how limbless locomotors, which have long bodies that can bend in interesting ways using potentially complicated neuromechanical control schemes, manage to move through complicated terrain.&rdquo;</p><p>The snake experiment was suggested by a robotic study done by postdoctoral fellow Jennifer Rieser, who found similar behavior among robots encountering obstacles.</p><p>&ldquo;The robot tends to have aspects that mimic features of the subatomic world &mdash; the quantum world,&rdquo; Goldman explained. &ldquo;When it collides with barriers, a robot propagates through those barriers using waves of body bending. Its trajectory deviates as it exits the barriers, and many repeated trials reveal a &lsquo;lumpy&rsquo; scattering pattern, analogous to experiments. We realized that we could use this surprising and beautiful phenomenon, classical physics but with self-propulsion a key feature, as a scattering experiment to interrogate the control scheme used by the snakes.&rdquo;</p><p>Experimentally, the researchers used a &ldquo;snake arena&rdquo; covered with shag carpet to mimic sand. Undergraduate students Alex Hubbard and Lillian Chen released the snakes one at a time into the arena and encouraged them to slither through the grating.</p><p>The eyes of the desert snakes are naturally covered with scales to protect them. The researchers used children&rsquo;s face paint to temporarily &ldquo;blindfold&rdquo; the animals so they would not be distracted by the researchers. The paint did not harm the animals.</p><p>&ldquo;When we put the snakes down in the arena, they started moving using the same waveform they use on desert sand,&rdquo; explained Schiebel. &ldquo;They would then encounter the dowel grating, pass through it, and continue on the other side still using that waveform.&rdquo;</p><p>Instead of continuing to travel through the arena in a straight line, the snakes would exit at a different angle, though they did not grab the posts or use them to assist their movement. Schiebel worked with Zeb Rocklin, a Georgia Tech assistant professor of physics, to model the directional changes. The model showed how simple interactions between the snakes&rsquo; wave pattern and the grating produce patterns of favored scattering directions.</p><p>&ldquo;We think the snake is essentially operating in a mode that control engineers would consider &lsquo;open loop,&rsquo;&rdquo; said Goldman. 
&ldquo;It is setting a particular motor program on its body, which generates the characteristic wave pattern, and when it collides with the obstacle, its body mechanics allow it to deform and move the posts without degrading its speed.&rdquo;</p><p>Goldman believes the work could help developers of snake-like robots improve their control schemes.</p><p>&ldquo;We think that our discoveries of the role of passive dynamics in the snake can facilitate new snake robot designs that will enable them to move through complex environments more fluidly,&rdquo; he said. &ldquo;The goal would be to build search and rescue robots that can get into these complex environments and help first responders.&rdquo;</p><p>And as a bonus, Goldman said, &ldquo;We find that the richness of interactions between self-propelled systems like snakes and robots with their environment is fascinating from the standpoint of &lsquo;active matter&rsquo; physics.&rdquo;</p><p><em>This work was supported by National Science Foundation Physics of Living Systems program awards PHY-1205878, PHY-1150760 and CMMI-1361778; by the Army Research Office through award W911NF-11-1-0514; U.S. DoD National Defense Science and Engineering Graduate Fellowship (NDSEG) 32 CFR 168a; and by the Defense Advanced Research Projects Agency (DARPA) Young Faculty Award. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the sponsor organizations.</em></p><p><strong>CITATION</strong>: Perrin E. Schiebel, et al., &ldquo;Mechanical diffraction reveals the role of passive dynamics in a slithering snake,&rdquo; (Proceedings of the National Academy of Sciences, 2019).</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (jtoon@gatech.edu)</p><p><strong>Writer</strong>: John Toon</p><p>&nbsp;</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1551124230</created>  <gmt_created>2019-02-25 19:50:30</gmt_created>  <changed>1551373370</changed>  <gmt_changed>2019-02-28 17:02:50</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A new study shows how the motion of snakes moving across a sandy surface can be affected by obstacles.]]></teaser>  <type>news</type>  <sentence><![CDATA[A new study shows how the motion of snakes moving across a sandy surface can be affected by obstacles.]]></sentence>  <summary><![CDATA[<p>Desert snakes slithering across the sand at night can encounter obstacles such as plants or twigs that alter the direction of their travel -- and cause them to mimic aspects of light or subatomic particles when they encounter a diffraction grating.</p>]]></summary>  <dateline>2019-02-25T00:00:00-05:00</dateline>  <iso_dateline>2019-02-25T00:00:00-05:00</iso_dateline>  <gmt_dateline>2019-02-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>618431</item>          <item>618433</item>          <item>618432</item>          <item>618434</item>      </media>  <hg_media>          <item>          <nid>618431</nid>          <type>image</type>       
   <title><![CDATA[Studying snakes on granular surfaces]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[snakes-as-waves-012.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/snakes-as-waves-012.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/snakes-as-waves-012.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/snakes-as-waves-012.jpg?itok=_WydX_zm]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[studying snakes on a granular surface]]></image_alt>                    <created>1551122968</created>          <gmt_created>2019-02-25 19:29:28</gmt_created>          <changed>1551122968</changed>          <gmt_changed>2019-02-25 19:29:28</gmt_changed>      </item>          <item>          <nid>618433</nid>          <type>image</type>          <title><![CDATA[Snake moving through peg array]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[snakes-as-waves-008.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/snakes-as-waves-008.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/snakes-as-waves-008.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/snakes-as-waves-008.jpg?itok=pplw19kR]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Snake moving through peg array]]></image_alt>                    <created>1551123245</created>          <gmt_created>2019-02-25 19:34:05</gmt_created>          <changed>1551123245</changed>          <gmt_changed>2019-02-25 19:34:05</gmt_changed>      </item>          <item>          <nid>618432</nid>          <type>image</type>          <title><![CDATA[Perrin Schiebel with snake arena]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[snakes-as-waves-007.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/snakes-as-waves-007.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/snakes-as-waves-007.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/snakes-as-waves-007.jpg?itok=eA6buMFX]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Researcher Perrin Schiebel with snake]]></image_alt>                    <created>1551123114</created>          <gmt_created>2019-02-25 19:31:54</gmt_created>          <changed>1551123114</changed>          <gmt_changed>2019-02-25 19:31:54</gmt_changed>      </item>          <item>          <nid>618434</nid>          <type>image</type>          <title><![CDATA[Snake research team]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[snakes-as-waves-020.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/snakes-as-waves-020.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/snakes-as-waves-020.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/snakes-as-waves-020.jpg?itok=wYZctqyR]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Snake research team]]></image_alt>                    <created>1551123354</created>          <gmt_created>2019-02-25 19:35:54</gmt_created>          <changed>1551123354</changed>          <gmt_changed>2019-02-25 19:35:54</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="126571"><![CDATA[go-PetitInstitute]]></keyword>          <keyword tid="169001"><![CDATA[Snake]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="169242"><![CDATA[sand]]></keyword>          <keyword tid="180635"><![CDATA[passive dynamics]]></keyword>          <keyword tid="180632"><![CDATA[light wave]]></keyword>          <keyword tid="7120"><![CDATA[wave]]></keyword>          <keyword tid="180631"><![CDATA[diffraction]]></keyword>          <keyword tid="47881"><![CDATA[Dan Goldman]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39481"><![CDATA[National Security]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="615816">  <title><![CDATA[Executive Director Selected at Institute for Robotics and Intelligent Machines]]></title>  <uid>27303</uid>  <body><![CDATA[<p>The Georgia Institute of Technology has selected Seth Hutchinson as the new executive director of the <a href="http://www.robotics.gatech.edu/">Institute for Robotics and Intelligent Machines</a> (IRIM). <a href="https://www.cc.gatech.edu/~seth/">Hutchinson</a> is a professor and KUKA Chair for Robotics in Georgia Tech&rsquo;s College of Computing and has served as associate director of IRIM.</p><p>Before joining Georgia Tech in January 2018, he was a professor of electrical and computer engineering at the University of Illinois at Urbana-Champaign. Hutchinson holds a bachelor of science, master of science and Ph.D. 
in electrical engineering from Purdue University.</p><p>&ldquo;Seth is internationally known for his work in robotics as evidenced by his more than 200 publications, his role as editor-in-chief of the <em>IEEE Transactions on Robotics</em> and his recent selection as president-elect of the IEEE Robotics and Automation Society,&rdquo; said Chaouki Abdallah, Georgia Tech&rsquo;s executive vice president for research. &ldquo;I am pleased that he will be the new executive director of Georgia Tech&rsquo;s Institute for Robotics and Intelligent Machines, and I look forward to working with him toward the goal of making Georgia Tech the leader in robotics, autonomy and manufacturing.&rdquo;&nbsp;</p><p>Hutchinson&rsquo;s research interests lie in vision-based control, motion planning, planning under uncertainty, pursuit-evasion, localization and mapping, locomotion and bio-inspired robotics. Hutchinson is the coauthor of two books, &ldquo;<em>Principles of Robot Motion: Theory, Algorithms, and Implementations</em>&rdquo; and &ldquo;<em>Robot Modeling and Control</em>.&rdquo;</p><p>&ldquo;The robotics research happening here at Georgia Tech is among the best in the world, from actuators to high-level reasoning,&rdquo; he said. &ldquo;I honestly cannot think of a place I&rsquo;d rather be right now than here, working with this group of people.&rdquo;</p><p>At Georgia Tech, IRIM serves as an umbrella under which robotics researchers, educators and students from across campus can come together to advance the Institute&rsquo;s many diverse robotics activities.&nbsp;</p><p>IRIM&rsquo;s mission is to create new and exciting opportunities for faculty collaboration; educate the next generation of robotics experts, entrepreneurs, and academic leaders; and partner with industry and government to pursue truly transformative robotics research. IRIM serves more than 90 faculty members, 180 graduate students and 40 robotics labs. The robotics program at Georgia Tech attracts more than $60 million in research funding annually.</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (jtoon@gatech.edu).</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1546527473</created>  <gmt_created>2019-01-03 14:57:53</gmt_created>  <changed>1546527766</changed>  <gmt_changed>2019-01-03 15:02:46</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Seth Hutchinson has been named executive director of the Institute for Robotics and Intelligent Machines.]]></teaser>  <type>news</type>  <sentence><![CDATA[Seth Hutchinson has been named executive director of the Institute for Robotics and Intelligent Machines.]]></sentence>  <summary><![CDATA[<p>The Georgia Institute of Technology has selected Seth Hutchinson as the new executive director of the Institute for Robotics and Intelligent Machines (IRIM). 
Hutchinson is a professor and KUKA Chair for Robotics in Georgia Tech&rsquo;s College of Computing and has served as associate director of IRIM.</p>]]></summary>  <dateline>2019-01-03T00:00:00-05:00</dateline>  <iso_dateline>2019-01-03T00:00:00-05:00</iso_dateline>  <gmt_dateline>2019-01-03 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>615815</item>          <item>615814</item>      </media>  <hg_media>          <item>          <nid>615815</nid>          <type>image</type>          <title><![CDATA[Seth Hutchinson, executive director of IRIM Photo 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[seth-hutchinson-9718.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/seth-hutchinson-9718.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/seth-hutchinson-9718.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/seth-hutchinson-9718.jpg?itok=pH34FHnM]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Seth Hutchinson with robotics lab]]></image_alt>                    <created>1546526815</created>          <gmt_created>2019-01-03 14:46:55</gmt_created>          <changed>1546526815</changed>          <gmt_changed>2019-01-03 14:46:55</gmt_changed>      </item>          <item>          <nid>615814</nid>          <type>image</type>          <title><![CDATA[Seth Hutchinson, executive director of IRIM]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[seth-hutchinson-9688.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/seth-hutchinson-9688.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/seth-hutchinson-9688.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/seth-hutchinson-9688.jpg?itok=S4b0XovF]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Seth Hutchinson with robotics lab]]></image_alt>                    <created>1546526715</created>          <gmt_created>2019-01-03 14:45:15</gmt_created>          <changed>1546526715</changed>          <gmt_changed>2019-01-03 14:45:15</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="169760"><![CDATA[Seth Hutchinson]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="174636"><![CDATA[intelligent machines]]></keyword> 
         <keyword tid="6503"><![CDATA[automation]]></keyword>          <keyword tid="78271"><![CDATA[IRIM]]></keyword>      </keywords>  <core_research_areas>          <term tid="39461"><![CDATA[Manufacturing, Trade, and Logistics]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="613266">  <title><![CDATA[How the Elephant Uses its Trunk to Eat]]></title>  <uid>27303</uid>  <body><![CDATA[<p>A new study demonstrates the physics that elephants use to feed themselves the massive quantities of leaves, fruit and roots needed to sustain their multi-ton bodies.&nbsp;</p><p>A human can pick up multiple objects at once by squeezing them together with both hands and arms. An African elephant also picks up many items at once but with only one appendage&mdash;its soft, heavy trunk. How the elephant solves this challenge could provide inspiration for future robotics.&nbsp;</p><p>A wild African elephant eats rapidly, consuming 190 grams of food a minute, to provide adequate fuel for its vast bulk. &ldquo;Elephants are in a rush when they are eating,&rdquo; said David L. Hu, associate professor in the School of Mechanical Engineering and the School of Biology at the Georgia Institute of Technology. The elephant diet consists of large volumes of plant materials such as leaves, fruit and roots. To eat these, elephants sweep loose items into a pile and crush them into a manageable solid that can be picked up by the trunk.&nbsp;</p><p>&ldquo;They don&rsquo;t just use the trunk&rsquo;s strong muscles to squeeze the plants together,&rdquo; said Hu. &ldquo;The elephants also use the weight of the trunk, and they do that by forming a joint in the trunk. The trunk below the joint becomes a stiff pillar that applies weight to the pile of plant materials.&rdquo;&nbsp;</p><p>About 30 percent of the applied force is derived from the pillar&rsquo;s weight alone, and about 70 percent from exerting muscular effort, according to a new study published in the <em>Journal of the Royal Society Interface</em> by Hu and colleagues at Georgia Tech, the Rochester Institute of Technology and Zoo Atlanta.&nbsp;</p><p>The African elephant can raise or lower the trunk joint&rsquo;s height by up to 11 centimeters to increase or reduce the applied force. &ldquo;When elephants need more force, the joint is higher up on the trunk,&rdquo; Hu said. Elephant trunks weigh about 150 kilograms and have 40,000 muscles. &ldquo;The huge number of muscles in the trunk allows the elephant great freedom for where it puts this joint.&rdquo;</p><p>Hu and his colleagues studied a 34-year-old female African elephant (Loxodonta africana) over several weeks in the summer of 2017. All experiments were supervised by the staff at Zoo Atlanta. Food was arranged by hand into a pile in the center of a force plate to measure how much force the animal generated.&nbsp;</p><p>The elephant&rsquo;s trunk is similar to other boneless organs in nature such as the octopus&rsquo;s arm and the human tongue. But unlike an octopus&rsquo;s arm, an elephant&rsquo;s trunk is heavy enough to provide significant force on an object without muscular pressure. 
<p>Using mathematical models, the researchers found that the greater the number of objects to be squeezed and picked up, the greater the force that must be applied.&nbsp;</p><p>&ldquo;Picking up two objects requires very little force to press them together, while picking up 40,000 objects requires a lot of force,&rdquo; Hu said. This principle was tested experimentally with the live elephant by presenting multiple food items varying in number from four to 40,000. The experiments showed that the elephant could vary the force applied with its trunk by a factor of four depending on the number of food items to be picked up.</p><p>This research could have applications in robotics, where heavier machines would appear to have few advantages over smaller ones. But, in the future, heavy robotic manipulators could be designed with several adjustable joints that use the device&rsquo;s own weight to provide adjustable pressure and save energy. There are currently no commercial robots designed to apply their own weight to objects, Hu noted.&nbsp;</p><p>&ldquo;You could have future robots with several joints, which could apply various weight pressures below joints to help compress objects together for lifting them efficiently,&rdquo; said Hu. &ldquo;This would allow you to use the weight of the joints themselves to provide force instead of relying on batteries and extra motors to apply these forces, and that would mean using less energy. For instance, you could have a heavy robot with four joints, and by bending the top joint, the weight below it could apply a load. If you wanted to provide less weight pressure, you could instead bend the second-from-the-top joint. This study shows that there are some advantages for robots in being big and heavy.&rdquo;</p><p>African elephants like the ones in this study have two muscular extensions at the tip of their trunk resembling a pair of fingers that also could be studied as models for future robotics. It&rsquo;s not well known that elephants have such projections, and this understanding could inform work that is already underway. &ldquo;The elephant&rsquo;s technique with these extensions might be used to develop soft robotic grippers that can pick up delicate items such as fruit without damaging them,&rdquo; Hu noted.</p><p><em>This work was supported by the U.S. Army Research Laboratory and the U.S. 
Army Research Office Mechanical Sciences Division, Complex Dynamics and Systems Program, under contract W911NF-12-R-0011.</em></p><p><strong>CITATION</strong>: Jianing Wu, et al., &ldquo;Elephant trunks form joints to squeeze together small objects,&rdquo; (Journal of the Royal Society Interface 15, 2018) <a href="http://dx.doi.org/10.1098/rsif.2018.0377">http://dx.doi.org/10.1098/rsif.2018.0377</a></p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (jtoon@gatech.edu).</p><p><strong>Writer</strong>: John Tibbetts</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1540429604</created>  <gmt_created>2018-10-25 01:06:44</gmt_created>  <changed>1540469233</changed>  <gmt_changed>2018-10-25 12:07:13</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A new study shows how elephants use their trunks to compress food before eating it.]]></teaser>  <type>news</type>  <sentence><![CDATA[A new study shows how elephants use their trunks to compress food before eating it.]]></sentence>  <summary><![CDATA[<p>A new study demonstrates the physics that elephants use to feed themselves the massive quantities of leaves, fruit and roots needed to sustain their multi-ton bodies.&nbsp;</p>]]></summary>  <dateline>2018-10-24T00:00:00-04:00</dateline>  <iso_dateline>2018-10-24T00:00:00-04:00</iso_dateline>  <gmt_dateline>2018-10-24 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>613263</item>          <item>613264</item>          <item>613265</item>      </media>  <hg_media>          <item>          <nid>613263</nid>          <type>image</type>          <title><![CDATA[Elephant at Zoo Atlanta]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[elephant_tara_ZA_2488-b.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/elephant_tara_ZA_2488-b.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/elephant_tara_ZA_2488-b.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/elephant_tara_ZA_2488-b.jpg?itok=_e70oxSX]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Elephant at Zoo Atlanta]]></image_alt>                    <created>1540429042</created>          <gmt_created>2018-10-25 00:57:22</gmt_created>          <changed>1540429042</changed>          <gmt_changed>2018-10-25 00:57:22</gmt_changed>      </item>          <item>          <nid>613264</nid>          <type>image</type>          <title><![CDATA[Elephant research enclosure]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[elephant.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/elephant.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/elephant.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/elephant.jpg?itok=UbMxVsRL]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Elephant in experimental enclosure at Zoo Atlanta]]></image_alt>                    <created>1540429170</created>          <gmt_created>2018-10-25 00:59:30</gmt_created>          <changed>1540429170</changed>          <gmt_changed>2018-10-25 00:59:30</gmt_changed>      </item>          <item>          <nid>613265</nid>          <type>image</type>          <title><![CDATA[Elephant trunk]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[eleplant-trunk_5340.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/eleplant-trunk_5340.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/eleplant-trunk_5340.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/eleplant-trunk_5340.jpg?itok=SrXISx8D]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Closeup of elephant trunk]]></image_alt>                    <created>1540429265</created>          <gmt_created>2018-10-25 01:01:05</gmt_created>          <changed>1540429265</changed>          <gmt_changed>2018-10-25 01:01:05</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="96651"><![CDATA[elephant]]></keyword>          <keyword tid="179490"><![CDATA[elephant trunk]]></keyword>          <keyword tid="6765"><![CDATA[zoo atlanta]]></keyword>          <keyword tid="297"><![CDATA[David Hu]]></keyword>          <keyword tid="126571"><![CDATA[go-PetitInstitute]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="611967">  <title><![CDATA[Drones, Driverless Cars and Difficult Decisions ]]></title>  <uid>27918</uid>  <body><![CDATA[<p>Robots are here. 
They&rsquo;ve entered our daily lives and can be found in our homes, hospitals and streets.&nbsp;</p><p>These robot and human interactions raise a series of questions that the general public and lawmakers must face.&nbsp;</p><p>&ldquo;When robots start truly engaging with us, what does that really mean?&rdquo; asked Magnus Egerstedt, the Steve W. Chaddick School Chair and a professor in the School of Electrical and Computer Engineering. &ldquo;And what does it mean for us to interact with them?&rdquo;</p><p>The emerging debate on ethics and robotics was the focus of two panel discussions Georgia Tech hosted Tuesday in Washington, D.C. The event &ndash; &ldquo;Drones, Driverless Cars and Difficult Decisions&rdquo; &ndash; examined the expectations humans have about robots&rsquo; capabilities and limits. Panelists also discussed the responsibilities that researchers, scientists, corporations and policymakers have.&nbsp;</p><p>A luncheon roundtable, held on Capitol Hill, attracted congressional staffers, representatives from national associations and others from the D.C. policy community.</p><p>During the evening roundtable held at the National Press Club, reporters from&nbsp;<em>Inside Higher Ed</em>,&nbsp;<em>The Washington Post</em>&nbsp;and&nbsp;<em>U.S. News &amp; World Report</em>&nbsp;asked questions and helped guide the conversation.</p><p>In addition to Egerstedt, two other Georgia Tech professors served on the panel: Ronald Arkin, director of the Mobile Robot Laboratory in the College of Computing, and Ayanna Howard, chair of the School of Interactive Computing in the College of Computing and the Linda J. and Mark C. Smith Chair Professor.</p><p>The other four panelists were: Cindy Grimm, an associate professor of mechanical engineering at Oregon State University; Benjamin Kuipers, a professor of computer science and engineering at the University of Michigan; Bertram Malle, a professor in the department of cognitive, linguistic and psychological sciences at Brown University; and Reid Simmons, a research professor in robotics and computer science at Carnegie Mellon University.&nbsp;</p><p>Arkin said for years researchers focused on making new discoveries without paying as much attention to the implications. But there is growing acknowledgement of the responsibilities that roboticists have to make sure they don&rsquo;t promise more than they can deliver.&nbsp;</p><p>He noted the complexity of programming a robot to be good.&nbsp;</p><p>&ldquo;We don&rsquo;t have the answers for all this yet,&rdquo; he said. &ldquo;We are just beginning to make forays into this space &hellip; Please be patient.&rdquo;&nbsp;</p>]]></body>  <author>Laura Diamond</author>  <status>1</status>  <created>1537924597</created>  <gmt_created>2018-09-26 01:16:37</gmt_created>  <changed>1537967334</changed>  <gmt_changed>2018-09-26 13:08:54</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech hosts a panel discussion in Washington, D.C. about the emerging debate on ethics and robotics.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech hosts a panel discussion in Washington, D.C. about the emerging debate on ethics and robotics.]]></sentence>  <summary><![CDATA[<p>Georgia Tech hosts a panel discussion in Washington, D.C. 
about the emerging debate on ethics and robotics.</p>]]></summary>  <dateline>2018-09-25T00:00:00-04:00</dateline>  <iso_dateline>2018-09-25T00:00:00-04:00</iso_dateline>  <gmt_dateline>2018-09-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[<p>National Roundtable Initiative</p><p>Georgia Tech&rsquo;s national media roundtable initiative brings together thought leaders from different organizations to discuss important issues. It is the result of a partnership among Georgia Tech&rsquo;s Office of Development, Institute Communications, Office of Government and Community Relations, and individual colleges and units.&nbsp;</p><p>This was Georgia Tech&rsquo;s ninth roundtable; past events have examined the changing landscape in higher education, college admissions trends, lessons learned since Hurricane Katrina and how to attract more female engineers and more African-American men into science, technology, engineering and mathematics.</p>]]></sidebar>  <email><![CDATA[laura.diamond@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>For media inquiries:&nbsp;Laura Diamond,&nbsp;<a href="mailto:laura.diamond@gatech.edu">laura.diamond@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>611973</item>      </media>  <hg_media>          <item>          <nid>611973</nid>          <type>image</type>          <title><![CDATA[Robotarium Robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Dn9CkHzUwAAz07W.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Dn9CkHzUwAAz07W.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Dn9CkHzUwAAz07W.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Dn9CkHzUwAAz07W.jpg?itok=vLMXD6XQ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Robotarium Robot]]></image_alt>                    <created>1537967316</created>          <gmt_created>2018-09-26 13:08:36</gmt_created>          <changed>1537967316</changed>          <gmt_changed>2018-09-26 13:08:36</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="133"><![CDATA[Special Events and Guest Speakers]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="133"><![CDATA[Special Events and Guest Speakers]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="1496"><![CDATA[Ethics]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>          <topic tid="71901"><![CDATA[Society and Culture]]></topic>      </news_room_topics>  
<files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="603589">  <title><![CDATA[The Minds of the New Machines - Machine Learning at Georgia Tech]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Machine learning has been around for decades, but the advent of big data and more powerful computers has increased its impact significantly&nbsp;&mdash; moving machine learning beyond pattern recognition and natural language processing into a broad array of scientific disciplines.</p><p>A subcategory of artificial intelligence, machine learning deals with the construction of algorithms that enable computers to learn from and react to data rather than following explicitly programmed instructions. &ldquo;Machine-learning algorithms build a model based on inputs and then use that model to make other hypotheses, predictions, or decisions,&rdquo; explained&nbsp;<a href="https://www.cc.gatech.edu/people/irfan-essa">Irfan Essa</a>, professor and associate dean in Georgia Tech&rsquo;s&nbsp;<a href="http://www.cc.gatech.edu/">College of Computing</a>,&nbsp;who also directs the Institute&rsquo;s&nbsp;<a href="http://ml.gatech.edu/">Center for Machine Learning</a>.</p><p>Established in June 2016, the Center for Machine Learning comprises researchers from six colleges and 13 schools at Georgia Tech&nbsp;&mdash; a number that keeps growing. &ldquo;Among our goals is to better coordinate research efforts across campus, serve as a home for machine learning leaders, and train the next generation of leaders,&rdquo; Essa said, referring to Georgia Tech&rsquo;s new&nbsp;<a href="http://www.rh.gatech.edu/features/minds-new-machines#phd-program">Ph.D. program in machine learning</a>.</p><p>Within the center, researchers are striving to advance both basic and applied science. &ldquo;For example, one foundational goal is to really understand deep learning at its core,&rdquo; Essa said. &ldquo;We want to develop new theories and innovative algorithms, rather than just using deep learning as a black box for inputs and outputs.&rdquo; And on the applied research front, the center has seven focal areas: health care, education, logistics, social networks, the financial sector, information security, and robotics.</p><p>See the <a href="http://www.rh.gatech.edu/features/minds-new-machines">complete article</a> from <em>Research Horizons</em> magazine.</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1520622000</created>  <gmt_created>2018-03-09 19:00:00</gmt_created>  <changed>1520622091</changed>  <gmt_changed>2018-03-09 19:01:31</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers are advancing the basic and applied science of machine learning.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers are advancing the basic and applied science of machine learning.]]></sentence>  <summary><![CDATA[<p>Machine learning has been around for decades, but the advent of big data and more powerful computers has increased its impact significantly. 
Georgia Tech researchers are advancing both the basic and applied science involved.</p>]]></summary>  <dateline>2018-03-09T00:00:00-05:00</dateline>  <iso_dateline>2018-03-09T00:00:00-05:00</iso_dateline>  <gmt_dateline>2018-03-09 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[New theories and innovative algorithms support improved prediction and decision-making.]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>603587</item>          <item>603588</item>      </media>  <hg_media>          <item>          <nid>603587</nid>          <type>image</type>          <title><![CDATA[Minds of the New Machines]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[machines.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/machines.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/machines.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/machines.jpg?itok=IzTGpecK]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Graphic for Minds of the New Machines]]></image_alt>                    <created>1520621292</created>          <gmt_created>2018-03-09 18:48:12</gmt_created>          <changed>1520621292</changed>          <gmt_changed>2018-03-09 18:48:12</gmt_changed>      </item>          <item>          <nid>603588</nid>          <type>image</type>          <title><![CDATA[Anticipatory intelligence]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[briscoe-kira.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/briscoe-kira.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/briscoe-kira.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/briscoe-kira.jpg?itok=wDTlzMez]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Erica Briscoe and Zsolt Kira with news screens]]></image_alt>                    <created>1520621446</created>          <gmt_created>2018-03-09 18:50:46</gmt_created>          <changed>1520621446</changed>          <gmt_changed>2018-03-09 18:50:46</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="9167"><![CDATA[machine learning]]></keyword>          <keyword tid="5660"><![CDATA[algorithms]]></keyword>          <keyword tid="2835"><![CDATA[ai]]></keyword>          
<keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="177352"><![CDATA[Iran Essa]]></keyword>          <keyword tid="173555"><![CDATA[Center for Machine Learning]]></keyword>      </keywords>  <core_research_areas>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="39481"><![CDATA[National Security]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="602925">  <title><![CDATA[ The Next Frontier in Mechanical Engineering]]></title>  <uid>27560</uid>  <body><![CDATA[<p>Drone technology is quickly evolving &ndash;no longer just for military use, these flying robots now have a place within commercial enterprise. Also known as unmanned aerial vehicles, drones today have practical applications, like delivering packages for Amazon or allowing realtors to take aerial video to show off a sale property.</p><p>To date, there is usually a weight limit on how much a drone can carry, restricting its usefulness. But Jonathan Rogers, assistant professor at the George W. Woodruff School of Mechanical Engineering, is trying to change that. He is designing, building and programming robotic drones that can link up and carry larger, heavier objects as a unit</p><p>&ldquo;In my lab, we are working with multiple drones that lift and fly packages together,&rdquo; said Rogers. &ldquo;This involves distributing heavy lift capabilities into a number of small drone units that can then organize themselves to pick the object up.&rdquo;</p><p>With exceptional portability, unobtrusive size and remote control, drones are ideal for situations that are dangerous for humans. Rogers has designed the world&rsquo;s first heavy lift small drones &ndash; robots that can work together to lift and evacuate wounded soldiers from the battlefield or civilians from a disaster area. Theoretically, three to four man-portable robots fly out together, connect to the person, and lift them 500 yards out of harm&rsquo;s way.</p><p>Each drone has eight large propellers and can fold up into a backpack for portability. The drone can lift a 65 pound object, and with three or four drones working together, a human can be lifted. Rogers explains that it&rsquo;s all about thrust density, a term he invented.</p><p>&ldquo;Determining how much thrust you can pack into a small area is important when you are using multiple vehicles to lift a specific object,&rdquo; said Rogers. &ldquo;When you pack a large amount of thrust into a small object, the laws of physics work against you, so you need more power. That&rsquo;s why we only fly the soldiers about 500 yards away after they are lifted from the battlefield.&rdquo;</p><p>The drones Rogers works on are part of a new field called cooperative flight control, where multiple drones connect to an object that they know very little about and move it in a stable way. Rogers has named these drones &ldquo;modular vertical lift robots,&rdquo; and they also have useful implications for package delivery.</p><p>Currently, Rogers and his team are working on a funded project with the Georgia Tech Research Institute (GTRI) to test multiple vertical lift robots that connect up to deliver supplies. 
<p>The robots are programmed to take into account flexible logistics by connecting to the object (payload) and determining its weight and size, and how to move it in a stable way. The small robots work together as a team, an approach known as multi-agent control.</p><p>&ldquo;Right now we are most concerned with ensuring the robots fly in a stable way once they analyze the payload and mass center,&rdquo; said Rogers. &ldquo;We are calling this autonomous flightworthiness determination (AFWD), and it&rsquo;s a topic in the field that no one else has explored.&rdquo;</p><p>A major challenge for AFWD and cooperative flight control is determining how the drones are going to attach to the payload. Rogers has developed a docking apparatus, so the robot vehicles can attach to the object. When a flexible payload, like a human, doesn&rsquo;t have docks, Rogers is looking into using manipulators with soft gripper technology on the robots. Then the robots will have a flexible way of grasping the human.</p><p>In the next 20 to 30 years, Rogers predicts that mobile robots moving together will be employed in everyday situations. But a key hurdle remains &ndash; normalizing the technology to ensure it is compatible with and trusted by humans.</p><p>&ldquo;I am really invested in creating new mechanisms and autonomy algorithms that allow robots to serve a beneficial purpose in society,&rdquo; said Rogers. &ldquo;The modular vertical lift robot that can operate during disaster situations is a great example of the type of technology that can benefit people. Also, the drone docks we are designing will be a key piece of equipment that hundreds of companies can use to do their jobs better. Making an impact on society is really our goal.&rdquo;</p><p>Rogers hopes to start lifting objects this summer.</p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1519674890</created>  <gmt_created>2018-02-26 19:54:50</gmt_created>  <changed>1519678552</changed>  <gmt_changed>2018-02-26 20:55:52</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Drones are being designed, built and programmed to link up and carry larger, heavier objects as a unit.]]></teaser>  <type>news</type>  <sentence><![CDATA[Drones are being designed, built and programmed to link up and carry larger, heavier objects as a unit.]]></sentence>  <summary><![CDATA[<p>To date, there is usually a weight limit on how much a drone can carry, restricting its usefulness. But Jonathan Rogers, assistant professor at the George W. Woodruff School of Mechanical Engineering, is trying to change that. 
He is designing, building and programming robotic drones that can link up and carry larger, heavier objects as a unit.</p>]]></summary>  <dateline>2018-02-26T00:00:00-05:00</dateline>  <iso_dateline>2018-02-26T00:00:00-05:00</iso_dateline>  <gmt_dateline>2018-02-26 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Drones work together to save wounded soldiers]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[georgia.parmelee@coe.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Georgia Parmelee<br />College of Engineering<br />404-385-0181<br />georgia.parmelee@coe.gatech.edu</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>602922</item>      </media>  <hg_media>          <item>          <nid>602922</nid>          <type>image</type>          <title><![CDATA[Jonathan Rogers]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[rogers.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/rogers.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/rogers.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/rogers.jpg?itok=JMuy-ZqL]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Jonathan Rogers]]></image_alt>                    <created>1519671071</created>          <gmt_created>2018-02-26 18:51:11</gmt_created>          <changed>1519671071</changed>          <gmt_changed>2018-02-26 18:51:11</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.news.gatech.edu/features/creating-next-robotics]]></url>        <title><![CDATA[Tarzan Robot: The Future of Farming]]></title>      </link>          <link>        <url><![CDATA[http://ireal.gatech.edu/]]></url>        <title><![CDATA[Jonathan Rogers' Lab]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>          <group id="108731"><![CDATA[School of Mechanical Engineering]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="34141"><![CDATA[Drones]]></keyword>          <keyword tid="177227"><![CDATA[soldier]]></keyword>          <keyword tid="114051"><![CDATA[Jonathan Rogers]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="601678">  <title><![CDATA[Neurons Get the Beat and Keep It Going in Drumrolls]]></title>  <uid>31759</uid>  <body><![CDATA[<p>A neuron firing deep in the brain might sound a little like: Drumroll&hellip;cymbal crash! Drumroll&hellip;cymbal crash! Repeat. 
With emphasis on &ldquo;repeat,&rdquo; <a href="http://www.jneurosci.org/content/early/2017/12/26/JNEUROSCI.1519-17.2017" target="_blank">according to a new study.</a></p><p>What used to look like fleeting cacophonies of electrical impulses in the brain is looking to neuroscience researchers more and more like a sustained matrix of electronic percussion. For years, they have been analyzing patterns hidden in neurons&rsquo; electrical buzzes, and now they have revealed in neurons sustained stretches of orderly, drumroll-like rumblings speckled with thrashing impulses, or spikes, that stimulate neighboring neurons.</p><p>&ldquo;These signaling patterns last a lot longer than we thought,&rdquo; said <a href="http://singer.gatech.edu/lab/" target="_blank">Annabelle Singer, an assistant professor at the Georgia Institute of Technology</a>. Singer led the <em>in vivo</em> study on mice together with <a href="http://syntheticneurobiology.org/" target="_blank">Ed Boyden, a professor at the Massachusetts Institute of Technology</a>.</p><h4><strong>Persistent neurons</strong></h4><p>&ldquo;We used to think that neurons would fire spikes to neighboring neurons for a few milliseconds, and that was all it would take to make the next neuron spike,&rdquo; Singer said. &ldquo;Now we&rsquo;re seeing that you get these repeating patterns of rumblings and spikes sustained over hundreds of milliseconds, even close to a full second.&rdquo;</p><p>That&rsquo;s about how long it takes a human heart to complete one full beat.</p><p>The rumblings are jumbly fluctuations of electrical potential within a neuron before it fires a spike. The spikes are big electrical signals that communicate with neighboring neurons.</p><p>Taken together, the sum of the spikes in the brain makes its circuitry compute so that we can walk, talk, and live life.</p><p>The researchers <a href="http://www.jneurosci.org/content/early/2017/12/26/JNEUROSCI.1519-17.2017" target="_blank">published their study on the newly discovered patterns in the <em>Journal of Neuroscience</em></a>. Official publication date is February 14, 2018, but the study is already available online without embargo. The research was funded by the National Institutes of Health, the National Science Foundation, the Friends of the McGovern Institute, the New York Stem Cell Foundation, the MIT Intelligence Initiative, and the Lane Family.</p><h4><strong>Questions and answers</strong></h4><p>The combination of observing the patterns&rsquo; percussion-like characteristics as well as their sustained lengths in the brains of awake mice makes this a novel finding, Singer said. Some similar previous studies were performed on mice that were anesthetized, which strongly alters brain activity compared with awake brains.</p><p>Here are some questions and answers about the observed patterns and their significance.</p><h4><strong>What do these sustained patterns look like?</strong></h4><p>The researchers recorded the activities of individual neurons in the hippocampus, which is located in the lower center of the brain, with a robotic device called a <a href="http://www.rh.gatech.edu/news/583105/robotic-cleaning-technique-could-automate-neuroscience-research" target="_blank">patch clamp</a>. It&rsquo;s a hollow glass needle one micron in diameter that latches onto a single neuron via suction and measures its electrical activity.</p><p>The researchers observed electrical rumblings, symbolized here by a drumroll. 
And they observed spikes, symbolized here by a cymbal crash.</p><p>Though the pattern of rumblings wasn&rsquo;t uniform, it rose and fell like a drumroll undulating between softer and louder volumes. Spikes occurred much more rarely than drumbeats, but with notable timing.</p><p>&ldquo;The spikes repeated in the same spots with high precision, so they weren&rsquo;t just random,&rdquo; Singer said. &ldquo;They came around the peaks of rumblings, not always right on top of a peak but within a hair of it.&rdquo;</p><p>It would be like a cymbal crash hitting not every time, but every few times the undulating drumroll topped a volume peak. And the drumroll-cymbal-crash patterns sustained themselves for surprisingly long periods.</p><p>&ldquo;The time periods of activity that were structured like this were much longer than we expected,&rdquo; Singer said. &ldquo;People have shown sustained periods of signaling like this for 100 to 300 milliseconds before, but this appears to be the first time it&rsquo;s been seen for 900 milliseconds (nearly a full second), and it may go on even longer.&rdquo;</p><h4><strong>What are neurons doing with these rumblings and spikes?</strong></h4><p>When one neuron fires a spike, that electronic impulse hits neighboring neurons and influences the receiving neurons&rsquo; rumblings until they fire spikes, too.</p><p>&ldquo;A neuron receives these fast inputs. There are many different drumbeat patterns coming from many different neurons around it,&rdquo; Singer said. &ldquo;The patterns we observed in one neuron were being driven by other neurons firing into it like a whole drum section with short little bursts.&rdquo;</p><p>At first sight, that may appear to be a cacophony, but if the jumbly patterns repeat, a consistent percussion of rumblings in the neuron may result.</p><h4><strong>How may this influence the way we picture neurons at work?</strong></h4><p>&ldquo;I think people have thought about neuron firings as random then suddenly organized in a concerted kind of way,&rdquo; Singer said.</p><p>That could be pictured as many neurons behaving erratically until it was time to get to work, then abruptly firing as a group in near unison. This does appear to happen under the right circumstances, but as a prevailing picture of neuron firing, it may be lacking something.</p><p>&ldquo;We&rsquo;re starting to see more structure, very complex structure in what was thought to be randomness,&rdquo; Singer said. &ldquo;There is a lot of activity that is ongoing that is organized and that we need to understand, as well.&rdquo;</p><p>The researchers examined cells important for memory, but further research will be required to know what role the observed firing patterns may play in memory function. The researchers are also working together with engineers at Georgia Tech to develop new robotic patch clamping devices that listen simultaneously to the firings of neurons connected to one another.</p><p><a href="http://www.rh.gatech.edu/features/cosmos-cranium" target="_blank">Also READ our feature on&nbsp;neurology research: The Brain, Cosmos in the Cranium&nbsp;</a></p><p>Like this article? <a href="http://www.rh.gatech.edu/subscribe" target="_blank">Get our email newsletter here.</a></p><p><em>These researchers also collaborated on the study: Craig Forest, Ilya Kolb, and Michael Wang of Georgia Tech; Giovanni Talei Franzesi and Edward S. Boyden of MIT; and Suhasa Kodandaramaiah, previously at Georgia Tech and MIT and now at the University of Minnesota. 
The research was funded by the following National Institutes of Health sources: Computational Neuroscience Training (grant DA032466-02), a Director&rsquo;s Pioneer Award (1DP1NS087724), a Transformative Award (1R01MH103910), and further NIH grants (1R01EY023173, 1R01NS067199, 1R01DA029639, 1U01MH106027 and 5R44NS08310803). It was also supported by the Cognitive Rhythms Collaborative, funded by the National Science Foundation&rsquo;s Division of Mathematical Science (grant 10421134), with additional funding from the MIT Intelligence Initiative, the Lane Family, and the Friends of the McGovern Institute.</em></p><p><em><strong>DOI:</strong>&nbsp;</em>10.1523/JNEUROSCI.1519-17.2017&nbsp;</p>]]></body>  <author>Ben Brumfield</author>  <status>1</status>  <created>1517423328</created>  <gmt_created>2018-01-31 18:28:48</gmt_created>  <changed>1517940349</changed>  <gmt_changed>2018-02-06 18:05:49</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Some of what researchers believed to be chaotic electric potentials in neurons is turning out to be surprisingly orderly.]]></teaser>  <type>news</type>  <sentence><![CDATA[Some of what researchers believed to be chaotic electric potentials in neurons is turning out to be surprisingly orderly.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2018-02-01T00:00:00-05:00</dateline>  <iso_dateline>2018-02-01T00:00:00-05:00</iso_dateline>  <gmt_dateline>2018-02-01 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[ben.brumfield@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia 30332-0181&nbsp; USA</strong></p><p><strong>Writer:&nbsp;</strong>Ben Brumfield</p><p>@benbgatech&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>601671</item>          <item>601674</item>          <item>601670</item>          <item>601669</item>          <item>601675</item>          <item>583097</item>      </media>  <hg_media>          <item>          <nid>601671</nid>          <type>image</type>          <title><![CDATA[Healthy neuron illustration NIA/NIH]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[healthy neuron NIH.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/healthy%20neuron%20NIH.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/healthy%20neuron%20NIH.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/healthy%2520neuron%2520NIH.jpg?itok=MdntibHg]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1517421149</created>          <gmt_created>2018-01-31 17:52:29</gmt_created>          <changed>1517421149</changed>          <gmt_changed>2018-01-31 17:52:29</gmt_changed>      </item>          <item>          <nid>601674</nid>          <type>image</type>          <title><![CDATA[Annabelle Singer in her BME lab]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Annabelle.sm_.file_.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Annabelle.sm_.file_.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Annabelle.sm_.file_.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Annabelle.sm_.file_.jpg?itok=sVUMoj5n]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1517421911</created>          <gmt_created>2018-01-31 18:05:11</gmt_created>          <changed>1517421911</changed>          <gmt_changed>2018-01-31 18:05:11</gmt_changed>      </item>          <item>          <nid>601670</nid>          <type>image</type>          <title><![CDATA[Synapse illustration with messenger molecules and neurons]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Synapse.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Synapse.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Synapse.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Synapse.jpg?itok=rfPQ63K-]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1517420529</created>          <gmt_created>2018-01-31 17:42:09</gmt_created>          <changed>1517420529</changed>          <gmt_changed>2018-01-31 17:42:09</gmt_changed>      </item>          <item>          <nid>601669</nid>          <type>image</type>          <title><![CDATA[Patch clamp diagram]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[1-17-cosmos-patch-clamp.gif]]></image_name>            <image_path><![CDATA[/sites/default/files/images/1-17-cosmos-patch-clamp.gif]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/1-17-cosmos-patch-clamp.gif]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/1-17-cosmos-patch-clamp.gif?itok=rEd6a48r]]></image_740>            <image_mime>image/gif</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1517420267</created>          <gmt_created>2018-01-31 17:37:47</gmt_created>          <changed>1517420267</changed>          <gmt_changed>2018-01-31 17:37:47</gmt_changed>      </item>          <item>          <nid>601675</nid>          <type>image</type>          <title><![CDATA[Craig Forest in his IBB lab]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[1-17-cosmos-forest.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/1-17-cosmos-forest.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/1-17-cosmos-forest.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/1-17-cosmos-forest.jpg?itok=nq5Ptj7v]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1517422081</created>          <gmt_created>2018-01-31 18:08:01</gmt_created>          <changed>1517422081</changed>          <gmt_changed>2018-01-31 18:08:01</gmt_changed>      </item>          <item>          <nid>583097</nid>          
<type>image</type>          <title><![CDATA[Patch-clamping equipment3]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[patch-clamp4251.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/patch-clamp4251.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/patch-clamp4251.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/patch-clamp4251.jpg?itok=lFey35MI]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Patch-clamping setup]]></image_alt>                    <created>1477419228</created>          <gmt_created>2016-10-25 18:13:48</gmt_created>          <changed>1477419228</changed>          <gmt_changed>2016-10-25 18:13:48</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="32691"><![CDATA[patch clamp]]></keyword>          <keyword tid="12333"><![CDATA[Craig Forest]]></keyword>          <keyword tid="176963"><![CDATA[self-cleaning patch clamp]]></keyword>          <keyword tid="176966"><![CDATA[multiclamper]]></keyword>          <keyword tid="7276"><![CDATA[neuron]]></keyword>          <keyword tid="176956"><![CDATA[action potential]]></keyword>          <keyword tid="176964"><![CDATA[neuron rumbling]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="599489">  <title><![CDATA[College of Computing Selects Ayanna Howard to Lead School of Interactive Computing]]></title>  <uid>33939</uid>  <body><![CDATA[<p>Following a national search, the Georgia Tech College of Computing has selected <a href="https://en.wikipedia.org/wiki/Ayanna_Howard"><strong>Ayanna Howard</strong></a>, professor and Linda J. and Mark C. Smith Chair in the School of Electrical and Computer Engineering (ECE) to chair its <a href="http://ic.gatech.edu">School of Interactive Computing</a>.</p><p>Howard, who is also associate chair for faculty development in ECE, will succeed Professor <a href="https://www.cc.gatech.edu/people/annie-anton"><strong>Annie Ant&oacute;n</strong></a>, who served in the role from 2012-17. 
Ant&oacute;n finished her five-year term in June 2017 and remains a professor within the school. Professor <a href="https://www.cc.gatech.edu/people/amy-bruckman"><strong>Amy Bruckman</strong></a> has served as the interim chair since July.</p><p>&ldquo;Ayanna Howard is the perfect individual to lead our School of Interactive Computing, and we are excited to welcome her to the College,&rdquo; said <a href="https://www.cc.gatech.edu/people/zvi-galil"><strong>Zvi Galil</strong></a>, John P. Imlay Jr. Dean of Computing. &ldquo;She brings a wealth of experience in research and administration, and she has consistently succeeded in leadership opportunities both inside and outside Georgia Tech. Her vision and energy will help ensure that IC will continue to be a national leader in computing research and education.&rdquo;</p><p>As a testament to her interdisciplinary focus, Howard has collaborated with a number of IC researchers in the past and said she is looking forward to fostering new &ndash; and fruitful &ndash; relationships with the school&rsquo;s faculty and staff.</p><p>&ldquo;I am thrilled for the opportunity to work with the amazing faculty, staff, and students within the School of Interactive Computing,&rdquo; Howard said. &ldquo;They are already national leaders in some of the most important fields of modern computing, and I look forward to building on that foundation and continuing to pursue research and innovation that addresses real challenges facing our world today.&rdquo;</p><p>Howard received her bachelor&rsquo;s degree in engineering from Brown University and her master&rsquo;s and Ph.D. degrees in electrical engineering from the University of Southern California, completing the doctorate in 1999.</p><p>Her research focuses on technology development for intelligent agents that must interact with and in a human-centered world. This work, which addresses issues of human-robot interaction, learning, and autonomous control, has resulted in more than 200 peer-reviewed publications. To date, her accomplishments have been recognized with a number of awards and articles, including features in <em>Time</em>, <em>Black Enterprise</em>, and <em>USA Today</em>. She was named an <em>MIT Technology Review</em> top young innovator and recognized as one of the 23 most powerful women engineers in the world by <em>Business Insider</em>.</p><p>She has more than 20 years of research and development experience covering a number of projects that have been supported by organizations like the National Science Foundation, Procter and Gamble, NASA, ExxonMobil, Intel, and the Grammy Foundation.</p><p>Howard is the director of the Human-Automation Systems Lab (HumAnS), and in 2015 founded a $3 million traineeship initiative in health care robotics. In 2013, she founded <a href="https://zyrobotics.com/">Zyrobotics</a> as a university spin-off and holds a position in the company as chief technology officer. Zyrobotics is currently licensing technology derived from her research and has released its first suite of mobile therapy and educational products for children with differing needs.</p><p>From 1993-2005, Howard worked at NASA&rsquo;s Jet Propulsion Laboratory, where she was a senior robotics researcher and deputy manager in the Office of the Chief Scientist. 
She has also served as the associate director of research for Georgia Tech&rsquo;s Institute for Robotics and Intelligent Machines and as chair of the multidisciplinary robotics Ph.D. program at Georgia Tech.</p><p>Howard will assume her new role in January 2018. Her appointment is contingent upon approval by Georgia Tech President <strong>G.P. &ldquo;Bud&rdquo; Peterson</strong> and the Board of Regents of the University System of Georgia. She will retain her current Linda J. and Mark C. Smith endowment after transitioning to the School of Interactive Computing.</p>]]></body>  <author>David Mitchell</author>  <status>1</status>  <created>1512406243</created>  <gmt_created>2017-12-04 16:50:43</gmt_created>  <changed>1512413747</changed>  <gmt_changed>2017-12-04 18:55:47</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Following a national search, the Georgia Tech College of Computing has selected Ayanna Howard, professor and Linda J. and Mark C. Smith Chair in the School of Electrical and Computer Engineering (ECE) to chair its School of Interactive Computing.]]></teaser>  <type>news</type>  <sentence><![CDATA[Following a national search, the Georgia Tech College of Computing has selected Ayanna Howard, professor and Linda J. and Mark C. Smith Chair in the School of Electrical and Computer Engineering (ECE) to chair its School of Interactive Computing.]]></sentence>  <summary><![CDATA[<p>Ayanna Howard named chair of School of Interactive Computing.</p>]]></summary>  <dateline>2017-12-04T00:00:00-05:00</dateline>  <iso_dateline>2017-12-04T00:00:00-05:00</iso_dateline>  <gmt_dateline>2017-12-04 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[david.mitchell@cc.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>David Mitchell</p><p><a href="mailto:david.mitchell@cc.gatech.edu">david.mitchell@cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>599491</item>          <item>599486</item>      </media>  <hg_media>          <item>          <nid>599491</nid>          <type>image</type>          <title><![CDATA[Ayanna Howard]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[howard_3.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/howard_3.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/howard_3.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/howard_3.jpg?itok=1U8x2GTJ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1512406963</created>          <gmt_created>2017-12-04 17:02:43</gmt_created>          <changed>1512406963</changed>          <gmt_changed>2017-12-04 17:02:43</gmt_changed>      </item>          <item>          <nid>599486</nid>          <type>image</type>          <title><![CDATA[Ayanna Howard headshot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Howard 2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Howard%202.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Howard%202.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Howard%25202.jpg?itok=FKpjBeR-]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ayanna Howard]]></image_alt>                    <created>1512405411</created>          <gmt_created>2017-12-04 16:36:51</gmt_created>          <changed>1512405411</changed>          <gmt_changed>2017-12-04 16:36:51</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://robotics.gatech.edu]]></url>        <title><![CDATA[Institute for Robotics and Intelligent Machines]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="134"><![CDATA[Student and Faculty]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="825"><![CDATA[Ayanna Howard]]></keyword>          <keyword tid="81491"><![CDATA[Institute for Robotics and Intelligent Machines (IRIM)]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="166848"><![CDATA[School of Interactive Computing]]></keyword>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="27641"><![CDATA[annie anton]]></keyword>          <keyword tid="8472"><![CDATA[amy bruckman]]></keyword>          <keyword tid="22401"><![CDATA[G. P. Bud Peterson]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="597463">  <title><![CDATA[Army Grant Supports Development of Intelligent, Adaptive and Resilient Robot Teams]]></title>  <uid>27303</uid>  <body><![CDATA[<p>The <a href="https://www.arl.army.mil/www/default.cfm">U.S. Army Research Laboratory</a> has awarded an alliance headed by the University of Pennsylvania a five-year, $27 million grant to develop new methods of creating autonomous, intelligent and resilient teams of robots.&nbsp;</p><p>These teams, consisting of multiple types of robots and sensors with varying abilities, are designed to assist humans in a wide range of missions in dynamically changing, harsh and contested environments. These include search and rescue of hostages, information gathering after terrorist attacks or natural disasters, and humanitarian missions.&nbsp;</p><p>The award is part of ARL&rsquo;s Distributed and Collaborative Intelligent Systems and Technology (DCIST) Collaborative Research Alliance. Penn Engineering will lead this alliance in collaboration with the Army Research Laboratory, Massachusetts Institute of Technology&rsquo;s Aeronautics and Astronautics Department, and the Georgia Institute of Technology. 
The consortium also includes faculty from University of California San Diego, University of California Berkeley and University of Southern California.</p><p>DCIST involves imbuing teams of heterogeneous robots and sensors with the intelligence to learn and adapt to different settings and perform new tasks along with humans. Key to this vision is building resilience to disruption.&nbsp;</p><p>Teams of robots and human first responders might eventually be used to survey a disaster site for victims, but unpredictable environments and ongoing hazards could damage or destroy some of the robots, or disrupt communications between them. If each robot were just preprogrammed and given specific instructions, that could lead to gaps in their search. But if the team were able to reconfigure itself in response to damage, the remaining robots could collaboratively decide how to reorganize and work with human partners to complete the mission.&nbsp;</p><p>&ldquo;We want to have teams of robots that know how to work together, but can figure out how to keep working even if some of their teammates crash or fail, if GPS signal is unavailable, or if cloud services are disrupted,&rdquo; said Vijay Kumar, Penn Engineering&rsquo;s Nemirovsky Family Dean and director for the DCIST program. &ldquo;This means designing networks with loose, flexible connections that can change on the fly. That way, a single event can&rsquo;t bring down the entire network. More importantly, we want them to learn to perform tasks they may have never performed and work alongside humans that they may never have worked with.&rdquo;&nbsp;&nbsp;</p><p>The three important research focus areas are distributed intelligence and learning; creating a cohesive team of autonomous robots, sensors, computational resources and human experts; and building resiliency in group behaviors.&nbsp;</p><p>&ldquo;Through this exciting project, Georgia Tech will help develop novel tools and techniques that enable human operators to work effectively and safely in teams together with autonomous robots,&rdquo; said <a href="https://www.ece.gatech.edu/faculty-staff-directory/magnus-egerstedt-0">Magnus Egerstedt</a>, executive director of Georgia Tech&rsquo;s <a href="http://www.robotics.gatech.edu/">Institute for Robotics and Intelligent Machines</a> and Julian T. Hightower Chair in Systems and Controls. 
&ldquo;These types of questions connect well with our&nbsp;expertise in the areas of human-robot interactions, distributed decision making and learning, and swarm robotics.&rdquo;</p><p>Beyond Egerstedt, the Georgia Tech researchers affiliated with this multidisciplinary project are <a href="https://www.cc.gatech.edu/people/sonia-chernova">Sonia Chernova</a>, assistant professor in the School of Interactive Computing; <a href="https://www.aerospace.gatech.edu/people/panagiotis-tsiotras">Panagiotis Tsiotras</a>, Dean&rsquo;s Professor in the School of Aerospace Engineering; and <a href="https://www.ece.gatech.edu/faculty-staff-directory/justin-romberg">Justin Romberg</a>, Associate Chair for Research and Schlumberger Professor in the School of Electrical and Computer Engineering.</p><p>With multiple types of assets collectively assessing a complex, continuously changing scenario and determining how best to assign their individual skills to a broadly defined problem, such human-robot teams of the future would be ideal first responders to dangerous situations.</p><p>&ldquo;The technology we&rsquo;re working on will better allow humans to respond by projecting their intelligence without directly coming into harm&rsquo;s way,&rdquo; Kumar said.&nbsp;&nbsp;</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181 USA</strong></p><p><strong>Media Relations Contacts</strong>: Georgia Tech &ndash; John Toon (404-894-6986) (jtoon@gatech.edu); UPenn &ndash; Evan Lerner (215-573-6604) (elerner@upenn.edu).</p><p><em><strong>Provided by Army Research Laboratory</strong></em></p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1508179764</created>  <gmt_created>2017-10-16 18:49:24</gmt_created>  <changed>1508182652</changed>  <gmt_changed>2017-10-16 19:37:32</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The U.S. Army Research Laboratory has awarded a $27 million grant to develop new methods of creating robot teams.]]></teaser>  <type>news</type>  <sentence><![CDATA[The U.S. Army Research Laboratory has awarded a $27 million grant to develop new methods of creating robot teams.]]></sentence>  <summary><![CDATA[<p>The U.S. 
Army Research Laboratory has awarded an alliance headed by the University of Pennsylvania a five-year, $27 million grant to develop new methods of creating autonomous, intelligent and resilient teams of robots.&nbsp;</p>]]></summary>  <dateline>2017-10-16T00:00:00-04:00</dateline>  <iso_dateline>2017-10-16T00:00:00-04:00</iso_dateline>  <gmt_dateline>2017-10-16 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>597469</item>          <item>597470</item>      </media>  <hg_media>          <item>          <nid>597469</nid>          <type>image</type>          <title><![CDATA[Sonia Chernova & Army research grant]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[sonia-chernova.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/sonia-chernova.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/sonia-chernova.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/sonia-chernova.jpg?itok=KGaCCiNG]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sonia Chernova, Georgia Tech]]></image_alt>                    <created>1508182097</created>          <gmt_created>2017-10-16 19:28:17</gmt_created>          <changed>1508182097</changed>          <gmt_changed>2017-10-16 19:28:17</gmt_changed>      </item>          <item>          <nid>597470</nid>          <type>image</type>          <title><![CDATA[Magnus Egerstedt & Army research grant]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[robotarium-magnus-georgia-tech.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/robotarium-magnus-georgia-tech.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/robotarium-magnus-georgia-tech.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/robotarium-magnus-georgia-tech.jpg?itok=h3-MB2Od]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Magnus Egerstedt in Robotarium]]></image_alt>                    <created>1508182168</created>          <gmt_created>2017-10-16 19:29:28</gmt_created>          <changed>1508182168</changed>          <gmt_changed>2017-10-16 19:29:28</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="2352"><![CDATA[robots]]></keyword>          <keyword tid="169029"><![CDATA[swarm 
robots]]></keyword>          <keyword tid="175928"><![CDATA[robot teams]]></keyword>          <keyword tid="11528"><![CDATA[Magnus Egerstedt]]></keyword>          <keyword tid="169047"><![CDATA[Sonia Chernova]]></keyword>      </keywords>  <core_research_areas>          <term tid="39481"><![CDATA[National Security]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="596207">  <title><![CDATA[Running Roaches, Flapping Moths Create a New Physics of Organisms]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Sand-swimming lizards, slithering robotic snakes, dusk-flying moths and running roaches all have one thing in common: They&#39;re increasingly being studied by physicists interested in understanding the shared strategies these creatures have developed to overcome the challenges of moving through their environments.</p><p>By analyzing the rules governing the locomotion of these creatures, &quot;physics of living systems&quot; researchers are learning how animals successfully negotiate unstable surfaces like wet sand, maintain rapid motion on flat surfaces using the advantageous mechanics of their bodies, and fly in ways that would never work for modern aircraft. The knowledge these researchers develop could be useful to the designers of robots and flying vehicles of all kinds.</p><p>&ldquo;Locomotion is a very natural access point for understanding how biological systems interact with the world,&rdquo; said <a href="http://www.physics.gatech.edu/user/simon-sponberg">Simon Sponberg</a>, an assistant professor in the <a href="http://www.physics.gatech.edu">School of Physics</a> and <a href="http://www.biosciences.gatech.edu/">School of Biological Sciences</a> at the Georgia Institute of Technology.&nbsp;&ldquo;When they move, animals change the environment around them so they can push off from it and move through it in different ways. This capability is a defining feature of animals.&rdquo;</p><p>Sponberg has spent his career bridging the gap between physics and organismal biology &ndash; the study of complex creatures. His work includes studying how hawk moths slow their nervous systems to maintain vision during low-light conditions, and how muscle is a versatile material able to change function from a brake to a motor or spring.</p><p>He recently published a feature article, the cover story for the September issue of the American Institute of Physics magazine <a href="http://physicstoday.scitation.org/doi/10.1063/PT.3.3691"><em>Physics Today</em></a>, on the role of physics in animal locomotion. The article was not intended as a review of the entire field, but rather to show how organismal physics &ndash; integrating complex physiological systems, the mechanics and the surrounding environment into a whole animal &ndash; has inspired his career.</p><p>&ldquo;The intersection of physics and organismal biology is a very exciting one right now,&rdquo; said Sponberg, who is also a researcher with the <a href="http://petitinstitute.gatech.edu/">Petit Institute for Bioengineering and Bioscience</a> at Georgia Tech. &ldquo;The assembly and interaction of multiple natural components manifests new behaviors and dynamics. 
The collection of these natural components manifests different patterns than the individual parts, and that&rsquo;s fascinating.&rdquo;</p><p>Supported by new initiatives at such organizations as the <a href="http://www.arl.army.mil">Army Research Office/Army Research Laboratory</a> and the <a href="http://www.nsf.gov">National Science Foundation</a> &ndash; which are embracing these frontiers &ndash; Georgia Tech scientists are learning the equations that dictate how snakes move, understanding how the hair spacing on the bodies of bees helps them stay clean, and using X-ray equipment to see how an unusual African lizard &ldquo;swims&rdquo; through dry sand.</p><p>&ldquo;It&rsquo;s a really exciting time to be working at the intersection of physics and evolutionary organismal biology &ndash; studying living systems that have come about through the process of evolution and are composed of seemingly very complex systems,&rdquo; he said. &ldquo;Biological systems are inescapably complex, but that doesn&rsquo;t mean there aren&rsquo;t simple patterns of behavior that we can understand. We now have the modern tools, approaches and theory that we need to be able to extract physical patterns from biological systems.&rdquo;</p><p>In his article, Sponberg makes predictions about the research that will be needed for the physics of living systems to advance as a field:</p><ul><li>How feedback transforms physiological dynamics,</li><li>How aggregations of living components, from humans to ants to molecular motors, arise at multiple scales, and</li><li>How robo-physical models of these complex systems can lead to new discoveries and advance engineering.</li></ul><p>Engineered systems use feedback about the effects of their actions to adjust their future activities, and animals do the same to control their movement. Scientists can manipulate this feedback to understand how complex systems are put together and use the feedback to design experiments rather than just analyzing what is there.&nbsp;</p><p>&ldquo;We use feedback all the time to move through our environment, and feedback is a really special thing that fundamentally affects how dynamics occur,&rdquo; said Sponberg. &ldquo;But using feedback to design experiments is really sort of new.&rdquo;</p><p>For example, in the study of how hawk moths track flowers during low-light conditions, he and his colleagues used feedback dynamics to isolate how the moth&rsquo;s brain adjusts its processing in dim light. The moths can still accurately track flower movements that occur less than two times per second &ndash; which matches the frequency at which the flowers sway in the wind (a toy illustration of this band-limited tracking appears in the sketch below).</p><p>Animals are composed of many systems operating at multiple time scales simultaneously &ndash; brain neurons, nerves and the individual fibers of muscles with molecular motors. These muscle fibers are arranged in an active crystalline lattice such that X-rays fired through them create a regular diffraction pattern. Understanding these multiscale living assemblages provides new insights into how animals manage complex actions.</p><p>Finally, Sponberg notes in his article that robots are playing a larger and larger role in the physics laboratory as functional models that can examine principles of movement by interacting with the real world.</p>
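<p>To make the hawk moth result above concrete: the band-limited tracking Sponberg describes can be sketched as a first-order tracker whose bandwidth sits near the roughly 2 Hz flower-sway frequency. The model and every number below are our own illustrative assumptions, not the study&rsquo;s model:</p><pre><code># Toy sketch (not the study's model): a first-order tracker with
# ~2 Hz bandwidth following a sinusoidally swaying flower.
import numpy as np

def track(freq_hz, bandwidth_hz=2.0, duration_s=10.0, dt=1e-3):
    """Integrate y' = 2*pi*bw*(flower - y); return relative RMS error."""
    t = np.arange(0.0, duration_s, dt)
    flower = np.sin(2 * np.pi * freq_hz * t)  # swaying flower position
    y = np.zeros_like(t)                      # tracker (moth) position
    k = 2 * np.pi * bandwidth_hz
    for i in range(1, len(t)):
        y[i] = y[i-1] + dt * k * (flower[i-1] - y[i-1])
    half = len(t) // 2  # score only the steady-state second half
    err = np.sqrt(np.mean((y[half:] - flower[half:])**2))
    return err / np.sqrt(np.mean(flower[half:]**2))

for f in (0.5, 2.0, 8.0):
    print(f"{f} Hz -> relative error {track(f):.2f}")
# Slow sway (0.5 Hz) is tracked closely; error grows steeply past ~2 Hz.
</code></pre><p>As the sway frequency rises past the tracker&rsquo;s bandwidth, the relative error climbs toward one, mirroring the observation that moths follow flower motion only below about two cycles per second.</p>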
<p>In the laboratory of Georgia Tech Associate Professor Dan Goldman &ndash; one of Sponberg&rsquo;s colleagues &ndash; robotic snakes, turtles, crabs and other creatures help scientists understand what they&rsquo;re observing in the natural world.</p><p>&ldquo;Moving physical models &ndash; robots &ndash; can be very powerful tools for understanding these complex systems,&rdquo; Sponberg said. &ldquo;They can allow us to do experiments on robots that we couldn&rsquo;t do on animals to see how they interact with complex environments. We can see what physics in these systems is essential to their behaviors.&rdquo;</p><p>Sponberg was inspired to study the interaction of organismal biology and physics by the remarkable diversity of animal movement and by nonlinear dynamics, a field made popular, when he was a young student, by the 1987 best-selling book <em>Chaos: Making a New Science</em>, authored by former New York Times reporter James Gleick. Sponberg hopes today&rsquo;s students &ndash; readers of <em>Physics Today</em> &ndash; will also be inspired.</p><p>&ldquo;I voted on this with my career choice, so I think this is a very exciting area of science,&rdquo; he added.</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong></p><p><strong>Media Relations Contacts</strong>: John Toon (404-894-6986) (jtoon@gatech.edu) or Ben Brumfield (404-660-1408) (ben.brumfield@comm.gatech.edu).</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1505854063</created>  <gmt_created>2017-09-19 20:47:43</gmt_created>  <changed>1506545525</changed>  <gmt_changed>2017-09-27 20:52:05</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers are interested in the strategies creatures have developed to overcome the challenges of moving through their environments.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers are interested in the strategies creatures have developed to overcome the challenges of moving through their environments.]]></sentence>  <summary><![CDATA[<p>Sand-swimming lizards, slithering robotic snakes, dusk-flying moths and running roaches all have one thing in common: They&#39;re increasingly being studied by physicists interested in understanding the shared strategies these creatures have developed to overcome the challenges of moving through their environments.</p>]]></summary>  <dateline>2017-09-19T00:00:00-04:00</dateline>  <iso_dateline>2017-09-19T00:00:00-04:00</iso_dateline>  <gmt_dateline>2017-09-19 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>596193</item>          <item>596194</item>          <item>596196</item>      </media>  <hg_media>          <item>          <nid>596193</nid>          <type>image</type>          <title><![CDATA[Hawk moth on robotic flower2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[hawkmoth6.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/hawkmoth6_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/hawkmoth6_0.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/hawkmoth6_0.jpg?itok=LDCFxM_U]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Hawk moth landing on robotic flower]]></image_alt>                    <created>1505852306</created>          <gmt_created>2017-09-19 20:18:26</gmt_created>          <changed>1505852306</changed>          <gmt_changed>2017-09-19 20:18:26</gmt_changed>      </item>          <item>          <nid>596194</nid>          <type>image</type>          <title><![CDATA[Hawk moth on natural flower]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Manduca and flower.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Manduca%20and%20flower.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Manduca%20and%20flower.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Manduca%2520and%2520flower.jpg?itok=sb8qXjDh]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Hawk moth and natural flower]]></image_alt>                    <created>1505853283</created>          <gmt_created>2017-09-19 20:34:43</gmt_created>          <changed>1505853283</changed>          <gmt_changed>2017-09-19 20:34:43</gmt_changed>      </item>          <item>          <nid>596196</nid>          <type>image</type>          <title><![CDATA[Simon Sponberg and hawk moth]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[hawkmoth12.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/hawkmoth12.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/hawkmoth12.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/hawkmoth12.jpg?itok=kNP5FJez]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Simon Sponberg holds hawk moth]]></image_alt>                    <created>1505853417</created>          <gmt_created>2017-09-19 20:36:57</gmt_created>          <changed>1505853417</changed>          <gmt_changed>2017-09-19 20:36:57</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1278"><![CDATA[College of Sciences]]></group>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term 
tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="175601"><![CDATA[haw moth]]></keyword>          <keyword tid="129701"><![CDATA[physics of living systems]]></keyword>          <keyword tid="175602"><![CDATA[living systems]]></keyword>          <keyword tid="960"><![CDATA[physics]]></keyword>          <keyword tid="377"><![CDATA[locomotion]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="594831">  <title><![CDATA[Georgia Tech Opens Robotics Lab to the World]]></title>  <uid>27560</uid>  <body><![CDATA[<p>The nation&rsquo;s first remote robotics lab, and the nearly 100 machines that call it home, is now open thanks to a little help from its friends.</p><p>The Robotarium held its grand opening on Tuesday in the Van Leer Building. Appropriately, a scissor-wielding robot (named Snips) cut the ribbon. Later, a researcher from the University of Illinois at Urbana-Champaign skyped into the room to run a live remote experiment.</p><p>&ldquo;It&rsquo;s time to begin a new era in robotics,&rdquo; said Magnus Egerstedt, the Julian T. Hightower Chair in Systems and Controls and a professor in the School of Electrical and Computer Engineering.</p><p>Egerstedt, who oversees the lab, was joined by President G.P. &ldquo;Bud&rdquo; Peterson, other cabinet members and more than a dozen congressional staffers who were on campus to learn more about Georgia Tech research and initiatives.&nbsp;</p><p>The Robotarium is a $2.5 million facility funded by the National Science Foundation and Office of Naval Research. It allows researchers around the world to upload their own code, then have Georgia Tech&rsquo;s rolling and flying swarm robots perform the experiment. Afterwards the researcher is sent data and video.&nbsp;</p><p>Egerstedt, executive director of Georgia Tech&#39;s Institute for Robotics and Intelligent Machines, dreamed up the lab about two years ago. It&rsquo;s expensive to build and maintain robots, let alone an entire robotics facility. He wanted more people to have access.<br /><br />&ldquo;It irritated me, and it still does, that robotics research is largely a resource competition and not a &lsquo;who has the best ideas&rsquo; competition,&rdquo; he said. &ldquo;The Robotarium is solving that. If you have a good idea, you should have a platform to try it.&rdquo;</p><p>Hundreds of students visited the lab after the ribbon-cutting ceremony for an open house. The Robotarium team conducted several experiments, including flying quadcopters able to change formation without crashing into each other.</p><p>Researchers can upload their programs and run experiments for free by visiting <a href="http://www.robotarium.org">www.robotarium.org</a>.<br /><br /><a href="https://www.dropbox.com/sh/8zggs41l9vip95f/AACVlYDyGJJBdwmcIEYxUkxqa?dl=0">See photos from the ribbon cutting. 
</a></p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1503494465</created>  <gmt_created>2017-08-23 13:21:05</gmt_created>  <changed>1503496051</changed>  <gmt_changed>2017-08-23 13:47:31</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The nation’s first remote robotics lab, the Robotarium, opens. ]]></teaser>  <type>news</type>  <sentence><![CDATA[The nation’s first remote robotics lab, the Robotarium, opens. ]]></sentence>  <summary><![CDATA[<p>The Robotarium held its grand opening on Tuesday in the Van Leer Building. Appropriately, a scissor-wielding robot (named Snips) cut the ribbon. Later, a researcher from the University of Illinois at Urbana-Champaign skyped into the room to run a live remote experiment.</p>]]></summary>  <dateline>2017-08-23T00:00:00-04:00</dateline>  <iso_dateline>2017-08-23T00:00:00-04:00</iso_dateline>  <gmt_dateline>2017-08-23 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Robot cuts a ribbon to unveil the Robotarium]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />National Media Relations<br />maderer@gatech.edu<br />404-660-2926</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>594829</item>          <item>594835</item>          <item>594834</item>      </media>  <hg_media>          <item>          <nid>594829</nid>          <type>image</type>          <title><![CDATA[Robotarium Ribbon Cutting VIPs]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[RIBBON CUTTING ROBOTARIUM DSC_6687.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/RIBBON%20CUTTING%20ROBOTARIUM%20DSC_6687.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/RIBBON%20CUTTING%20ROBOTARIUM%20DSC_6687.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/RIBBON%2520CUTTING%2520ROBOTARIUM%2520DSC_6687.jpg?itok=v_Cy7gfx]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ribbon Cutting Ceremony]]></image_alt>                    <created>1503493994</created>          <gmt_created>2017-08-23 13:13:14</gmt_created>          <changed>1503493994</changed>          <gmt_changed>2017-08-23 13:13:14</gmt_changed>      </item>          <item>          <nid>594835</nid>          <type>image</type>          <title><![CDATA[Robotarium in Action]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[watching robots.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/watching%20robots.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/watching%20robots.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/watching%2520robots.jpg?itok=V7kWIaHu]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Robotarium demo]]></image_alt>                    <created>1503495418</created>          <gmt_created>2017-08-23 13:36:58</gmt_created>          <changed>1503495439</changed>          <gmt_changed>2017-08-23 13:37:19</gmt_changed>      </item>          <item>          <nid>594834</nid>      
    <type>image</type>          <title><![CDATA[Robotarium Ribbon Cutting]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Snips cuts ribbon.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Snips%20cuts%20ribbon.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Snips%20cuts%20ribbon.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Snips%2520cuts%2520ribbon.jpg?itok=cY4D9UTB]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ribbon Cutting Ceremony]]></image_alt>                    <created>1503494953</created>          <gmt_created>2017-08-23 13:29:13</gmt_created>          <changed>1503494953</changed>          <gmt_changed>2017-08-23 13:29:13</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.news.gatech.edu/features/robotarium-robotics-lab-accessible-all]]></url>        <title><![CDATA[How the Robotarium was Created]]></title>      </link>          <link>        <url><![CDATA[https://www.youtube.com/watch?time_continue=1&amp;v=W68BmRtUNlw]]></url>        <title><![CDATA[Watch the Robots (Video)]]></title>      </link>          <link>        <url><![CDATA[http://www.robotics.gatech.edu/]]></url>        <title><![CDATA[Institute for Robotics and Intelligent Machines]]></title>      </link>          <link>        <url><![CDATA[https://t.co/TQfGTD8B8f]]></url>        <title><![CDATA[The Wall Street Journal's Front Page Story on the Lab]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>          <group id="142761"><![CDATA[IRIM]]></group>          <group id="1255"><![CDATA[School of Electrical and Computer Engineering]]></group>          <group id="1237"><![CDATA[College of Engineering]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="11528"><![CDATA[Magnus Egerstedt]]></keyword>          <keyword tid="78271"><![CDATA[IRIM]]></keyword>          <keyword tid="169814"><![CDATA[Robotarium]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="592731">  <title><![CDATA[Georgia Tech Researcher Provides Expert Testimony to Congress]]></title>  <uid>27918</uid>  <body><![CDATA[<p>Gary McMurray testified before a Congressional committee Thursday, offering expert testimony on the importance of agricultural funding, research and innovation.</p><p>McMurray leads the Food Processing Technology Division at the Georgia Tech Research Institute. 
He also develops advanced robotic systems for the food, transportation and biomedical industries.</p><p>He spoke before the Senate Committee on Agriculture, Nutrition, and Forestry during a hearing titled &ldquo;Agricultural Research: Perspectives on Past and Future Successes for the 2018 Farm Bill.&rdquo;</p><p>McMurray stressed the critical role agricultural research plays in meeting future food production demands. While great strides have been made, he said more work must be done.</p><p>&ldquo;Transformative innovation is needed,&rdquo; said McMurray, who is also associate director of Georgia Tech&rsquo;s Institute for Robotics and Intelligent Machines. &ldquo;Transformative innovation moves beyond just improving existing methods and processes to totally re-thinking systems development by creating entirely new systems.&rdquo;</p><p>He highlighted some of the work Georgia Tech is doing in conjunction with the University of Georgia to monitor crop health using autonomous systems.</p><p>For example, the institutions are developing ways for unmanned ground vehicles to work in conjunction with unmanned aerial vehicles to enable earlier detection of infected trees and plants and to identify the source of the problem so there can be more targeted intervention to prevent crop losses.</p><p>Read McMurray&rsquo;s complete testimony <a href="https://www.agriculture.senate.gov/hearings/agricultural-research-perspectives-on-past-and-future-successes-for-the-2018-farm-bill">here</a>.&nbsp;</p>]]></body>  <author>Laura Diamond</author>  <status>1</status>  <created>1497540766</created>  <gmt_created>2017-06-15 15:32:46</gmt_created>  <changed>1497558898</changed>  <gmt_changed>2017-06-15 20:34:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[GTRI’s Gary McMurray spoke about the 2018 Farm Bill and agricultural research before a Senate committee. ]]></teaser>  <type>news</type>  <sentence><![CDATA[GTRI’s Gary McMurray spoke about the 2018 Farm Bill and agricultural research before a Senate committee. 
]]></sentence>  <summary><![CDATA[<p>Gary McMurray testified before&nbsp;the Senate Committee on Agriculture, Nutrition, and Forestry Thursday.&nbsp; McMurray leads the Food Processing Technology Division at the Georgia Tech Research Institute.&nbsp;</p>]]></summary>  <dateline>2017-06-15T00:00:00-04:00</dateline>  <iso_dateline>2017-06-15T00:00:00-04:00</iso_dateline>  <gmt_dateline>2017-06-15 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[laura.diamond@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />National Media Relations<br />maderer@gatech.edu<br />404-660-2926&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>592730</item>          <item>592732</item>      </media>  <hg_media>          <item>          <nid>592730</nid>          <type>image</type>          <title><![CDATA[Gary McMurray headshot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tbl35227.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tbl35227_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tbl35227_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tbl35227_0.jpg?itok=lVS2KDM5]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1497540557</created>          <gmt_created>2017-06-15 15:29:17</gmt_created>          <changed>1497540557</changed>          <gmt_changed>2017-06-15 15:29:17</gmt_changed>      </item>          <item>          <nid>592732</nid>          <type>image</type>          <title><![CDATA[Gary McMurray Expert Testimony]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Gary McMurray.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Gary%20McMurray.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Gary%20McMurray.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Gary%2520McMurray.jpg?itok=tR7kCZwz]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1497541102</created>          <gmt_created>2017-06-15 15:38:22</gmt_created>          <changed>1497541102</changed>          <gmt_changed>2017-06-15 15:38:22</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="155"><![CDATA[Congressional Testimony]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="155"><![CDATA[Congressional Testimony]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="78271"><![CDATA[IRIM]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>       
   <keyword tid="669"><![CDATA[agriculture]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="106361"><![CDATA[Business and Economic Development]]></topic>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="592685">  <title><![CDATA[Robot Uses Deep Learning and Big Data to Write and Play its Own Music]]></title>  <uid>27560</uid>  <body><![CDATA[<p>A marimba-playing robot with four arms and eight sticks is writing and playing its own compositions in a lab at the Georgia Institute of Technology. The pieces are generated using artificial intelligence and deep learning.</p><p>Researchers fed the robot nearly 5,000 complete songs &mdash; from Beethoven to the Beatles to Lady Gaga to Miles Davis &mdash; and more than 2 million motifs, riffs and licks of music. Aside from giving the machine a seed, or the first four measures to use as a starting point, no humans are involved in either the composition or the performance of the music.</p><p>The first two compositions are roughly 30 seconds in length. The robot, named Shimon, can be seen and heard playing them <a href="https://www.youtube.com/watch?v=j82nYLOnKtM">here</a> and <a href="https://www.youtube.com/watch?v=6MSk5PP9KUA">here</a>.</p><p>Ph.D. student Mason Bretan is the man behind the machine. He&rsquo;s worked with Shimon for seven years, enabling it to &ldquo;listen&rdquo; to music played by humans and improvise over pre-composed chord progressions. Now Shimon is a solo composer for the first time, generating the melody and harmonic structure on its own.</p><p>&ldquo;Once Shimon learns the four measures we provide, it creates its own sequence of concepts and composes its own piece,&rdquo; said Bretan, who will receive his doctorate in music technology this summer at Georgia Tech. &ldquo;Shimon&rsquo;s compositions represent how music sounds and looks when a robot uses deep neural networks to learn everything it knows about music from millions of human-made segments.&rdquo;</p><p>Bretan says this is the first time a robot has used deep learning to create music. And unlike its days of improvising, when it played monophonically, Shimon is able to play harmonies and chords. It&rsquo;s also thinking much more like a human musician, focusing less on the next note, as it did before, and more on the overall structure of the composition. &nbsp;</p><p>&ldquo;When we play or listen to music, we don&rsquo;t think about the next note and only that next note,&rdquo; said Bretan. &ldquo;An artist has a bigger idea of what he or she is trying to achieve within the next few measures or later in the piece. Shimon is now coming up with higher-level musical semantics. Rather than thinking note by note, it has a larger idea of what it wants to play as a whole.&rdquo;</p><p>Shimon was created by Bretan&rsquo;s advisor, Gil Weinberg, director of Georgia Tech&rsquo;s Center for Music Technology.</p><p>&ldquo;This is a leap in Shimon&rsquo;s musical quality because it&rsquo;s using deep learning to create a more structured and coherent composition,&rdquo; said Weinberg, a professor in the School of Music. 
&ldquo;We want to explore whether robots could become musically creative and generate new music that we humans could find beautiful, inspiring and strange.&rdquo;</p><p>Shimon will create more pieces in the future. As long as the researchers feed it a different seed, the robot will produce something different each time &mdash; music that the researchers can&rsquo;t predict. In the first piece, Bretan fed Shimon a melody composed of eighth notes. It received a sixteenth-note melody the second time, which influenced it to generate faster note sequences.</p><p>Bretan acknowledges that he can&rsquo;t pick out individual songs that Shimon is referencing. He is able to recognize classical chord progressions and the influences of artists such as Mozart.<br /><br />&ldquo;They sound like a fusion of jazz and classical,&rdquo; said Bretan, who plays the keyboards and guitar in his free time. &ldquo;I definitely hear more classical, especially in the harmony. But then I hear chromatic moving steps in the first piece &mdash; that&rsquo;s definitely something you hear in jazz.&rdquo;</p><p>Shimon&rsquo;s debut as a solo composer was featured in a video clip in the Consumer Electronics Show (CES) keynote and will have its first live performance at the <a href="https://www.aspenideas.org/">Aspen Ideas Festival</a> at the end of June. It&rsquo;s the latest project within Weinberg&rsquo;s lab. He and his students have also created a <a href="http://www.news.gatech.edu/2014/03/05/robotic-prosthesis-turns-drummer-three-armed-cyborg">robotic prosthesis for a drummer</a>, a <a href="http://www.news.gatech.edu/2016/02/17/wearable-robot-transforms-musicians-three-armed-drummers">robotic third arm for all drummers</a>, and an <a href="https://www.youtube.com/watch?v=3ShaUMM0H-g">interactive robotic companion that plays music from a phone and dances to the beat</a>.</p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1497387497</created>  <gmt_created>2017-06-13 20:58:17</gmt_created>  <changed>1497387497</changed>  <gmt_changed>2017-06-13 20:58:17</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A marimba-playing robot with four arms and eight sticks is writing and playing its own compositions in the School of Music.]]></teaser>  <type>news</type>  <sentence><![CDATA[A marimba-playing robot with four arms and eight sticks is writing and playing its own compositions in the School of Music.]]></sentence>  <summary><![CDATA[<p>Researchers fed a robot nearly 5,000 complete songs &mdash; from Beethoven to the Beatles to Lady Gaga to Miles Davis &mdash; and more than 2 million motifs, riffs and licks of music. 
The four-armed, marimba-playing machine is using deep learning to write and play its own music.</p>]]></summary>  <dateline>2017-06-13T00:00:00-04:00</dateline>  <iso_dateline>2017-06-13T00:00:00-04:00</iso_dateline>  <gmt_dateline>2017-06-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Compositions created using database of well-known pop, classical and jazz artists]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />National Media Relations<br />maderer@gatech.edu<br />404-660-2926</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>592682</item>          <item>592683</item>      </media>  <hg_media>          <item>          <nid>592682</nid>          <type>image</type>          <title><![CDATA[Shimon  ]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[10C2064-P1-005.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/10C2064-P1-005.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/10C2064-P1-005.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/10C2064-P1-005.jpg?itok=06wfkLkY]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Shimon]]></image_alt>                    <created>1497386963</created>          <gmt_created>2017-06-13 20:49:23</gmt_created>          <changed>1497386963</changed>          <gmt_changed>2017-06-13 20:49:23</gmt_changed>      </item>          <item>          <nid>592683</nid>          <type>image</type>          <title><![CDATA[Shimon, Musical Robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[10C2064-P1-039.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/10C2064-P1-039.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/10C2064-P1-039.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/10C2064-P1-039.jpg?itok=tTUfgGnh]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Shimon ]]></image_alt>                    <created>1497387116</created>          <gmt_created>2017-06-13 20:51:56</gmt_created>          <changed>1497387116</changed>          <gmt_changed>2017-06-13 20:51:56</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.gtcmt.gatech.edu/]]></url>        <title><![CDATA[Center for Music Technology]]></title>      </link>          <link>        <url><![CDATA[http://www.news.gatech.edu/2014/03/05/robotic-prosthesis-turns-drummer-three-armed-cyborg]]></url>        <title><![CDATA[Robotic Prosthesis for Drummers]]></title>      </link>          <link>        <url><![CDATA[http://www.news.gatech.edu/2016/02/17/wearable-robot-transforms-musicians-three-armed-drummers]]></url>        <title><![CDATA[Robotic Third Arm for All Drummers]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>          <group id="1221"><![CDATA[College of Design]]></group>          <group id="60381"><![CDATA[CMT - Center for Music Technology]]></group>          
<group id="1227"><![CDATA[School of Music]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="169304"><![CDATA[Shimon]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="167096"><![CDATA[school of music]]></keyword>          <keyword tid="1939"><![CDATA[Gil Weinberg]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>          <topic tid="71901"><![CDATA[Society and Culture]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="590743">  <title><![CDATA[Swarms of Autonomous Aerial Vehicles Test New Dogfighting Skills]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Aerial dogfighting began more than a century ago in the skies over Europe with propeller-driven fighter aircraft carried aloft on wings of fabric and wood. An event held recently in southern California could mark the beginning of a new chapter in this form of aerial combat.</p><p>In what may have been the first aerial encounter of its kind, researchers from the <a href="http://www.gtri.gatech.edu">Georgia Tech Research Institute</a> and Naval Postgraduate School recently pitted two swarms of autonomous aircraft against one another over a military test facility. While the friendly encounter may not have qualified as an old-fashioned dogfight, it provided the first example of a live engagement between two swarms of unmanned air vehicles (UAVs), and allowed the two teams to demonstrate different combat tactics in flight.</p><p>&ldquo;The ability to engage a swarm of threat UAVs with another autonomous swarm is an area of critical research for defense applications,&rdquo; said Don Davis, division chief of the Robotics and Autonomous Systems Branch of the Georgia Tech Research Institute. &ldquo;This experiment demonstrated the advances made in collaborative autonomy and the ability of a team of unmanned vehicles to execute complex missions. This encounter will serve to advance and inform future efforts in developing autonomous vehicle capabilities.&rdquo;</p><p>Each team launched ten small propeller-driven Zephyr aircraft, though two of the aircraft experienced technical issues at launch and were unable to compete, resulting in a 10 versus 8 competition. Although the UAVs were physically identical, their computers used different autonomy logic, collaboration approaches, and communications software developed by the two institutions. GPS tracking allowed each aircraft to know the location of the others for this demonstration. In the future, this information will be provided by on-board cameras, radars, and other sensors and payloads.&nbsp;</p><p>Each aircraft used a single-board mission computer, and for this demonstration, an open-source autopilot maintained flight control. 
The aircraft also had Wi-Fi systems that allowed them to communicate with other aircraft and with a ground station.</p><p>&ldquo;Both teams were trying to solve the same problem of flying a large swarm in a meaningful mission, and we came up with solutions that were similar in some ways and different in others,&rdquo; said Charles Pippin, a senior research scientist at the Georgia Tech Research Institute. &ldquo;By comparing how well each approach worked in the air, we were able to compare strategies and tactics on platforms capable of the same flight dynamics.&rdquo;</p><p>The foam-wing aircraft couldn&rsquo;t actually shoot at one another, so a ground computer determined when an aircraft would have been in a position to attack another aircraft. The swarm teams flew three different sorties to compare different algorithms. The event took place February 9, 2017, at Camp Roberts, a California National Guard facility in Monterey County, Calif.&nbsp;</p><p>The two institutions have been working together since 2015 on issues involving collaborative autonomy &ndash; the ability of autonomous vehicles to work together to accomplish a given task. The Georgia Tech researchers have been using aircraft known as Skywalkers that are similar to the Zephyrs used by the Naval Postgraduate School.</p><p>&ldquo;This was a very successful test,&rdquo; said Davis. &ldquo;It gave us, as far as I know, the first actual experimentation of flying two autonomous swarms of UAVs against one another with no human control, other than sending high level commands or sending a message to engage. We were really trying to understand how different autonomy tactics work against other autonomy tactics.&rdquo;&nbsp;</p><p>For each UAV, the autonomy algorithms were fully in control of the aircraft, but a safety pilot stood by to take control of any aircraft if necessary. &nbsp;The autopilots also had built-in safety constraints, such as airspace boundaries and ranges.</p><p>Such aerial demonstrations are the third step in the process that the Georgia Tech team uses to test its autonomy systems, Pippin said. As a first step, tactics are rapidly tested on a simulator that runs 30 times faster than real time. Next, promising approaches are tested on a full software stack that includes a high-resolution simulation.</p><p>&ldquo;We run hardware-in-the-loop simulations where we have the actual algorithms running on the hardware we fly,&rdquo; said Pippin. &ldquo;The full software stack includes the autonomy logic, communications systems, collaboration algorithms and other software that is then inserted directly into the actual aircraft. In the third step, the tactics are flown on the aircraft on test ranges. In this case, we used the Zephyrs and flew the swarms at Camp Roberts.&rdquo;</p><p>The Georgia Tech researchers are using machine learning to help their autonomy system optimize performance and recognize under which circumstances a particular tactic may be advantageous.&nbsp;</p><p>&ldquo;Right now, we&rsquo;re more interested in the research questions about autonomous coordination among the vehicles and the tactical behavior of the groups of vehicles,&rdquo; Pippin explained. &ldquo;We are focusing our efforts on how these vehicles cooperate and want to understand what it means for them to operate as a team.&rdquo;</p><p>Dogfighting tactics have advanced dramatically since World War I, but the advent of UAV swarms may bring a brand new set of challenges.</p>
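<p>The &ldquo;position to attack&rdquo; determination described above can be imagined as a simple geometric test on the shared GPS tracks. The sketch below is purely illustrative &ndash; the range and cone-angle thresholds are invented placeholders, not the demonstration&rsquo;s actual scoring rules:</p><pre>
import numpy as np

# Hypothetical "would-have-attacked" test over shared GPS positions:
# the target must be within range and inside the attacker's forward cone.
def in_attack_position(attacker_pos, attacker_heading, target_pos,
                       max_range=50.0, cone_half_angle_deg=20.0):
    to_target = np.asarray(target_pos, float) - np.asarray(attacker_pos, float)
    dist = np.linalg.norm(to_target)
    if dist == 0.0 or dist > max_range:
        return False
    heading = np.asarray(attacker_heading, float)
    heading = heading / np.linalg.norm(heading)
    cos_angle = np.dot(heading, to_target / dist)  # heading vs. line of sight
    return cos_angle >= np.cos(np.radians(cone_half_angle_deg))

print(in_attack_position([0, 0], [1, 0], [30, 5]))   # True: ~30 m ahead
print(in_attack_position([0, 0], [1, 0], [-30, 0]))  # False: behind attacker
</pre>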
<p>Unmanned vehicles have freedom to dive, bank, and climb at rates human pilots cannot tolerate. But the real advantage may be in computing power that could track dozens of adversaries &ndash; far more than any human pilot could do &ndash; and develop new ways to address challenges.</p><p>&ldquo;Autonomous techniques using machine learning may identify new tactics that a human would never think of,&rdquo; added Davis. &ldquo;Humans tend to base their techniques on tactics that manned fighters have used in the past. These autonomous aircraft may invoke new strategies.&rdquo;</p><p>In addition to those already named, the Georgia Tech Research Institute team that supported the swarm demonstration included Michael Day, Kevin DeMarco, David Jensen, Rick Presley, and Evan Hammac. Others supporting the project included Michael Matthews, Eric Squires, Rob Bever, Ethan&nbsp;Trewhitt, and students Laura Strickland, Avery Leonard, Natalie Rakoski, and Jeremy Feltracco.</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia &nbsp;30332-0181 &nbsp;USA</strong></p><p><strong>Media Relations Contacts</strong>: John Toon (404-894-6986) (jtoon@gatech.edu) or Ben Brumfield (404-385-1933) (ben.brumfield@comm.gatech.edu).</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1492781884</created>  <gmt_created>2017-04-21 13:38:04</gmt_created>  <changed>1493761128</changed>  <gmt_changed>2017-05-02 21:38:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers recently pitted two swarms of autonomous aircraft against one another.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers recently pitted two swarms of autonomous aircraft against one another.]]></sentence>  <summary><![CDATA[<p>In what may have been the first aerial encounter of its kind, researchers recently pitted two swarms of autonomous aircraft against one another over a military test facility. 
While the friendly encounter may not have qualified as an old-fashioned dogfight, it provided the first example of a live engagement between two swarms of unmanned air vehicles (UAVs), and allowed the two teams to demonstrate different combat tactics in flight.</p>]]></summary>  <dateline>2017-04-21T00:00:00-04:00</dateline>  <iso_dateline>2017-04-21T00:00:00-04:00</iso_dateline>  <gmt_dateline>2017-04-21 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>590739</item>          <item>590740</item>          <item>590741</item>          <item>590742</item>      </media>  <hg_media>          <item>          <nid>590739</nid>          <type>image</type>          <title><![CDATA[Launching autonomous aircraft]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autonomous-dogfight7.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autonomous-dogfight7.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/autonomous-dogfight7.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autonomous-dogfight7.jpg?itok=aL5vXVKA]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Launching autonomous aircraft for swarm demonstration]]></image_alt>                    <created>1492780974</created>          <gmt_created>2017-04-21 13:22:54</gmt_created>          <changed>1492780974</changed>          <gmt_changed>2017-04-21 13:22:54</gmt_changed>      </item>          <item>          <nid>590740</nid>          <type>image</type>          <title><![CDATA[Preparing autonomous aircraft for demonstration]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autonomous-dogfight1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autonomous-dogfight1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/autonomous-dogfight1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autonomous-dogfight1.jpg?itok=P1pLvnZu]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Preparing autonomous aircraft for flight ]]></image_alt>                    <created>1492781117</created>          <gmt_created>2017-04-21 13:25:17</gmt_created>          <changed>1492781294</changed>          <gmt_changed>2017-04-21 13:28:14</gmt_changed>      </item>          <item>          <nid>590741</nid>          <type>image</type>          <title><![CDATA[Autonomous aircraft group in flight]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autonomous-dogfight17.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autonomous-dogfight17.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/autonomous-dogfight17.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autonomous-dogfight17.jpg?itok=VgW0veVP]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Autonomous aircraft group in flight]]></image_alt>                    <created>1492781245</created>          <gmt_created>2017-04-21 13:27:25</gmt_created>          <changed>1492781245</changed>          <gmt_changed>2017-04-21 13:27:25</gmt_changed>      </item>          <item>          <nid>590742</nid>          <type>image</type>          <title><![CDATA[GTRI researchers preparing for autonomous aircraft demonstration]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autonomous-dogfight99.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autonomous-dogfight99.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/autonomous-dogfight99.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autonomous-dogfight99.jpg?itok=oSC36Bun]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[GTRI researchers preparing for autonomous aircraft demonstration]]></image_alt>                    <created>1492781412</created>          <gmt_created>2017-04-21 13:30:12</gmt_created>          <changed>1492781412</changed>          <gmt_changed>2017-04-21 13:30:12</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="415"><![CDATA[Georgia Tech Research Institute]]></keyword>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="1500"><![CDATA[UAV]]></keyword>          <keyword tid="174108"><![CDATA[autonomous aircraft]]></keyword>          <keyword tid="174109"><![CDATA[dogfighting]]></keyword>          <keyword tid="137281"><![CDATA[Military Technology]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>          <term tid="39481"><![CDATA[National Security]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="590772">  <title><![CDATA[Autism and Computing]]></title>  <uid>27948</uid>  <body><![CDATA[<p>Across Georgia Tech, researchers, faculty members, and students from every discipline are devoted to finding the causes of and effective treatments for autism.</p><p>Each week in April, we will publish more stories about&nbsp;our autism-related work.</p><h5>WEEK ONE: <a href="http://www.news.gatech.edu/features/bringing-autism-spectrum-focus#computing">Autism and Computing</a></h5>]]></body>  <author>Jennifer 
Tomasino</author>  <status>1</status>  <created>1492799785</created>  <gmt_created>2017-04-21 18:36:25</gmt_created>  <changed>1492799785</changed>  <gmt_changed>2017-04-21 18:36:25</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Autism research in computing runs the gamut from helping clinicians diagnose and manage the disorder to informing research in artificial intelligence.]]></teaser>  <type>news</type>  <sentence><![CDATA[Autism research in computing runs the gamut from helping clinicians diagnose and manage the disorder to informing research in artificial intelligence.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2017-04-11T00:00:00-04:00</dateline>  <iso_dateline>2017-04-11T00:00:00-04:00</iso_dateline>  <gmt_dateline>2017-04-11 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[mterraza@cc.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Michael Terrazas</strong><br />Director&nbsp;of Communications<br />Georgia Tech College of Computing<br />(o) 404.385.7225<br />(c) 404.245.0707<br /><a href="http://www.cc.gatech.edu">www.cc.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>590771</item>      </media>  <hg_media>          <item>          <nid>590771</nid>          <type>image</type>          <title><![CDATA[Autism and Computing]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autism-computing-mercury-thumb.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autism-computing-mercury-thumb.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/autism-computing-mercury-thumb.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autism-computing-mercury-thumb.jpg?itok=ptSfXLJS]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Autism and Computing]]></image_alt>                    <created>1492799749</created>          <gmt_created>2017-04-21 18:35:49</gmt_created>          <changed>1492799749</changed>          <gmt_changed>2017-04-21 18:35:49</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1300"><![CDATA[Institute Communications]]></group>          <group id="1214"><![CDATA[News Room]]></group>          <group id="47223"><![CDATA[College of Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="6053"><![CDATA[Autism]]></keyword>          <keyword tid="208"><![CDATA[computing]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71891"><![CDATA[Health and Medicine]]></topic>      </news_room_topics>  <files></files>  <related></related> 
 <userdata><![CDATA[]]></userdata></node><node id="550391">  <title><![CDATA[Robot Helps Study How First Land Animals Moved 360 Million Years Ago]]></title>  <uid>27303</uid>  <body><![CDATA[<p>When early terrestrial animals began moving about on mud and sand 360 million years ago, the powerful tails they used as fish may have been more important than scientists previously realized. That’s one conclusion from a new study of African mudskipper fish and a robot modeled on the animal.</p><p>Animals analogous to the mudskipper would have used modified fins to move around on flat surfaces, but for climbing sandy slopes, the animals could have benefitted from using their tails to propel themselves forward, the researchers found. Results of the study, reported July 8 in the journal <em>Science</em>, could help designers create amphibious robots able to move across granular surfaces more efficiently – and with less likelihood of getting stuck in the mud.</p><p>Sponsored by the National Science Foundation, the Army Research Office and the Army Research Laboratory, the project involved a multidisciplinary team of physicists, biologists and roboticists from the Georgia Institute of Technology, Clemson University and Carnegie Mellon University. In addition to a detailed study of the mudskipper and development of a robot model that used the animal’s locomotion techniques, the study also examined flow and drag conditions in representative granular materials, and applied a mathematical model incorporating new physics based on the drag research.</p><p>“Most robots have trouble moving on terrain that includes sandy slopes,” said Dan Goldman, an associate professor in the Georgia Tech School of Physics. “We noted that not only did the mudskippers use their limbs to propel themselves in a kind of crutching motion on sand and sandy slopes, but that when the going got tough, they used their tails in concert with limb propulsion to ascend a slope. Our robot model was only able to climb sandy slopes when it similarly used its tail in coordination with its appendages.”</p><p>Based on fossil records, scientists have long studied how early land animals may have gotten around, and the new study suggests their tails – which played a key role in swimming as fish – may have helped supplement the work of fins, especially on sloping granular surfaces such as beaches and mudflats.</p><p>“We were interested in examining one of the most important evolutionary events in our history as animals: the transition from living in water to living on land,” said Richard Blob, alumni distinguished professor of biological sciences at Clemson University. “Because of the focus on limbs, the role of the tail may not have been considered very strongly in the past. In some ways, it was hiding in plain sight. Some of the features that the animals used were new, such as limbs, but some of them were existing features that they simply co-opted to allow them to move into a new habitat.”</p><p>With Ph.D. student Sandy Kawano, now a researcher at the National Institute for Mathematical and Biological Synthesis, Blob’s lab recorded how the mudskippers (<em>Periophthalmus barbarus</em>) moved on a variety of loose surfaces, providing data and video to Goldman’s laboratory.</p>
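<p>The drag measurements mentioned above suggest a simple way to picture slope climbing: a force balance in which limb and tail thrust work against gravity and granular drag. The sketch below is purely illustrative – every number is a hypothetical placeholder rather than a value from the study, apart from the 20-degree slope discussed here – but it shows why limbs alone can stall on an incline while an added tail kick yields net progress:</p><pre>
import math

# Toy force balance on a granular slope; all values are hypothetical.
def net_uphill_force(limb_thrust, tail_thrust, mass=0.02,
                     slope_deg=20.0, granular_drag=0.08, g=9.81):
    """Net uphill force (newtons) on an inclined granular bed."""
    gravity_component = mass * g * math.sin(math.radians(slope_deg))
    return limb_thrust + tail_thrust - gravity_component - granular_drag

# Limbs alone come up short; adding a tail kick turns the balance positive.
print(f"limbs only: {net_uphill_force(0.10, 0.00):+.3f} N")
print(f"limbs+tail: {net_uphill_force(0.10, 0.08):+.3f} N")
</pre>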
<p>The small fish, which uses its front fins and tail to move on land, lives in tidal areas near shore, spending time in the water and on sandy and muddy surfaces.</p><p>Benjamin McInroe was a Georgia Tech undergraduate when he analyzed the mudskipper data provided by the Clemson team. He applied the principles to a robot model known as MuddyBot that has two limbs and a powerful tail, with motion provided by electric motors. Information from both the mudskipper and robotic studies was also factored into a mathematical model provided by researchers at Carnegie Mellon University.</p><p>“We used three complementary approaches,” said McInroe, who is now a Ph.D. student at the University of California, Berkeley. “The fish provided a morphological, functional model of these early walkers. With the robot, we are able to simplify the complexity of the mudskipper and by varying the parameters, understand the physical mechanisms of what was happening. With the mathematical model and its simulations, we were able to understand the physics behind what was going on.”</p><p>Both the mudskippers and the robot moved by lifting themselves up to reduce drag on their bodies, and both needed a kick from their tails to climb 20-degree sandy slopes. Using their “fins” alone, both struggled to climb slopes and often slid backward, McInroe noted. Early land animals likely didn’t have precise control over their limbs, and the tail may have compensated for that limitation, helping the animals ascend sandy slopes.</p><p>The Carnegie Mellon University researchers, who have worked with Goldman on relating the locomotion of other animals to robots, demonstrated that theoretical models developed to describe the complex motion of robots can also be used to understand locomotion in the natural world.</p><p>“Our computer modeling tools allow us to visualize, and therefore better understand, how the mudskipper incorporates its tail and flipper motions to locomote,” said Howie Choset, a professor in the Robotics Institute at Carnegie Mellon University. “This work also will advance robotics in those cases where a robot needs to surmount challenging terrains with various inclinations.”</p><p>The model was based on a framework for broadly understanding locomotion proposed in the 1980s by physicist Frank Wilczek – a Nobel Prize winner – and his then-student Alfred Shapere. The so-called “geometric mechanics” approach to locomotion of human-made devices (like satellites) was largely developed by engineers, including those in Choset’s group. To provide force relationships as inputs to the mudskipper robot model, Georgia Tech postdoctoral fellow Jennifer Rieser and Georgia Tech graduate student Perrin Schiebel measured drag in inclined granular materials.</p><p>Information from the study could help in the design of robots that may need to move on surfaces such as sand that flows around limbs, said Goldman. Such flow of the substrate can impede motion, depending on the shape of the appendage entering the sand and the type of motion.</p><p>But the study’s most significant impact may be to provide new insights into how vertebrates made the transition from water to land.</p><p>“We want to ultimately know how natural selection can act to modify structures already present in organisms to allow for locomotion in a fundamentally different environment,” Goldman said. 
“Swimming and walking on land are fundamentally different, yet these early animals had to make the transition.”</p><p>The project also represents a combination of physics, biology and engineering.</p><p>“Professor Goldman and his collaborators are combining physics and engineering prototyping approaches to understand the physical principles that allow animals to move in different environments,” said Krastan Blagoev, program director in the National Science Foundation’s Division of Physics. “This novel approach to living organisms promises to bring to biological sciences higher predictive power and at the same time uncover engineering principles that we have never imagined before.”</p><p>In addition to those already mentioned, the project also included co-first author Henry Astley, a Georgia Tech postdoctoral researcher at the time of the research, and Chaohui Gong, a postdoctoral researcher at Carnegie Mellon University.</p><p><em>This research was supported by the National Science Foundation and the NSF Physics of Living Systems program through grants PHY-1205878, PHY-1150760, and CMMI-1361778; the Army Research Office through grant W911NF-11-1-0514; and the Army Research Laboratory MAST CTA program. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation, the Army Research Office, or the Army Research Laboratory. The Robotics Collaborative Technology Alliance also supported this work.</em></p><p><strong>CITATION</strong>: Benjamin McInroe et al., “Tail use improves soft substrate performance in models of early vertebrate land locomotors,” Science (2016).</p><p><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia 30332-0181 USA</strong></p><p><strong>Media Contacts</strong>: John Toon (404-894-6986) (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>) or Ben Brumfield (404-385-1933) (<a href="mailto:ben.brumfield@comm.gatech.edu">ben.brumfield@comm.gatech.edu</a>).</p><p><strong>Writer</strong>: John Toon</p><p>&nbsp;</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1467631167</created>  <gmt_created>2016-07-04 11:19:27</gmt_created>  <changed>1475896924</changed>  <gmt_changed>2016-10-08 03:22:04</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A new study used a robot to help understand how the first land animals moved about.]]></teaser>  <type>news</type>  <sentence><![CDATA[A new study used a robot to help understand how the first land animals moved about.]]></sentence>  <summary><![CDATA[<p>When early terrestrial animals began moving about on mud and sand 360 million years ago, the powerful tails they used as fish may have been more important than scientists previously realized.
That’s one conclusion from a new study of African mudskipper fish and a robot modeled on the animal.</p>]]></summary>  <dateline>2016-07-07T00:00:00-04:00</dateline>  <iso_dateline>2016-07-07T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-07-07 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>550231</item>          <item>550261</item>          <item>550271</item>          <item>550331</item>          <item>550291</item>          <item>550311</item>          <item>550351</item>      </media>  <hg_media>          <item>          <nid>550231</nid>          <type>image</type>          <title><![CDATA[Mudskipper]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[mudskipper10.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/mudskipper10.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/mudskipper10.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/mudskipper10.jpg?itok=ZVnzTdQ5]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mudskipper]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>          <item>          <nid>550261</nid>          <type>image</type>          <title><![CDATA[MuddyBot robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[terrestrial-animals7.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/terrestrial-animals7_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/terrestrial-animals7_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/terrestrial-animals7_0.jpg?itok=otWbE-Gu]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[MuddyBot robot]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>          <item>          <nid>550271</nid>          <type>image</type>          <title><![CDATA[Dan Goldman and MuddyBot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[muddybot-36.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/muddybot-36.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/muddybot-36.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/muddybot-36.jpg?itok=07xUBdZ1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Dan Goldman and MuddyBot]]></image_alt>                 
   <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>          <item>          <nid>550331</nid>          <type>image</type>          <title><![CDATA[MuddyBot in trackway]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[terrestrial-animals8.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/terrestrial-animals8.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/terrestrial-animals8.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/terrestrial-animals8.jpg?itok=fnQoChPl]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[MuddyBot in trackway]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>          <item>          <nid>550291</nid>          <type>image</type>          <title><![CDATA[Researchers and MuddyBot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[terrestrial-animals6.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/terrestrial-animals6_2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/terrestrial-animals6_2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/terrestrial-animals6_2.jpg?itok=qILNvOTR]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Researchers and MuddyBot]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>          <item>          <nid>550311</nid>          <type>image</type>          <title><![CDATA[Researchers and MuddyBot2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[terrestrial-animals5.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/terrestrial-animals5.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/terrestrial-animals5.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/terrestrial-animals5.jpg?itok=feMjnVsV]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Researchers and MuddyBot2]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>          <item>          <nid>550351</nid>          <type>image</type>          <title><![CDATA[Mudskipper2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[mudskipper9.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/mudskipper9.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/mudskipper9.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/mudskipper9.jpg?itok=KsMzVeSH]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mudskipper2]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="47881"><![CDATA[Dan Goldman]]></keyword>          <keyword tid="144361"><![CDATA[granular surface]]></keyword>          <keyword tid="170448"><![CDATA[MuddyBot]]></keyword>          <keyword tid="170449"><![CDATA[mudskipper]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="166937"><![CDATA[School of Physics]]></keyword>          <keyword tid="170451"><![CDATA[terrestrial animal]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="525971">  <title><![CDATA[Researchers Work to Avoid Potholes and Pitfalls on the Road to Autonomous Vehicles]]></title>  <uid>27303</uid>  <body><![CDATA[<p>At a dirt test track near the Georgia Institute of Technology campus, researchers monitor a scale-model autonomous car as it drifts around corners at a blistering eight meters per second – equivalent to 90 miles per hour in a full-size vehicle. Pushing this car to its limits could help make full-size driverless vehicles more stable in risky road conditions.</p><p>This unique one-fifth-scale device is just one of many research efforts aimed at helping the autonomous vehicle revolution happen successfully and safely.</p><p>Self-driving cars are unquestionably coming, guided variously by radar, lidar, motion sensors, cameras, GPS, and plenty of onboard computation. Already, semi-autonomous prototypes are operating under controlled conditions in California, and speculation about future autonomy includes visions of commuters napping through drive-time, high-speed convoys of networked big-rigs, and a huge drop in accidents as robotic vehicles take over from impaired and distracted humans.</p><p>Yet these are only visions, where generalizations rule and few facts are established. 
At Georgia Tech, research focuses on the elusive but critical details of this phenomenon, as investigators from disciplines as diverse as industrial systems, design, engineering, computing, and psychology are developing a roadmap to robotic vehicles.</p><p>Researchers at Georgia Tech generally agree that a long period of adjustment, including generations of semi-autonomous vehicles, will be needed to reach completely autonomous transport on a large scale. Estimates of the time required vary from a couple of decades to more than half a century.</p><p>“Fully autonomous transport will require absolutely reliable navigation systems, major changes in highway infrastructure, and traffic control that’s synched to the vehicle, plus new fueling, insurance, financing, and manufacturing paradigms,” said Vivek Ghosal, a professor in Georgia Tech’s School of Economics, who studies the automotive industry. “Yes, we have prototypes, but the operationalizing of autonomy is still far away.”</p><p>A four-level model of the vehicular-automation process is now widely accepted. Level one denotes today’s driver-dependent cars; level two involves intelligent cruise and lane control with some automatic braking; level three indicates semi-autonomous vehicles that drive themselves but cede control to a human when conditions demand; and level four means fully autonomous with no driver controls.</p><p>Researchers at Georgia Tech, focusing on the gritty details, have spotlighted a list of complications that include:</p><ul><li>Human-machine interaction issues.</li><li>Costly highway infrastructure changes.</li><li>Unpredictable traffic effects.</li><li>Conflicts between self-driving and human-driven vehicles.</li><li>Guidance system reliability concerns.</li><li>Vehicle ownership, liability, and business model shifts.</li><li>Potential for major changes to the urban landscape.</li></ul><p>This article takes a look at some of the research currently underway at Georgia Tech related to self-driving vehicles.</p><p>Read the <a href="http://www.rh.gatech.edu/features/rolling-robots">complete feature</a> on the Research Horizons website.</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1460929371</created>  <gmt_created>2016-04-17 21:42:51</gmt_created>  <changed>1475896881</changed>  <gmt_changed>2016-10-08 03:21:21</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers are focusing on the details of bringing autonomous vehicles to reality.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers are focusing on the details of bringing autonomous vehicles to reality.]]></sentence>  <summary><![CDATA[<p>At Georgia Tech, research focuses on the elusive but critical details of bringing autonomous vehicles to reality, as investigators from disciplines as diverse as industrial systems, design, engineering, computing, and psychology are developing a roadmap to robotic vehicles.</p>]]></summary>  <dateline>2016-04-17T00:00:00-04:00</dateline>  <iso_dateline>2016-04-17T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-04-17 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>
<item>526121</item>          <item>526111</item>      </media>  <hg_media>          <item>          <nid>526121</nid>          <type>image</type>          <title><![CDATA[Autonomous Racing Car]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autonomous_race.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autonomous_race_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/autonomous_race_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autonomous_race_0.jpg?itok=5nAinNav]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Autonomous Racing Car]]></image_alt>                    <created>1461078000</created>          <gmt_created>2016-04-19 15:00:00</gmt_created>          <changed>1475895298</changed>          <gmt_changed>2016-10-08 02:54:58</gmt_changed>      </item>          <item>          <nid>526111</nid>          <type>image</type>          <title><![CDATA[Valerie Thomas at Substation]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[valerie-thomas.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/valerie-thomas_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/valerie-thomas_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/valerie-thomas_1.jpg?itok=unxYVcgx]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Valerie Thomas at Substation]]></image_alt>                    <created>1461078000</created>          <gmt_created>2016-04-19 15:00:00</gmt_created>          <changed>1475895298</changed>          <gmt_changed>2016-10-08 02:54:58</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="137"><![CDATA[Architecture]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="137"><![CDATA[Architecture]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="6503"><![CDATA[automation]]></keyword>          <keyword tid="97281"><![CDATA[autonomous vehicles]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="171930"><![CDATA[self-driving]]></keyword>      </keywords>  <core_research_areas>          <term tid="145171"><![CDATA[Cybersecurity]]></term>          <term tid="39531"><![CDATA[Energy and Sustainable Infrastructure]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>          <term tid="39541"><![CDATA[Systems]]></term>      </core_research_areas>  <news_room_topics>          <topic 
tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="507361">  <title><![CDATA[In Emergencies, Should You Trust a Robot?]]></title>  <uid>27303</uid>  <body><![CDATA[<p>In emergencies, people may trust robots too much for their own safety, a new study suggests. In a mock building fire, test subjects followed instructions from an “Emergency Guide Robot” even after the machine had proven itself unreliable – and after some participants were told that robot had broken down.</p><p>The research was designed to determine whether or not building occupants would trust a robot designed to help them evacuate a high-rise in case of fire or other emergency. But the researchers were surprised to find that the test subjects followed the robot’s instructions – even when the machine’s behavior should not have inspired trust.</p><p>The research, believed to be the first to study human-robot trust in an emergency situation, is scheduled to be presented March 9 at the 2016 ACM/IEEE International Conference on Human-Robot Interaction (HRI 2016) in Christchurch, New Zealand.</p><p>“People seem to believe that these robotic systems know more about the world than they really do, and that they would never make mistakes or have any kind of fault,” said Alan Wagner, a senior research engineer in the <a href="http://www.gtri.gatech.edu/">Georgia Tech Research Institute</a> (GTRI). “In our studies, test subjects followed the robot’s directions even to the point where it might have put them in danger had this been a real emergency.”</p><p>In the study, sponsored in part by the Air Force Office of Scientific Research (AFOSR), the researchers recruited a group of 42 volunteers, most of them college students, and asked them to follow a brightly colored robot that had the words “Emergency Guide Robot” on its side. The robot led the study subjects to a conference room, where they were asked to complete a survey about robots and read an unrelated magazine article. The subjects were not told the true nature of the research project.</p><p>In some cases, the robot – which was controlled by a hidden researcher – led the volunteers into the wrong room and traveled around in a circle twice before entering the conference room. For several test subjects, the robot stopped moving, and an experimenter told the subjects that the robot had broken down. Once the subjects were in the conference room with the door closed, the hallway through which the participants had entered the building was filled with artificial smoke, which set off a smoke alarm.</p><p>When the test subjects opened the conference room door, they saw the smoke – and the robot, which was then brightly-lit with red LEDs and white “arms” that served as pointers. The robot directed the subjects to an exit in the back of the building instead of toward the doorway – marked with exit signs – that had been used to enter the building.</p><p>“We expected that if the robot had proven itself untrustworthy in guiding them to the conference room, that people wouldn’t follow it during the simulated emergency,” said Paul Robinette, a GTRI research engineer who conducted the study as part of his doctoral dissertation. “Instead, all of the volunteers followed the robot’s instructions, no matter how well it had performed previously. 
We absolutely didn’t expect this.”</p><p>The researchers surmise that in the scenario they studied, the robot may have become an “authority figure” that the test subjects were more likely to trust under the time pressure of an emergency. In simulation-based research done without a realistic emergency scenario, test subjects did not trust a robot that had previously made mistakes.</p><p>“These are just the type of human-robot experiments that we as roboticists should be investigating,” said <a href="https://www.ece.gatech.edu/faculty-staff-directory/ayanna-maccalla-howard">Ayanna Howard</a>, professor and Linda J. and Mark C. Smith Chair in the Georgia Tech <a href="http://www.ece.gatech.edu/">School of Electrical and Computer Engineering</a>. “We need to ensure that our robots, when placed in situations that evoke trust, are also designed to mitigate that trust when trust is detrimental to the human.”</p><p>Only when the robot made obvious errors during the emergency part of the experiment did the participants question its directions. In those cases, some subjects still followed the robot’s instructions even when it directed them toward a darkened room that was blocked by furniture.</p><p>In future research, the scientists hope to learn more about why the test subjects trusted the robot, whether that response differs by education level or demographics, and how the robots themselves might indicate the level of trust that should be given to them.</p><p>The research is part of a long-term study of how humans trust robots, an important issue as robots play a greater role in society. The researchers envision using groups of robots stationed in high-rise buildings to point occupants toward exits and urge them to evacuate during emergencies. Research has shown that people often don’t leave buildings when fire alarms sound, and that they sometimes ignore nearby emergency exits in favor of more familiar building entrances.</p><p>But in light of these findings, the researchers are reconsidering the questions they should ask.</p><p>“We wanted to ask the question about whether people would be willing to trust these rescue robots,” said Wagner. “A more important question now might be to ask how to prevent them from trusting these robots too much.”</p><p>Beyond emergency situations, there are other issues of trust in human-robot relationships, said Robinette.</p><p>“Would people trust a hamburger-making robot to provide them with food?” he asked. “If a robot carried a sign saying it was a ‘child-care robot,’ would people leave their babies with it? Will people put their children into an autonomous vehicle and trust it to take them to grandma’s house? We don’t know why people trust or don’t trust machines.”</p><p>In addition to those already mentioned, the research included Wenchen Li and Robert Allen, graduate research assistants in Georgia Tech’s College of Computing. The researchers would like to thank Larry Labbe and the Georgia Tech Fire Safety Office for their support during this research.</p><p><em>Support for this research was provided by the Linda J. and Mark C. Smith Chair in Bioengineering, and the Air Force Office of Scientific Research (AFOSR) under contract FA9550-13-1-0169. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the AFOSR.</em></p><p><strong>CITATION</strong>: Paul Robinette, Wenchen Li, Robert Allen, Ayanna M. Howard and Alan R.
Wagner, “Overtrust of Robots in Emergency Evacuation Scenarios,” 2016 ACM/IEEE International Conference on Human-Robot Interaction (HRI 2016).</p><p><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia 30332-0181 USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).<br /><strong>Writer</strong>: John Toon</p><p>&nbsp;</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1456744813</created>  <gmt_created>2016-02-29 11:20:13</gmt_created>  <changed>1475896853</changed>  <gmt_changed>2016-10-08 03:20:53</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[In emergencies, people may trust robots too much, a new study has found.]]></teaser>  <type>news</type>  <sentence><![CDATA[In emergencies, people may trust robots too much, a new study has found.]]></sentence>  <summary><![CDATA[<p>In emergencies, people may trust robots too much for their own safety, a new study suggests. In a mock building fire, test subjects followed instructions from an “Emergency Guide Robot” even after the machine had proven itself unreliable – and after some participants were told that the robot had broken down.&nbsp;</p>]]></summary>  <dateline>2016-02-29T00:00:00-05:00</dateline>  <iso_dateline>2016-02-29T00:00:00-05:00</iso_dateline>  <gmt_dateline>2016-02-29 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>507241</item>          <item>507251</item>          <item>507271</item>          <item>507281</item>          <item>507291</item>          <item>507311</item>      </media>  <hg_media>          <item>          <nid>507241</nid>          <type>image</type>          <title><![CDATA[Trusting a Rescue Robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[rescue-robot4.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/rescue-robot4_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/rescue-robot4_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/rescue-robot4_0.jpg?itok=oMpjhX6-]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Trusting a Rescue Robot]]></image_alt>                    <created>1456765200</created>          <gmt_created>2016-02-29 17:00:00</gmt_created>          <changed>1475895268</changed>          <gmt_changed>2016-10-08 02:54:28</gmt_changed>      </item>          <item>          <nid>507251</nid>          <type>image</type>          <title><![CDATA[Trusting a Rescue Robot2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[rescue-robot6.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/rescue-robot6_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/rescue-robot6_1.jpg]]></image_full_path>
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/rescue-robot6_1.jpg?itok=jxTJw9wS]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Trusting a Rescue Robot2]]></image_alt>                    <created>1456765200</created>          <gmt_created>2016-02-29 17:00:00</gmt_created>          <changed>1475895268</changed>          <gmt_changed>2016-10-08 02:54:28</gmt_changed>      </item>          <item>          <nid>507271</nid>          <type>image</type>          <title><![CDATA[Trusting a Rescue Robot3]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[rescue-robot2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/rescue-robot2_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/rescue-robot2_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/rescue-robot2_0.jpg?itok=8hR0n_Yt]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Trusting a Rescue Robot3]]></image_alt>                    <created>1456765200</created>          <gmt_created>2016-02-29 17:00:00</gmt_created>          <changed>1475895268</changed>          <gmt_changed>2016-10-08 02:54:28</gmt_changed>      </item>          <item>          <nid>507281</nid>          <type>image</type>          <title><![CDATA[Rescue Robot pointing]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[rescue-robot9.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/rescue-robot9_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/rescue-robot9_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/rescue-robot9_0.jpg?itok=OAdGjbfC]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Rescue Robot pointing]]></image_alt>                    <created>1456765200</created>          <gmt_created>2016-02-29 17:00:00</gmt_created>          <changed>1475895263</changed>          <gmt_changed>2016-10-08 02:54:23</gmt_changed>      </item>          <item>          <nid>507291</nid>          <type>image</type>          <title><![CDATA[Rescue Robot researchers]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[rescue-robot1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/rescue-robot1_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/rescue-robot1_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/rescue-robot1_0.jpg?itok=IjLm_zbU]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Rescue Robot researchers]]></image_alt>                    <created>1456765200</created>          <gmt_created>2016-02-29 17:00:00</gmt_created>          <changed>1475895268</changed>          <gmt_changed>2016-10-08 02:54:28</gmt_changed>      </item>          <item>          <nid>507311</nid>          <type>image</type>          <title><![CDATA[Trusting a Rescue Robot4]]></title>          
<body><![CDATA[]]></body>                      <image_name><![CDATA[rescue-robot8.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/rescue-robot8_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/rescue-robot8_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/rescue-robot8_0.jpg?itok=xURWBger]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Trusting a Rescue Robot4]]></image_alt>                    <created>1456765200</created>          <gmt_created>2016-02-29 17:00:00</gmt_created>          <changed>1475895268</changed>          <gmt_changed>2016-10-08 02:54:28</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="825"><![CDATA[Ayanna Howard]]></keyword>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="78841"><![CDATA[human-robot interaction]]></keyword>          <keyword tid="110751"><![CDATA[rescue robot]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>      </keywords>  <core_research_areas>          <term tid="39481"><![CDATA[National Security]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="503171">  <title><![CDATA[Zyrobotics wins $750K National Science Foundation grant]]></title>  <uid>28137</uid>  <body><![CDATA[<p>The National Science Foundation (NSF) awarded Zyrobotics a $750,000 Small Business Innovation Research (SBIR) Phase II grant that continues the startup’s work in developing an accessible educational platform for children with special needs.</p><p>Launched in September 2013 by Ayanna Howard, the&nbsp;Linda J. and Mark C. Smith Chair professor in the Georgia Institute of Technology’s School of Electrical and Computer Engineering, the company is commercializing assistive technology that enables children with limited mobility to operate tablet computers, smartphones, toys, gaming apps, and interactive robots.</p><p>“We are extremely excited about the opportunities that this NSF SBIR grant provides,” said Howard, who is the company’s chief technology officer. “It helps Zyrobotics to continue to evolve as a leader in inclusive smart mobile technologies by enhancing our ability to develop accessible learning systems that&nbsp;engage and empower children with special needs and enhance their quality of life.”</p><p>Specifically, the Phase II project focuses on developing an accessible educational platform that combines mobile interfaces and adaptive educational tablet applications (apps) to support the requirements of children with special needs.
Tablet devices have given those children an interactive experience that has revolutionized their learning. In its proposal, however, Zyrobotics notes that while tablets are intuitive and easy for many children to use, those with disabilities are largely overlooked because of the difficulty they have performing pinch-and-swipe gestures.</p><p>“This project thus addresses a direct need in our society by providing an integrated educational experience, focused on math education that addresses the diverse needs of children, while providing a solution for variations found in their disabilities,” the company wrote in its grant proposal. “This SBIR Phase II project addresses an unmet need by developing an innovative solution to enable children with motor disabilities access to mobile devices and apps that could engage them fully into the educational system.”</p><p>In this next phase, Howard and her team plan to design accessible math apps geared to children with or without disabilities in kindergarten through 12th grade. The company also plans to&nbsp;design another set of apps that adapt educational content and provide feedback to parents and teachers based on real-time analytics.</p><p>The company says it sees ample market opportunity for its products both domestically and abroad. Here in the United States, children with disabilities are entitled to a free and appropriate public education, and Zyrobotics sees its products as addressing that need from both a commercial and societal standpoint. Worldwide, more than&nbsp;93 million children live with a disability.</p><p>When founded, the company went through Georgia Tech’s&nbsp;VentureLab&nbsp;startup incubator, ranked No. 2 in North America. VentureLab, a unit of Tech’s Enterprise Innovation Institute (EI<sup>2</sup>), works with Georgia Tech faculty, students, and staff to help them validate their research and ideas and commercialize them into viable companies.</p><p>Zyrobotics is now part of Tech’s Advanced Technology Development Center (ATDC), a sister startup incubator program that serves all of Georgia. Zyrobotics, with the help of ATDC’s SBIR program, received its Phase I award in 2015, laying the groundwork for the Phase II grant.</p><p>“Zyrobotics is a wonderful Georgia Tech startup, based on the fine research in Dr. Howard’s lab, and enhanced by a very successful journey through the NSF I-Corps program,” said Keith McGreggor, VentureLab’s director. “This is a great example of how the research done in the classroom and lab, followed by idea validation, can lead to real breakthroughs that are designed to have a lasting impact on the lives touched by the technologies that Dr. Howard has created.”</p><p>— Péralte C.
Paul</p>]]></body>  <author>Péralte Paul</author>  <status>1</status>  <created>1455815270</created>  <gmt_created>2016-02-18 17:07:50</gmt_created>  <changed>1475896849</changed>  <gmt_changed>2016-10-08 03:20:49</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Focus is continued development of accessible education platforms for children with special needs.]]></teaser>  <type>news</type>  <sentence><![CDATA[Focus is continued development of accessible education platforms for children with special needs.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2016-02-18T00:00:00-05:00</dateline>  <iso_dateline>2016-02-18T00:00:00-05:00</iso_dateline>  <gmt_dateline>2016-02-18 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[peralte.paul@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Laura Diamond</p><p>Georgia Tech Media Relations&nbsp;</p><p><a href="mailto:laura.diamond@gatech.edu">laura.diamond@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>313961</item>      </media>  <hg_media>          <item>          <nid>313961</nid>          <type>image</type>          <title><![CDATA[Ayanna Howard]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ayannahoward131021br295_web.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ayannahoward131021br295_web_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ayannahoward131021br295_web_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ayannahoward131021br295_web_0.jpg?itok=JCrhrR_w]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ayanna Howard]]></image_alt>                    <created>1449244929</created>          <gmt_created>2015-12-04 16:02:09</gmt_created>          <changed>1475895022</changed>          <gmt_changed>2016-10-08 02:50:22</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="139"><![CDATA[Business]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="139"><![CDATA[Business]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="4238"><![CDATA[atdc]]></keyword>          <keyword tid="825"><![CDATA[Ayanna Howard]]></keyword>          <keyword tid="363"><![CDATA[NSF]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="167833"><![CDATA[SBIR]]></keyword>          <keyword tid="4193"><![CDATA[venturelab]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="106361"><![CDATA[Business and Economic Development]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="497321">  <title><![CDATA[Six Finalists Competing for InVenture Prize]]></title>  <uid>27918</uid>  <body><![CDATA[<p>Finalists competing for the 2016 InVenture Prize have invented devices to 
protect firefighters, give children safe drinking water, and teach us how to play “Stairway to Heaven” on guitar.</p><p>Georgia Tech’s InVenture Prize competition is designed to encourage and support undergraduate students’ interest in innovation and entrepreneurship. Once again, more than 500 students signed up for the competition.</p><p>This year’s six finalist teams have invented ways to make our lives safer, healthier, and a bit more fun. The teams are:</p><p><strong>FireHUD</strong>: A display and data monitor that tracks and shows real-time information to firefighters in hazardous conditions. The goal is to decrease the level of uncertainty firefighters face.</p><p>Inventors: Zachary Braun, computer engineering; and Tyler Sisk, electrical engineering.</p><p><strong>FretWizard</strong>: A virtual guitar teacher for students at varying levels. The inventors designed the site to give people a simpler and more intuitive way to learn how to play songs on the guitar.</p><p>Inventors: Ali Abid, computer science; and Molly Ricks, international affairs.</p><p><strong>RoboGoalie</strong>: An automatic retrieval device that collects a soccer ball and launches it back to the player. Similar to a batting cage, this device gives soccer players the flexibility of practicing alone.</p><p>Inventors (all mechanical engineering majors): Siu Lun Chan, Ming Him Ko, Zhifeng Su, and Timothy Woo.</p><p><strong>TEQ</strong> <strong>Charging</strong>: A power management system for electric vehicle chargers. The technology and design lower the cost of installing current charge stations and&nbsp;increase efficiency&nbsp;by sequentially charging vehicles.</p><p>Inventors: Dorrier Coleman, computer engineering; Mitchell Kelman, computer science; Joshua Lieberman, mechanical engineering; and Isaac Wittenstein, mechanical engineering.</p><p><strong>TruePani</strong>: A household sanitation solution consisting of a passive antimicrobial cup and water storage device that kills harmful microbes in drinking water. This invention was designed for children in rural India who are most affected by waterborne illnesses, but it also could be used in underserved communities worldwide.</p><p>Inventors: Samantha Becker, civil engineering; Sarah Lynn Bowen, business administration; Naomi Ergun, business administration; and Shannon Evanchec, environmental engineering.</p><p><strong>Wobble</strong>: A device to test a person’s reactive balance. It works like a mechanical bull in that it spins and tilts. It can be programmed to different levels of difficulty, which makes it useful for determining return-to-play protocols for athletes who have suffered a concussion and also for evaluating the risk of falling for elderly patients.</p><p>Inventors: Hailey Brown, mechanical engineering; Matthew Devlin, biomedical engineering; Ana Gomez del Campo, biomedical engineering; and Garrett Wallace, biomedical engineering.</p><p>The winning team scores $20,000, and the second-place team receives $10,000.</p><p>Both first- and second-place finishers will receive free U.S. patent filings by Georgia Tech’s Office of Technology Licensing and a spot in Georgia Tech’s startup accelerator program, Flashpoint.</p><p>A $5,000 People’s Choice Award will go to the fans’ favorite invention. Voting will be by text messaging during the finale.</p><p>The finale will take place March 16 at the Ferst Center for the Arts.
Tickets are free and can be requested <a href="http://inventureprize.gatech.edu/inventure-prize-ticket-request-form">here</a>.</p><p>The event will also be aired live on Georgia Public Broadcasting.&nbsp;</p>]]></body>  <author>Laura Diamond</author>  <status>1</status>  <created>1455022316</created>  <gmt_created>2016-02-09 12:51:56</gmt_created>  <changed>1475896838</changed>  <gmt_changed>2016-10-08 03:20:38</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Winners of the annual Georgia Tech contest will be announced March 16]]></teaser>  <type>news</type>  <sentence><![CDATA[Winners of the annual Georgia Tech contest will be announced March 16]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2016-02-10T00:00:00-05:00</dateline>  <iso_dateline>2016-02-10T00:00:00-05:00</iso_dateline>  <gmt_dateline>2016-02-10 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[laura.diamond@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Laura Diamond&nbsp;<br />Georgia Tech Media Relations<br />404-894-6016</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>47390</item>          <item>497161</item>          <item>497171</item>          <item>497221</item>          <item>497251</item>          <item>497201</item>          <item>497271</item>      </media>  <hg_media>          <item>          <nid>47390</nid>          <type>image</type>          <title><![CDATA[InVenture Prize Logo]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tne92353.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tne92353.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tne92353.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tne92353.jpg?itok=ee0HHD7m]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[InVenture Prize Logo]]></image_alt>                    <created>1449175107</created>          <gmt_created>2015-12-03 20:38:27</gmt_created>          <changed>1475894442</changed>          <gmt_changed>2016-10-08 02:40:42</gmt_changed>      </item>          <item>          <nid>497161</nid>          <type>image</type>          <title><![CDATA[FireHUD]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[firehud.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/firehud_0.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/firehud_0.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/firehud_0.png?itok=BoUTa1Eu]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[FireHUD]]></image_alt>                    <created>1455120000</created>          <gmt_created>2016-02-10 16:00:00</gmt_created>          <changed>1475895256</changed>          <gmt_changed>2016-10-08 02:54:16</gmt_changed>      </item>          <item>          <nid>497171</nid>          <type>image</type>          <title><![CDATA[FretWizard]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[fretwizard.png]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/fretwizard_0.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/fretwizard_0.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/fretwizard_0.png?itok=eOqSGYdg]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[FretWizard]]></image_alt>                    <created>1455120000</created>          <gmt_created>2016-02-10 16:00:00</gmt_created>          <changed>1475895256</changed>          <gmt_changed>2016-10-08 02:54:16</gmt_changed>      </item>          <item>          <nid>497221</nid>          <type>image</type>          <title><![CDATA[RoboGoalie]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[robogoalie.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/robogoalie_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/robogoalie_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/robogoalie_0.jpg?itok=darIXNtj]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[RoboGoalie]]></image_alt>                    <created>1455120000</created>          <gmt_created>2016-02-10 16:00:00</gmt_created>          <changed>1475895256</changed>          <gmt_changed>2016-10-08 02:54:16</gmt_changed>      </item>          <item>          <nid>497251</nid>          <type>image</type>          <title><![CDATA[TEQ Charging - InVenture Prize finalist]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[teq_charging_system_0.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/teq_charging_system_0_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/teq_charging_system_0_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/teq_charging_system_0_0.jpg?itok=9aZ7M4LM]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[TEQ Charging - InVenture Prize finalist]]></image_alt>                    <created>1455120000</created>          <gmt_created>2016-02-10 16:00:00</gmt_created>          <changed>1475895256</changed>          <gmt_changed>2016-10-08 02:54:16</gmt_changed>      </item>          <item>          <nid>497201</nid>          <type>image</type>          <title><![CDATA[TruePani]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[purepahni_composite_1.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/purepahni_composite_1.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/purepahni_composite_1.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/purepahni_composite_1.png?itok=LyZaggba]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[TruePani]]></image_alt>                    <created>1455120000</created>          <gmt_created>2016-02-10 16:00:00</gmt_created>          <changed>1475895256</changed>    
      <gmt_changed>2016-10-08 02:54:16</gmt_changed>      </item>          <item>          <nid>497271</nid>          <type>image</type>          <title><![CDATA[Wobble]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[wolbull_tilt.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/wolbull_tilt.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/wolbull_tilt.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/wolbull_tilt.jpg?itok=9sMBQ87z]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Wobble]]></image_alt>                    <created>1455120000</created>          <gmt_created>2016-02-10 16:00:00</gmt_created>          <changed>1475895256</changed>          <gmt_changed>2016-10-08 02:54:16</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://inventureprize.gatech.edu/]]></url>        <title><![CDATA[The InVenture Prize web site]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="139"><![CDATA[Business]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>          <category tid="144"><![CDATA[Energy]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="148"><![CDATA[Music and Music Technology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="139"><![CDATA[Business]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>          <term tid="144"><![CDATA[Energy]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="148"><![CDATA[Music and Music Technology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="3472"><![CDATA[entrepreneurship]]></keyword>          <keyword tid="341"><![CDATA[innovation]]></keyword>          <keyword tid="453"><![CDATA[undergraduate research]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39531"><![CDATA[Energy and Sustainable Infrastructure]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39491"><![CDATA[Renewable Bioproducts]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="106361"><![CDATA[Business and Economic Development]]></topic>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>          <topic tid="71891"><![CDATA[Health and Medicine]]></topic>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>          <topic tid="71901"><![CDATA[Society and Culture]]></topic>      </news_room_topics>  <files></files>  <related></related>  
<userdata><![CDATA[]]></userdata></node><node id="468081">  <title><![CDATA[Collaboration with CNN Investigates Use of UAVs for Newsgathering]]></title>  <uid>27303</uid>  <body><![CDATA[<p>In June 2014, the Georgia Tech Research Institute (GTRI) and CNN launched a joint research initiative to study the use of unmanned aerial vehicles (UAVs) for newsgathering. In January 2015, CNN signed an agreement with the Federal Aviation Administration (FAA) to share the results of the research. The project is now gaining momentum as researchers shift their focus from evaluating UAV equipment to developing potential protocols for safe operations.</p><p>The issue: Hobbyists can fly drones without FAA oversight as long as the aircraft weighs 55 pounds or less, flies in unpopulated areas, and remains within line of sight of the operator. Yet flying drones for commercial purposes requires review and approval by the FAA. The only way to get a thumbs-up from the FAA is to pursue airworthiness certification (an expensive and complicated process that can take up to a year) or to secure a “Section 333 exemption.”</p><p>A Section 333 exemption allows the FAA to waive the airworthiness requirement as long as the commercial UAV flights are conducted under a number of restrictions. Among these restrictions: Drone operators must notify local aviation authorities two or three days prior to flight — and operations over people or near airports are off-limits.</p><p>“Securing a 333 exemption is doable for the movie industry since obtaining aerial footage can be planned far in advance,” observed Mike Heiges, a <a href="http://www.gtri.gatech.edu/">GTRI</a> principal research engineer who leads the CNN project. “Yet journalists can’t operate under these rules for breaking news and chaotic situations where there may be emergency responders, police helicopters, or the National Guard.”</p><p>Granted, drones aren’t needed for every news story, but they provide a unique perspective in many situations, said Greg Agvent, senior director of news operations for CNN/US.</p><p>“Being able to fly over an area after an earthquake or tornado hits would provide a deeper understanding of how widespread the devastation is,” Agvent explained, pointing to the May 12 Amtrak train derailment in Philadelphia. “Part of the issue with the accident was the speed going into the curve. The ability to get footage from 200 feet in the air would have presented a better sense of the curve — context that you simply couldn’t get from the ground.”</p><p>Safety of news personnel is another benefit of drone journalism, Agvent added. “In many cases, such as a flood, safety would trump context. We could capture footage of an event without putting our people in harm’s way.”</p><p>Some of the research that comes out of the project will be helpful beyond newsgathering, observed Dave Price, a GTRI senior research technologist working on the project. “Commercial drones are of interest for crop monitoring and inspection of bridges and railroad tracks,” he explained. “Railroads and agriculture agencies will be able to see the results of CNN’s camera selection and stabilization systems and take advantage of this for their own applications.”</p><p><strong>The Right Stuff</strong></p><p>During the past year, the researchers, including GTRI and CNN staff, have been investigating different UAVs that could carry the type of camera systems journalists need to shoot and transmit aerial footage.</p><p>That’s easier said than done. 
For one thing, the commercial drone industry is in its infancy. Manufacturers come and go, and few have a long track record. Another challenge is finding the right equipment — airframes and payloads that match up. “It’s a trade-off,” Heiges explained. “You have to factor in size, weight, and power of what you want to put on the aircraft with what the aircraft can carry.”</p><p>Flight times for many commercial drones aren’t long enough for CNN’s purposes, nor is video quality high enough. “To install a better camera, you need a bigger vehicle for endurance,” Heiges said. “And that means stepping up to UAVs that were developed for the military, which dramatically increases price.”</p><p>GTRI has been testing drones since 2006 through the FAA’s certificate of authorization process, which enables public institutions to operate drones in national airspace for research purposes. Currently, GTRI holds 28 certificates of authorization for specific locations in five states. For the project with CNN, GTRI provides pilots to fly the drones in approved areas, plans the flight tests with CNN’s participation, collects data, and prepares reports with recommendations.</p><p>One of CNN’s takeaways from the flight tests: Drone journalism is no one-person show. “In most cases, especially for live video, you need three people,” Agvent said. This includes a pilot to guide the actions of the UAV and an operator for the camera, which is usually suspended under the drone and sits on gimbals for stabilization.</p><p>“The third person, a spotter, is particularly important in urban areas,” Agvent continued. “The spotter focuses solely on situational awareness and communicates to the pilot about people and other aircraft that may be in the area. In some cases, you could get by with a two-person team — a pilot/cameraman and a spotter — but a trio is best to ensure both high quality and safety.”</p><p><strong>Advancing to Operational Protocols</strong></p><p>“We’ve hit a lot of milestones in the past year,” Agvent said. “Now, we begin to work on the finer points of flight operations and coordinating with air traffic control.”</p><p>One of the FAA’s chief concerns with drones is getting the word out to manned aircraft about a UAV’s presence in the area. The current practice is to file a “notice to airmen” two or three days in advance.</p><p>A new technology known as automatic dependent surveillance-broadcast (ADS-B) could provide a just-in-time alternative to the notice to airmen. Developed by the FAA, this technology enables aircraft to broadcast their GPS coordinates to anyone in the local airspace that has ADS-B, and vice versa, so the drone operator would be able to see other aircraft.</p><p>“It’s like having an air traffic radar map inside your cockpit,” Heiges said. “Even better, unlike conventional radar, ADS-B works all the way to the ground.” That’s important because, in some situations, journalists may need to cooperate with police helicopters or medical aircraft flying at low altitudes to pick up patients.</p><p>Geo-fencing technologies, which prevent UAVs from entering airport and other restricted areas, could add another layer of safety, Heiges added.</p>
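<p>To make the geo-fencing idea concrete, the minimal sketch below checks a drone’s GPS fix against circular no-fly zones. It is a hypothetical illustration, not code from the GTRI/CNN project; the zone coordinates, radius, and function names are assumptions made for the example.</p><pre><code># Hypothetical geofence check -- illustrative only, not project code.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

# Assumed no-fly zones: (center_lat, center_lon, radius_m).
NO_FLY_ZONES = [
    (33.6407, -84.4277, 8_000),  # illustrative circle around an airport
]

def violates_geofence(lat, lon):
    """Return True if a GPS fix falls inside any restricted circle."""
    return any(haversine_m(lat, lon, zlat, zlon) <= radius
               for zlat, zlon, radius in NO_FLY_ZONES)

# A flight controller could run this on every GPS update and refuse
# commands that would carry the aircraft into a restricted area.
print(violates_geofence(33.7490, -84.3880))  # False: well outside the circle</code></pre><p>A production geofence would use polygonal boundaries, altitude limits, and authoritative airspace data rather than hand-entered circles, but the core test of whether a position lies inside a restricted region is the same.</p>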
<p>Because FAA rules prohibit drones from flying over people, crowd-control issues must also be resolved. For example, are journalists responsible for blocking off the area where they wish to fly drones — or do they communicate with on-scene commanders to find out where they can operate?</p><p>Over the next few months, GTRI and CNN will meet with regional emergency responders and other stakeholders to address these questions and develop an operational framework. Then GTRI will work with law enforcement agencies to test the procedures at remote locations. “We’ll hold mock trials and simulate circumstances that would happen in a breaking news situation,” Heiges explained.</p><p>Creating appropriate regulations for various types of UAV flights is important, as the flight landscape has changed dramatically in recent years.</p><p>“When people built radio-controlled airplanes out of balsa wood, they learned the rules for flying and flew aircraft at sanctioned sites,” Heiges said. “Yet in the past few years, we now have multi-rotors and quad-rotors with automatic stabilization that don’t require the same skills. People are flying them out of the box without knowing the rules. That can be dangerous if flown beyond visual range. Any significant accident will set back the industry, punishing those who do follow the rules.”</p><p>Even small drones could cause a helicopter or aircraft to go down if one gets caught in a propeller or pulled into an engine. Indeed, drones have been in the news this past summer for interfering with firefighting efforts in California, including a San Bernardino wildfire where drones operated by curious hobbyists caused fire pilots to pull out of the fray for 30 minutes, allowing the fire to spread.</p><p>“The one thing that doesn’t get talked about enough is the differentiation between hobbyists and commercial drone users — and that most of the problems are caused by laymen,” said Agvent. “Our goal is to create a framework that allows for safe integration of commercial drones for newsgathering. It’s about having trusted vendors, trusted aircraft, and trusted procedures in place to act in a safe manner.”</p><p><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia 30332-0181 USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986) (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>)</p><p><strong>Writer</strong>: T.J. Becker</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1447150519</created>  <gmt_created>2015-11-10 10:15:19</gmt_created>  <changed>1475896798</changed>  <gmt_changed>2016-10-08 03:19:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers from the Georgia Tech Research Institute have been working with CNN to investigate the use of UAVs in newsgathering.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers from the Georgia Tech Research Institute have been working with CNN to investigate the use of UAVs in newsgathering.]]></sentence>  <summary><![CDATA[<p>In June 2014, the Georgia Tech Research Institute (GTRI) and CNN launched a joint research initiative to study the use of unmanned aerial vehicles (UAVs) for newsgathering. In January 2015, CNN signed an agreement with the Federal Aviation Administration (FAA) to share the results of the research. 
The project is now gaining momentum as researchers shift their focus from evaluating UAV equipment to developing potential protocols for safe operations.</p>]]></summary>  <dateline>2015-11-10T00:00:00-05:00</dateline>  <iso_dateline>2015-11-10T00:00:00-05:00</iso_dateline>  <gmt_dateline>2015-11-10 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>468031</item>          <item>468041</item>      </media>  <hg_media>          <item>          <nid>468031</nid>          <type>image</type>          <title><![CDATA[UAV in CNN World Headquarters]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[cnn-gtri-003.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/cnn-gtri-003_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/cnn-gtri-003_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/cnn-gtri-003_0.jpg?itok=QRfCUpCk]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[UAV in CNN World Headquarters]]></image_alt>                    <created>1449257147</created>          <gmt_created>2015-12-04 19:25:47</gmt_created>          <changed>1475895216</changed>          <gmt_changed>2016-10-08 02:53:36</gmt_changed>      </item>          <item>          <nid>468041</nid>          <type>image</type>          <title><![CDATA[UAV in CNN World Headquarters]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[cnn-gtri-002.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/cnn-gtri-002_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/cnn-gtri-002_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/cnn-gtri-002_0.jpg?itok=mpoL1TkW]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[UAV in CNN World Headquarters]]></image_alt>                    <created>1449257147</created>          <gmt_created>2015-12-04 19:25:47</gmt_created>          <changed>1475895216</changed>          <gmt_changed>2016-10-08 02:53:36</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="139"><![CDATA[Business]]></category>          <category tid="143"><![CDATA[Digital Media and Entertainment]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="139"><![CDATA[Business]]></term>          <term tid="143"><![CDATA[Digital Media and Entertainment]]></term>          <term 
tid="147"><![CDATA[Military Technology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="496"><![CDATA[CNN]]></keyword>          <keyword tid="4341"><![CDATA[FAA]]></keyword>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="3245"><![CDATA[News]]></keyword>          <keyword tid="147341"><![CDATA[newsgathering]]></keyword>          <keyword tid="1500"><![CDATA[UAV]]></keyword>          <keyword tid="3249"><![CDATA[unmanned aerial vehicle]]></keyword>      </keywords>  <core_research_areas>          <term tid="39481"><![CDATA[National Security]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71901"><![CDATA[Society and Culture]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="457941">  <title><![CDATA[A Light Touch May Help Animals and Robots Move on Sand and Snow]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Having a light touch can make a hefty difference in how well animals and robots move across challenging granular surfaces such as snow, sand and leaf litter. Research reported October 9 in the journal <em>Bioinspiration &amp; Biomimetics</em> shows how the design of appendages – whether legs or wheels – affects the ability of both robots and animals to cross weak and flowing surfaces.</p><p>Using an air fluidized bed trackway filled with poppy seeds or glass spheres, researchers at the Georgia Institute of Technology systematically varied the stiffness of the ground to mimic everything from hard-packed sand to powdery snow. By studying how running lizards, geckos, crabs – and a robot – moved through these varying surfaces, they were able to correlate variables such as appendage design with performance across the range of surfaces.</p><p>The key measure turned out to be how far legs or wheels penetrated into the surface. What the scientists learned from this systematic study might help future robots avoid getting stuck in loose soil on some distant planet.</p><p>“You need to know systematically how ground properties affect your performance with wheel shape or leg shape, so you can rationally predict how well your robot will be able to move on the surfaces where you have to travel,” said <a href="https://www.physics.gatech.edu/user/daniel-goldman">Dan Goldman</a>, a professor in the Georgia Tech <a href="http://www.physics.gatech.edu">School of Physics</a>. “When the ground gets weak, certain animals seem to still be able to move around independently of the surface properties. We want to understand why.”</p><p>The research was supported by National Science Foundation, Army Research Laboratory and Burroughs Wellcome Fund.</p><p>For years, Goldman and colleagues have been using trackways filled with granular media to study the locomotion of animals and robots, but in the past, they had used fluidized bed only to set the initial compaction of the media. 
In this study, however, they used variations in continuous air flow – introduced through the bottom of the device – to vary the substrate’s resistance to penetration by a leg or wheel.</p><p>Goldman compares the trackway to the wind tunnels used for aerodynamic studies.</p><p>“By varying the air flow, we can create ground that is very, very weak – so that you sink into it quite easily, like powdery snow, and we can have ground that is very strong, like sand,” he explained. “This gives us the ability to study the mechanism by which animals and robots either succeed or fail.”</p><p>Using a bio-inspired hexapedal robot known as Sandbot as a physical model, the researchers studied average forward speed as a function of ground penetration resistance – the “stiffness” of the sand – and the frequency of leg movement. The average speed of the robot declined as increased air flow through the trackway made the surface weaker. Increasing the leg frequency made the speed decrease more rapidly with increasing air flow.</p><p>The five animals – with different body plans and appendage features – all did better than the robot, with the best performer being a lizard collected in a California desert. The speed of the <em>C. draconoides</em> wasn’t slowed at all as the surface became easier to penetrate, while other animals saw performance losses of between 20 and 50 percent on the loosening surfaces.</p><p>“We think that this particular lizard is well suited to the variety of terrain because it has these ridiculously long feet and toes,” Goldman said. “These feet and toes really enable it to maintain high performance and reduce its penetration into the surface over a wide range of substrate conditions. On the other hand, we see animals like ghost crabs that experience a tremendous loss of performance as the substrate changes, something that was surprising to us.”</p><p>The robot lost 70 percent of its speed even with wheels designed to lighten its pressure on the surface.</p><p>Skiers and beachcombers can certainly understand why. As the surface becomes easier for a ski or foot to penetrate, more energy is required to move and forward progress slows. Humans haven’t evolved solutions to that problem, but desert-dwelling creatures have. The research, Goldman says, will help us understand how they do it.</p><p>“The magic for us is how the animals are so good at this,” he said. “There’s a clear practical application to this. If you can get the controls and morphology right, you could have a robot that could move anywhere, but you have to know what you are doing under different conditions.”</p><p>As part of the research, Georgia Tech graduate students Feifei Qian and Tingnan Zhang used a terradynamics approach based on resistive force theory to perform numerical simulations of the robots and animals. They found that their model successfully predicted locomotor performance for low-resistance granular states.</p>
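<p>The heart of that approach is superposition: the force on a limb intruding into granular material is modeled as the sum of independent resistive forces on small surface elements, with stress growing roughly linearly with depth. The sketch below is a much-simplified illustration of that idea, not the authors’ terradynamics code; real resistive force theory uses empirically measured, orientation-dependent stress relations, and the linear stress law, the constant K_GRANULAR, and the function name here are assumptions made for the example.</p><pre><code># Simplified resistive-force sketch -- not the authors' terradynamics code.
# The linear stress law and K_GRANULAR are assumptions for illustration.
import numpy as np

K_GRANULAR = 2.0e5  # assumed penetration resistance, N/m^2 per meter of depth

def vertical_force_on_plate(width_m, length_m, depth_m, n_elements=100):
    """Approximate vertical resistive force on a flat, horizontal plate.

    The plate is divided into elements; each element feels a stress
    proportional to its depth, and the elementwise forces are summed --
    the superposition idea at the core of resistive force theory.
    """
    dA = (width_m * length_m) / n_elements  # element area, m^2
    depths = np.full(n_elements, depth_m)   # flat plate: uniform depth
    stresses = K_GRANULAR * depths          # linear-in-depth stress law
    return float(np.sum(stresses * dA))     # total supporting force, N

print(vertical_force_on_plate(0.02, 0.04, 0.01))  # ~1.6 N for a 2 x 4 cm foot at 1 cm</code></pre><p>Lowering K_GRANULAR plays the role of fluidizing the bed: with less stress available at a given depth, a foot must sink farther before it can support the robot’s weight, the regime in which the animals outperformed Sandbot.</p>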
<p>“This work expands the general applicability of our resistive force theory of terradynamics,” said Goldman. “The resistive force theory, which allows us to compute forces on limbs intruding into the ground, continues to work even in situations where we didn’t think it would work. It expands the applicability of terradynamics to even weaker states of material.”</p><p>In addition to those already mentioned, co-authors include Wyatt Korff from the Howard Hughes Medical Institute in Virginia, Paul Umbanhowar from Northwestern University, and Robert Full from the University of California at Berkeley.</p><p><em>This research was supported by the Burroughs Wellcome Fund and by the Army Research Laboratory (ARL) Micro Autonomous Systems and Technology (MAST) Collaborative Technology Alliance (CTA) under cooperative agreement number W911NF-08-2-0004, and by the National Science Foundation Physics of Living Systems CAREER and Student Research Network and ARO Grant No. W911NF-11-1-0514. Any conclusions or opinions expressed are those of the authors and do not necessarily reflect the official views of the sponsoring agencies.</em></p><p>CITATION: Feifei Qian, et al., “Principles of appendage design in robots and animals determining terradynamic performance on flowable ground,” (Bioinspiration &amp; Biomimetics, 2015). <a href="http://dx.doi.org/10.1088/1748-3190/10/5/056014">http://dx.doi.org/10.1088/1748-3190/10/5/056014</a></p><p><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia 30332-0181 USA</strong></p><p><strong>Media Relations Contact</strong>: John Toon (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>) (404-894-6986).<br /><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1444507913</created>  <gmt_created>2015-10-10 20:11:53</gmt_created>  <changed>1475896783</changed>  <gmt_changed>2016-10-08 03:19:43</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A light touch can make a hefty difference in how well animals and robots move across challenging granular surfaces.]]></teaser>  <type>news</type>  <sentence><![CDATA[A light touch can make a hefty difference in how well animals and robots move across challenging granular surfaces.]]></sentence>  <summary><![CDATA[<p>Having a light touch can make a hefty difference in how well animals and robots move across challenging granular surfaces such as snow, sand and leaf litter. 
Research shows how the design of appendages – whether legs or wheels – affects the ability of both robots and animals to cross weak and flowing surfaces.</p>]]></summary>  <dateline>2015-10-10T00:00:00-04:00</dateline>  <iso_dateline>2015-10-10T00:00:00-04:00</iso_dateline>  <gmt_dateline>2015-10-10 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>457871</item>          <item>457891</item>          <item>457901</item>          <item>457911</item>          <item>457931</item>      </media>  <hg_media>          <item>          <nid>457871</nid>          <type>image</type>          <title><![CDATA[Sandbot robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[appendage-design2171.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/appendage-design2171_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/appendage-design2171_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/appendage-design2171_1.jpg?itok=7dZgvSXo]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sandbot robot]]></image_alt>                    <created>1449256347</created>          <gmt_created>2015-12-04 19:12:27</gmt_created>          <changed>1475895202</changed>          <gmt_changed>2016-10-08 02:53:22</gmt_changed>      </item>          <item>          <nid>457891</nid>          <type>image</type>          <title><![CDATA[Preparing Sandbot robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[appendage-design2185.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/appendage-design2185_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/appendage-design2185_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/appendage-design2185_0.jpg?itok=iW3Q9itj]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Preparing Sandbot robot]]></image_alt>                    <created>1449256347</created>          <gmt_created>2015-12-04 19:12:27</gmt_created>          <changed>1475895202</changed>          <gmt_changed>2016-10-08 02:53:22</gmt_changed>      </item>          <item>          <nid>457901</nid>          <type>image</type>          <title><![CDATA[Preparing Sandbot robot2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[appendage-design2179.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/appendage-design2179_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/appendage-design2179_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/appendage-design2179_0.jpg?itok=PVq6518W]]></image_740>            
<image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Preparing Sandbot robot2]]></image_alt>                    <created>1449256347</created>          <gmt_created>2015-12-04 19:12:27</gmt_created>          <changed>1475895202</changed>          <gmt_changed>2016-10-08 02:53:22</gmt_changed>      </item>          <item>          <nid>457911</nid>          <type>image</type>          <title><![CDATA[Sandbot in trackway]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[appendage-design2195.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/appendage-design2195_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/appendage-design2195_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/appendage-design2195_1.jpg?itok=LFKD-gNw]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sandbot in trackway]]></image_alt>                    <created>1449256347</created>          <gmt_created>2015-12-04 19:12:27</gmt_created>          <changed>1475895202</changed>          <gmt_changed>2016-10-08 02:53:22</gmt_changed>      </item>          <item>          <nid>457931</nid>          <type>image</type>          <title><![CDATA[Sandbot closeup]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[appendage-design2224.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/appendage-design2224_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/appendage-design2224_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/appendage-design2224_0.jpg?itok=3VmbwFf6]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sandbot closeup]]></image_alt>                    <created>1449256347</created>          <gmt_created>2015-12-04 19:12:27</gmt_created>          <changed>1475895202</changed>          <gmt_changed>2016-10-08 02:53:22</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="47881"><![CDATA[Dan Goldman]]></keyword>          <keyword tid="144361"><![CDATA[granular surface]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="166937"><![CDATA[School of Physics]]></keyword>          <keyword tid="62221"><![CDATA[terradynamics]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  
<related></related>  <userdata><![CDATA[]]></userdata></node><node id="453061">  <title><![CDATA[Humans on Mars]]></title>  <uid>27828</uid>  <body><![CDATA[<p>Georgia Tech’s researchers are working to make sure humans on Mars aren’t something reserved only for Hollywood. Faculty members are creating the next technologies for future missions, landing locations, and instruments to find life. Their expertise and insight will help guide us all to the next frontier.</p>]]></body>  <author>Melanie Goux</author>  <status>1</status>  <created>1443451301</created>  <gmt_created>2015-09-28 14:41:41</gmt_created>  <changed>1475896780</changed>  <gmt_changed>2016-10-08 03:19:40</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Faculty members are creating the next technologies for future missions, landing locations, and instruments to find life.]]></teaser>  <type>news</type>  <sentence><![CDATA[Faculty members are creating the next technologies for future missions, landing locations, and instruments to find life.]]></sentence>  <summary><![CDATA[<p>Georgia Tech’s researchers are working to make sure humans on Mars aren’t something reserved only for Hollywood. Faculty members are creating the next technologies for future missions, landing locations, and instruments to find life.</p>]]></summary>  <dateline>2015-09-28T00:00:00-04:00</dateline>  <iso_dateline>2015-09-28T00:00:00-04:00</iso_dateline>  <gmt_dateline>2015-09-28 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>453071</item>      </media>  <hg_media>          <item>          <nid>453071</nid>          <type>image</type>          <title><![CDATA[Humans on Mars]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[mars_icon.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/mars_icon_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/mars_icon_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/mars_icon_0.jpg?itok=G342GEEE]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Humans on Mars]]></image_alt>                    <created>1449256297</created>          <gmt_created>2015-12-04 19:11:37</gmt_created>          <changed>1475895197</changed>          <gmt_changed>2016-10-08 02:53:17</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.news.gatech.edu/features/humans-mars]]></url>        <title><![CDATA[Read the full story here:]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="141"><![CDATA[Chemistry and Chemical Engineering]]></category>          <category tid="143"><![CDATA[Digital Media and Entertainment]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category 
tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="141"><![CDATA[Chemistry and Chemical Engineering]]></term>          <term tid="143"><![CDATA[Digital Media and Entertainment]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="143001"><![CDATA[Amanda Stockton]]></keyword>          <keyword tid="30211"><![CDATA[Bobby Braun]]></keyword>          <keyword tid="142991"><![CDATA[Dave Spencer]]></keyword>          <keyword tid="52181"><![CDATA[James Wray]]></keyword>          <keyword tid="11021"><![CDATA[Lisa Yaszek]]></keyword>          <keyword tid="55511"><![CDATA[Mariel Borowitz]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39431"><![CDATA[Data Engineering and Science]]></term>          <term tid="39451"><![CDATA[Electronics and Nanotechnology]]></term>          <term tid="39531"><![CDATA[Energy and Sustainable Infrastructure]]></term>          <term tid="39471"><![CDATA[Materials]]></term>          <term tid="39481"><![CDATA[National Security]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>          <term tid="39541"><![CDATA[Systems]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="449681">  <title><![CDATA[Multitasking Moths]]></title>  <uid>27948</uid>  <body><![CDATA[<p class="p1">It’s difficult enough to see things in the dark, but what if you also had to hover in midair while tracking a flower moving in the wind?</p><p class="p1">That's the challenge the hummingbird-sized hawkmoth&nbsp;must overcome while feeding on the nectar of its favorite flowers.</p><p class="p2">Using high-speed infrared cameras and 3-D-printed robotic flowers, scientists have now learned how this insect juggles these complex sensing and control challenges — all while adjusting to changing light conditions.&nbsp;</p><p class="p2">What the researchers have discovered could help the next generation of small flying robots operate efficiently under a broad range of lighting conditions.&nbsp;</p><p class="p2"><strong>Read more about this fascinating study in the <em>Research Horizons</em> story, <a href="http://www.rh.gatech.edu/features/multitasking-moths">Multitasking Moths</a></strong>.</p>]]></body>  <author>Jennifer Tomasino</author>  <status>1</status>  <created>1442586040</created>  <gmt_created>2015-09-18 14:20:40</gmt_created>  <changed>1475896773</changed>  <gmt_changed>2016-10-08 03:19:33</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[How the hawkmoth tracks flowers in the dark has surprising applications for airborne robots.]]></teaser>  <type>news</type>  <sentence><![CDATA[How the hawkmoth tracks flowers in the dark has surprising applications for airborne robots.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2015-09-18T00:00:00-04:00</dateline>  <iso_dateline>2015-09-18T00:00:00-04:00</iso_dateline>  <gmt_dateline>2015-09-18 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  
<email><![CDATA[john.toon@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Director of Research News<br /><strong>Phone:</strong>&nbsp;404.894.6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>449571</item>          <item>413151</item>      </media>  <hg_media>          <item>          <nid>449571</nid>          <type>image</type>          <title><![CDATA[Hawkmoth]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[multitasking-moths.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/multitasking-moths_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/multitasking-moths_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/multitasking-moths_0.jpg?itok=XesXjLjb]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Hawkmoth]]></image_alt>                    <created>1449256264</created>          <gmt_created>2015-12-04 19:11:04</gmt_created>          <changed>1475895192</changed>          <gmt_changed>2016-10-08 02:53:12</gmt_changed>      </item>          <item>          <nid>413151</nid>          <type>image</type>          <title><![CDATA[Simon Sponberg with hawkmoth]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[hawkmoth12.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/hawkmoth12_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/hawkmoth12_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/hawkmoth12_0.jpg?itok=B-GYC__a]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Simon Sponberg with hawkmoth]]></image_alt>                    <created>1449254222</created>          <gmt_created>2015-12-04 18:37:02</gmt_created>          <changed>1475895145</changed>          <gmt_changed>2016-10-08 02:52:25</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="115461"><![CDATA[Applied Physiology]]></keyword>          <keyword tid="128551"><![CDATA[hawkmoth]]></keyword>          <keyword tid="960"><![CDATA[physics]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71911"><![CDATA[Earth and Environment]]></topic>      </news_room_topics>  <files></files>  <related></related>  
<userdata><![CDATA[]]></userdata></node><node id="393131">  <title><![CDATA[New robotic vehicle provides a never-before-seen look under Antarctica]]></title>  <uid>27902</uid>  <body><![CDATA[<p><em>Editor's note: Icefin's videos from the seafloor can also be downloaded at the Dropbox link at the bottom of the press release.</em></p><p>A first-of-its-kind robotic vehicle recently dove to depths never before visited under Antarctica’s Ross Ice Shelf and brought back video of life on the seafloor.</p><p>A team of scientists and engineers from the Georgia Institute of Technology assembled the unmanned underwater vehicle on Antarctica. They deployed (and retrieved) the vehicle through a 12-inch-diameter hole in 20 meters of ice and down another 500 meters of water to the sea floor.</p><p>The robotic vehicle, called Icefin, carried a scientific payload capable of measuring ocean conditions under the ice. Icefin’s readings of the environment under Antarctica’s ice shelves, and video of the life that thrives in these harsh conditions, will help scientists understand how Antarctica’s ice shelves are changing under warming conditions and how organisms thrive in cold, light-free environments. The technologies developed for Icefin will also help in the search for life on other planets, namely Europa, a moon of Jupiter. Antarctica’s icy oceans are remarkably similar to Europa’s ice-capped oceans.</p><p>“We built a vehicle that’s a hybrid between the really small probes and the ocean-going vessels, and we can deploy it through bore holes on Antarctica,” said <a href="http://schmidt.eas.gatech.edu/">Britney Schmidt</a>, an assistant professor in the School of Earth and Atmospheric Sciences at Georgia Tech, and the principal investigator for the Icefin project. “At the same time, we’re advancing hypotheses that we need for Europa and understanding ocean systems here better. We’re also developing and getting comfortable with technologies that make polar science -- and eventually Europa science -- more realistic.”</p><p>Icefin was deployed as a part of the Sub Ice Marine and Planetary–analog Ecosystem (SIMPLE) program, funded by NASA and supported by NSF, with Schmidt as the principal&nbsp;investigator. The research team returned from Antarctica in December 2014. The team hopes Icefin will make its Arctic debut in summer 2016 and return to Antarctica that fall. (For more images from the mission, visit: <a href="http://bit.ly/1P2hBRx" title="http://bit.ly/1P2hBRx">http://bit.ly/1P2hBRx</a>.)</p><p>At McMurdo Station, Schmidt and a team including Georgia Tech scientists and engineers from the Georgia Tech Research Institute (GTRI), led by principal research engineer <a href="http://www.robotics.gatech.edu/team/faculty/west">Mick West</a>, deployed Icefin to explore the underside of the ice shelves flowing off the continent.</p><p>“What truly separates Icefin from some of the other vehicles is that it’s fairly slender, yet still has all of the sensors that the scientists like Britney need,” West said. “Our vehicle has instrumentation aboard both for navigation and ocean science that other vehicles do not.”</p><p>The Southern Ocean can be as deep as 5,000 meters. Icefin is capable of diving 1,500 meters and can perform three-kilometer-long surveys. 
Previous vehicles in Icefin’s class were rated to a few hundred meters.</p><p>“We saw evidence of a complex community on the sea floor that has never been observed before, and unprecedented detail on the ice-ocean interface that hasn’t been achieved before,” Schmidt said.&nbsp;</p><p>Video captured by Icefin shows eerie footage of an active seafloor 500 meters under the Ross Ice Shelf.</p><p>“Biologists at McMurdo were just amazed at the amount of biology at that location, which included sea stars, sponges and anemones that were at the ocean bottom,” West said. “To have our very first deep-ocean dive happen through a small hole in the ice and go all the way to the ocean bottom and get the video we did was pretty amazing.”&nbsp;</p><p>To get to the bottom, Icefin first had to be built. A partnership between the research-focused GTRI and the academic-focused School of Earth and Atmospheric Sciences (EAS) enabled the team to design, build and deploy Icefin under the ice in less than a year. Traditional design cycles for these types of vehicles typically are two to three years.</p><p>The team had to design for a number of challenges associated with deploying Icefin in such an extreme environment. For example, standard electronics systems are not typically rated to the extreme temperatures found under the Ross Ice Shelf.</p><p>“We had probably 100 contingencies for if something went wrong,” West said. “Through lots of analysis and robust design, we were fortunate not to have to initiate any of them.”</p><p>Once Icefin was assembled, the vehicle was deployed through a bore hole in the ice that was 12 inches in diameter and 20 meters deep. Bore holes are often drilled on Antarctica for ocean moorings and sediment sampling.</p><p>Traditional underwater vehicles deployed on Antarctica are either “roving eyes” because they carry only a camera, or much larger vehicles that are deployed in the water on the edge of the ice shelf. Icefin fills the gap between these two kinds of vehicles: able to be deployed easily by small teams in any environment, yet still able to record the oceanographic information traditionally gathered by much larger vehicles.</p><p>“Icefin is the most capable small vehicle that’s been down there,” Schmidt said. “What’s really rewarding is that at the same time, we were able to involve some outstanding students in the design, build and deployment of the vehicle.”&nbsp;</p><p>Graduate student Anthony Spears and undergraduate Matthew Meister, as well as Georgia Tech <a href="http://vip.gatech.edu/new/">Vertically Integrated Projects (VIP) program</a> participants, were involved in the design of the vehicle. Spears and Meister also played key roles in the field integration and deployment of Icefin, along with EAS postdoctoral fellow Catherine Walker and graduate student Jacob Buffo from Icefin’s science team.</p><p>Icefin carries forward and up/down imaging, sonars, and several other sensors. Icefin is also modular, similar to vehicles used on space missions. Scientists can swap sensors or point them in different directions as needed.</p><p>Traditional GPS does not work under the ice, so Icefin uses a navigation system called SLAM (simultaneous localization and mapping) to triangulate its position based on measuring the range and bearing of features on the seafloor or under the ice.</p>
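<p>The range-and-bearing measurement at the core of that approach can be illustrated in a few lines. The toy sketch below back-projects noisy range and bearing observations of already-mapped seafloor features into a position fix; it is an illustration only, not Icefin’s navigation software, and the landmark positions, noise levels, and known-heading assumption are invented for the example.</p><pre><code># Toy range-and-bearing localization -- an illustration of the measurement
# behind landmark-based SLAM, not Icefin's navigation code.
import numpy as np

rng = np.random.default_rng(0)

landmarks = np.array([[10.0, 5.0], [-4.0, 12.0], [6.0, -8.0]])  # mapped features (m)
true_pose = np.array([1.0, 2.0])  # vehicle position to be recovered
heading = 0.3                     # vehicle heading (rad), assumed known here

# Simulate noisy range and bearing measurements to each mapped feature.
deltas = landmarks - true_pose
ranges = np.linalg.norm(deltas, axis=1) + rng.normal(0.0, 0.05, 3)
bearings = np.arctan2(deltas[:, 1], deltas[:, 0]) - heading + rng.normal(0.0, 0.01, 3)

# Each (range, bearing) pair back-projects to a position estimate:
# pose = landmark - range * [cos(heading + bearing), sin(heading + bearing)]
angles = heading + bearings
offsets = ranges[:, None] * np.column_stack((np.cos(angles), np.sin(angles)))
estimates = landmarks - offsets

print(estimates.mean(axis=0))  # close to the true pose (1.0, 2.0)</code></pre><p>Full SLAM goes a step further: when the features themselves are unmapped, the vehicle jointly estimates the map and its own pose from many such observations as it moves, which is what lets Icefin build its 3D picture of the under-ice environment.</p>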
<p>“Using algorithms such as SLAM allows us to construct a map of the unknown under-ice environment. When you can do that, you can begin to get a 3D picture of what’s going on under the water,” West said.</p><p>The sensors on Icefin are helping scientists understand how the ocean affects properties of the ice, and how the ice affects properties of the ocean. The exchange between ocean and ice is a process that mediates biology, affects the climate system and controls the stability of glaciers.&nbsp;</p><p>“Those are important processes that we can work out here in our backyard at the same time as we’re answering how an ice shell would reflect the ocean chemistry on Europa,” Schmidt said. “The ice shell is built out of the ocean, but how that process works is not well understood.”</p><p>Videos from the seafloor:&nbsp;https://www.dropbox.com/sh/qn2j1q9qf3rqdto/AACn5xE17456hQK43XHj0RBRa?dl=0</p><p>Photos from the mission:&nbsp;https://www.flickr.com/photos/georgiatech/sets/72157650356164390/with/16626135435/</p><p>For more on explorers at Georgia Tech, see the feature story in the spring issue of Research Horizons magazine:&nbsp;http://www.rh.gatech.edu/features/explorers</p><p><em>This research is supported by the Georgia Institute of Technology and the School of Earth and Atmospheric Sciences through Schmidt’s startup funds, and by a partnership with GTRI. Icefin deployed to Antarctica with SIMPLE funded by NASA through grant NNX12AL65G. Deployment was supported by the National Science Foundation (NSF) under project B259. Any conclusions or opinions are those of the authors and do not necessarily represent the official views of the sponsoring agencies.</em></p><p><strong>Research News<br /> Georgia Institute of Technology<br /> 177 North Avenue<br /> Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA<br /> </strong><strong><a href="https://twitter.com/GTResearchNews">@GTResearchNews</a></strong></p><p><strong>Media Relations Contacts</strong>: Brett Israel (<a href="https://twitter.com/btiatl">@btiatl</a>) (404-385-1933) (<a href="mailto:brett.israel@comm.gatech.edu">brett.israel@comm.gatech.edu</a>) or John Toon (404-894-6986) (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>)</p><p><strong>Writer</strong>: Brett Israel</p>]]></body>  <author>Brett Israel</author>  <status>1</status>  <created>1427966147</created>  <gmt_created>2015-04-02 09:15:47</gmt_created>  <changed>1475896678</changed>  <gmt_changed>2016-10-08 03:17:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A first-of-its-kind robotic vehicle recently dove to depths never before visited under Antarctica’s Ross Ice Shelf and brought back video of life on the seafloor.]]></teaser>  <type>news</type>  <sentence><![CDATA[A first-of-its-kind robotic vehicle recently dove to depths never before visited under Antarctica’s Ross Ice Shelf and brought back video of life on the seafloor.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2015-04-02T00:00:00-04:00</dateline>  <iso_dateline>2015-04-02T00:00:00-04:00</iso_dateline>  <gmt_dateline>2015-04-02 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[brett.israel@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Brett Israel</p><p><a href="mailto:brett.israel@comm.gatech.edu">brett.israel@comm.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>393641</item>          <item>393631</item>          <item>393111</item>          <item>393121</item>      </media>  <hg_media>          <item>        
  <nid>393641</nid>          <type>image</type>          <title><![CDATA[Brittle star on the seafloor under the Ross Ice Shelf]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[icefin2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/icefin2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/icefin2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/icefin2.jpg?itok=-WGdoB8C]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Brittle star on the seafloor under the Ross Ice Shelf]]></image_alt>                    <created>1449246332</created>          <gmt_created>2015-12-04 16:25:32</gmt_created>          <changed>1475895110</changed>          <gmt_changed>2016-10-08 02:51:50</gmt_changed>      </item>          <item>          <nid>393631</nid>          <type>image</type>          <title><![CDATA[Icefin spots aquatic life on the seafloor under the Ross Ice Shelf]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[icefin1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/icefin1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/icefin1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/icefin1.jpg?itok=x_XRB3PO]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Icefin spots aquatic life on the seafloor under the Ross Ice Shelf]]></image_alt>                    <created>1449246332</created>          <gmt_created>2015-12-04 16:25:32</gmt_created>          <changed>1475895110</changed>          <gmt_changed>2016-10-08 02:51:50</gmt_changed>      </item>          <item>          <nid>393111</nid>          <type>image</type>          <title><![CDATA[Icefin on the ice]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[16502879721_35a4e1b446_k.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/16502879721_35a4e1b446_k.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/16502879721_35a4e1b446_k.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/16502879721_35a4e1b446_k.jpg?itok=GqqZvxo2]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Icefin on the ice]]></image_alt>                    <created>1449246332</created>          <gmt_created>2015-12-04 16:25:32</gmt_created>          <changed>1475895110</changed>          <gmt_changed>2016-10-08 02:51:50</gmt_changed>      </item>          <item>          <nid>393121</nid>          <type>image</type>          <title><![CDATA[The view under the Ross Ice Shelf]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[16503546982_bd41c81a0d_o.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/16503546982_bd41c81a0d_o.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/16503546982_bd41c81a0d_o.png]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/16503546982_bd41c81a0d_o.png?itok=lX3aJCI-]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[The view under the Ross Ice Shelf]]></image_alt>                    <created>1449246332</created>          <gmt_created>2015-12-04 16:25:32</gmt_created>          <changed>1475895110</changed>          <gmt_changed>2016-10-08 02:51:50</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="2082"><![CDATA[aerospace engineering]]></keyword>          <keyword tid="82391"><![CDATA[Antarctica]]></keyword>          <keyword tid="81291"><![CDATA[Britney Schmidt]]></keyword>          <keyword tid="122051"><![CDATA[icefin]]></keyword>          <keyword tid="122041"><![CDATA[mick west]]></keyword>          <keyword tid="123231"><![CDATA[ross ice shelf]]></keyword>          <keyword tid="122061"><![CDATA[underwater vehicle]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71911"><![CDATA[Earth and Environment]]></topic>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="394231">  <title><![CDATA[Seven Cool Things About Robots]]></title>  <uid>27948</uid>  <body><![CDATA[<p>In honor of National Robotics Week 2015, we've put together a list of seven cool things robots can do (or will be able to do in the near future).</p><h4><a href="http://www.news.gatech.edu/features/7-cool-things-about-robots"><strong>See the full list</strong></a></h4>]]></body>  <author>Jennifer Tomasino</author>  <status>1</status>  <created>1428399014</created>  <gmt_created>2015-04-07 09:30:14</gmt_created>  <changed>1475896678</changed>  <gmt_changed>2016-10-08 03:17:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[In honor of National Robotics Week, we've put together a list of seven cool things robots can do (or will be able to do in the near future).]]></teaser>  <type>news</type>  <sentence><![CDATA[In honor of National Robotics Week, we've put together a list of seven cool things robots can do (or will be able to do in the near future).]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2015-04-07T00:00:00-04:00</dateline>  <iso_dateline>2015-04-07T00:00:00-04:00</iso_dateline>  <gmt_dateline>2015-04-07 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  
<boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>394201</item>      </media>  <hg_media>          <item>          <nid>394201</nid>          <type>image</type>          <title><![CDATA[Seven Cool Things About Robots]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[robots-mercury-thumb.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/robots-mercury-thumb.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/robots-mercury-thumb.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/robots-mercury-thumb.jpg?itok=KYMdk-hM]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Seven Cool Things About Robots]]></image_alt>                    <created>1449246346</created>          <gmt_created>2015-12-04 16:25:46</gmt_created>          <changed>1475895110</changed>          <gmt_changed>2016-10-08 02:51:50</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="123391"><![CDATA[national robotics week 2015]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="348981">  <title><![CDATA[Co-robots Team Up with Humans]]></title>  <uid>27303</uid>  <body><![CDATA[<p class="intro-text">Charlie Kemp is giving robots common sense. And that’s good news for Californian Henry Evans.</p><p>Ten years ago, Evans suffered a stroke that left him with limited mobility. Over the past two years, he’s been working with Kemp, an associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University, to develop and test robots that help him shave, adjust a blanket when he’s cold, and even scratch an annoying itch.</p><p>“We did things with the robots that I never could have imagined,” said Evans, who contacted Kemp after seeing him on a CNN broadcast about health care robots.</p><p>Robots working directly with people – even helping them shave – is both challenging and unusual. Most robots today work in manufacturing facilities where, for safety reasons, they stay far away from humans. But Georgia Tech robotics researchers believe people and robots can accomplish much more by working together – as long as the robots have common sense to know, for instance, how much force humans apply when shaving.</p><p>“A major challenge for health care robots is that they lack so much of the knowledge and experience that people take for granted,” said Kemp. 
“To us, it’s just common sense that everybody has; for robots, it’s a serious impediment.”</p><p>Giving robots common sense is just one milestone on the path to the kinds of collaboration that will be required to meet the needs of a growing population of older persons. Beyond personal care, the benefits of co-robotics are many. To produce better products more efficiently, manufacturing robots will need to team up with humans, each contributing unique abilities. And in defense and homeland security, robots will increasingly have to take on the dangerous jobs, leveraging people’s skills while protecting them from harm.</p><p><a href="http://www.rh.gatech.edu/features/hi-how-can-i-help-you">Read more</a> of this article from Georgia Tech's <em>Research Horizons</em> magazine.</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1416917565</created>  <gmt_created>2014-11-25 12:12:45</gmt_created>  <changed>1475896654</changed>  <gmt_changed>2016-10-08 03:17:34</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Robots are teaming up with humans to perform tasks in manufacturing, health care, national defense and other areas.]]></teaser>  <type>news</type>  <sentence><![CDATA[Robots are teaming up with humans to perform tasks in manufacturing, health care, national defense and other areas.]]></sentence>  <summary><![CDATA[<p>At Georgia Tech, robots are teaming up with humans to perform tasks in manufacturing, health care, national defense and other areas.</p>]]></summary>  <dateline>2014-11-25T00:00:00-05:00</dateline>  <iso_dateline>2014-11-25T00:00:00-05:00</iso_dateline>  <gmt_dateline>2014-11-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>348951</item>          <item>348961</item>          <item>348971</item>      </media>  <hg_media>          <item>          <nid>348951</nid>          <type>image</type>          <title><![CDATA[Swarm robotics - Magnus Egerstedt]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[swarm-robots-cover.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/swarm-robots-cover_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/swarm-robots-cover_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/swarm-robots-cover_0.jpg?itok=dxI75V1J]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Swarm robotics - Magnus Egerstedt]]></image_alt>                    <created>1449245682</created>          <gmt_created>2015-12-04 16:14:42</gmt_created>          <changed>1475895073</changed>          <gmt_changed>2016-10-08 02:51:13</gmt_changed>      </item>          <item>          <nid>348961</nid>          <type>image</type>          <title><![CDATA[Healthcare robotics - Charlie Kemp]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[healthcare-robotics.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/healthcare-robotics_0.jpg]]></image_path>    
        <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/healthcare-robotics_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/healthcare-robotics_0.jpg?itok=2wZDF01x]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Healthcare robotics - Charlie Kemp]]></image_alt>                    <created>1449245682</created>          <gmt_created>2015-12-04 16:14:42</gmt_created>          <changed>1475895073</changed>          <gmt_changed>2016-10-08 02:51:13</gmt_changed>      </item>          <item>          <nid>348971</nid>          <type>image</type>          <title><![CDATA[Tutoring robots - Ayanna Howard]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tutoring-robots-ayanna-howard.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tutoring-robots-ayanna-howard_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tutoring-robots-ayanna-howard_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tutoring-robots-ayanna-howard_0.jpg?itok=RM_PeWfu]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Tutoring robots - Ayanna Howard]]></image_alt>                    <created>1449245682</created>          <gmt_created>2015-12-04 16:14:42</gmt_created>          <changed>1475895073</changed>          <gmt_changed>2016-10-08 02:51:13</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="14647"><![CDATA[healthcare robots]]></keyword>          <keyword tid="78271"><![CDATA[IRIM]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>          <keyword tid="110851"><![CDATA[tutoring robots]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="307751">  <title><![CDATA[Your next opponent in Angry Birds could be a robot]]></title>  <uid>27560</uid>  <body><![CDATA[<p>With the help of a smart tablet and Angry Birds, children can now do something typically reserved for engineers and computer scientists: program a robot to learn new skills. The Georgia Institute of Technology project is designed to serve as a rehabilitation tool and to help kids with disabilities.</p><p>The researchers have paired a small humanoid robot with an Android tablet. <a href="http://youtu.be/wNrHwSfA_lo">Kids teach it how to play Angry Birds</a>, dragging their finger on the tablet to whiz the bird across the screen. In the meantime, the robot watches what happens and records “snapshots” in its memory. 
The machine notices where fingers start and stop, and how the objects on the screen move relative to each other, while constantly keeping an eye on the score to check for signs of success.</p><p><a href="http://youtu.be/HAyvBK3-lNE">When it’s the robot’s turn, it mimics the child’s movements and plays the game</a>. If the bird is a dud and doesn’t cause any damage, the robot shakes its head in disappointment. If the building topples and points increase, the eyes light up and the machine celebrates with a happy sound and dance.</p><p>“The robot is able to learn by watching because it knows how interaction with a tablet app is supposed to work,” said Georgia Tech’s Ayanna Howard, Motorola Foundation Professor in the School of Electrical and Computer Engineering, who is leading the project. “It recognizes that a person touched here and ended there, then deciphers the information that is important and relevant to its progress.”</p><p>The robot analyzes the new information and provides appropriate social responses while changing its play strategy.</p><p>“One way to get robots more quickly into society is to design them to be flexible for end users,” said Hae Won Park, Howard’s postdoctoral fellow working closely on the project. “If a robot is only trained to perform a specific set of tasks and not able to learn and adapt to its owner or surroundings, its usefulness can become extremely limited.”</p><p>That flexibility is one reason Howard and Park see their robot-smart tablet system as a future rehabilitation tool for children with cognitive and motor-skill disabilities. A clinician could program the robot to cater to a child’s needs, such as turn taking or hand-eye coordination tasks, and then send the machine home.</p><p>Another benefit for rehab: parents don’t always have time or enough patience for repetitive rehabilitation sessions. But a robot never gets tired or bored. &nbsp;</p><p>“Imagine that a child’s rehab requires a hundred arm movements to improve precise hand-coordination movements,” said Howard. “He or she must touch and swipe the tablet repeatedly, something that can be boring and monotonous after a while. But if a robotic friend needs help with the game, the child is more likely to take the time to teach it, even if it requires repeating the same instructions over and over again. The person’s desire to help their ‘friend’ can turn a five-minute, bland exercise into a 30-minute session they enjoy.”</p><p>In a new study, Howard and Park asked grade-school children to play Angry Birds with an adult watching nearby. Afterwards, the kids were asked to teach a robot how to play the game. The children spent an average of nine minutes with the game as the adult watched. They played nearly three times as long (26.5 minutes) with the robot. They also interacted considerably more with the robot than with the person. Only 7 percent of their session with the adult included eye contact, gestures and talking. It was nearly 40 percent with the robot.</p><p>The next steps for the Georgia Tech team will include more games for the robot, including Candy Crush and ZyroSky. They will also recruit more children diagnosed with Autism Spectrum Disorder (ASD) and children with motor impairments to interact with the system. Their most recent study included two kids with ASD. Their interaction times with the adult were significantly less than those in the typically developing group. They were about the same with the robot. 
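</p><p>To make that learn-by-watching loop concrete, it can be sketched in a few lines of Python. This is an illustrative sketch only, not the Georgia Tech team’s software; the Gesture record and the tablet and robot interfaces are assumed names.</p><pre>
# Illustrative sketch of the learn-by-watching loop described in this story.
# Not the Georgia Tech team's code; Gesture, tablet, and robot are assumptions.
from dataclasses import dataclass

@dataclass
class Gesture:
    start: tuple        # (x, y) where the finger touched down
    end: tuple          # (x, y) where the finger lifted off
    duration: float     # seconds from touch to release

def observe_demonstration(touch_events):
    """Record a 'snapshot' of a demonstrated swipe: start, stop, timing."""
    first, last = touch_events[0], touch_events[-1]
    return Gesture(start=(first.x, first.y),
                   end=(last.x, last.y),
                   duration=last.t - first.t)

def imitate(gesture, tablet, robot):
    """Replay the demonstrated swipe, then react socially to the outcome."""
    score_before = tablet.read_score()
    tablet.swipe(gesture.start, gesture.end, gesture.duration)
    if tablet.read_score() > score_before:
        robot.celebrate()     # happy sound, dance, lit-up eyes
    else:
        robot.shake_head()    # the bird was a dud
</pre><p>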
The findings were presented in June at the <a href="http://www.resna.org/conference/">Rehabilitation Engineering and Assistive Technology Society of North America (RESNA) 2014 Annual Conference</a> in Denver. &nbsp;</p><p><em>This research was partially supported by the National Science Foundation (NSF) under grant 1208287. Any conclusions expressed are those of the principal investigator and may not necessarily represent the official views of the NSF.</em></p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1404986775</created>  <gmt_created>2014-07-10 10:06:15</gmt_created>  <changed>1475896605</changed>  <gmt_changed>2016-10-08 03:16:45</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[With the help of a smart tablet and Angry Birds, end users can now program a robot to learn new tasks.]]></teaser>  <type>news</type>  <sentence><![CDATA[With the help of a smart tablet and Angry Birds, end users can now program a robot to learn new tasks.]]></sentence>  <summary><![CDATA[<p>With the help of a smart tablet and Angry Birds, children can now do something typically reserved for engineers and computer scientists: program a robot to learn new skills. The Georgia Institute of Technology project is designed to serve as a rehabilitation tool and to help kids with disabilities.</p>]]></summary>  <dateline>2014-07-10T00:00:00-04:00</dateline>  <iso_dateline>2014-07-10T00:00:00-04:00</iso_dateline>  <gmt_dateline>2014-07-10 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Georgia Tech team pairs humanoid with popular game to help  kids with rehabilitation]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />National Media Relations<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-385-2966</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>307701</item>          <item>307691</item>          <item>307711</item>          <item>307721</item>      </media>  <hg_media>          <item>          <nid>307701</nid>          <type>image</type>          <title><![CDATA[Robot Plays Angry Birds 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[screen_shot_2014-07-10_at_9.43.09_am.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/screen_shot_2014-07-10_at_9.43.09_am_0.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/screen_shot_2014-07-10_at_9.43.09_am_0.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/screen_shot_2014-07-10_at_9.43.09_am_0.png?itok=nFE9qTrf]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Robot Plays Angry Birds 2]]></image_alt>                    <created>1449244708</created>          <gmt_created>2015-12-04 15:58:28</gmt_created>          <changed>1475895017</changed>          <gmt_changed>2016-10-08 02:50:17</gmt_changed>      </item>          <item>          <nid>307691</nid>          <type>image</type>          <title><![CDATA[Robot Plays Angry Birds]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[screen_shot_2014-07-10_at_9.41.03_am.png]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/screen_shot_2014-07-10_at_9.41.03_am_0.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/screen_shot_2014-07-10_at_9.41.03_am_0.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/screen_shot_2014-07-10_at_9.41.03_am_0.png?itok=LVCCWrll]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Robot Plays Angry Birds]]></image_alt>                    <created>1449244708</created>          <gmt_created>2015-12-04 15:58:28</gmt_created>          <changed>1475895017</changed>          <gmt_changed>2016-10-08 02:50:17</gmt_changed>      </item>          <item>          <nid>307711</nid>          <type>image</type>          <title><![CDATA[Robot Plays Angry Birds 3]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[screen_shot_2014-07-10_at_9.44.03_am.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/screen_shot_2014-07-10_at_9.44.03_am_0.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/screen_shot_2014-07-10_at_9.44.03_am_0.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/screen_shot_2014-07-10_at_9.44.03_am_0.png?itok=mv6c3ZfR]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Robot Plays Angry Birds 3]]></image_alt>                    <created>1449244708</created>          <gmt_created>2015-12-04 15:58:28</gmt_created>          <changed>1475895017</changed>          <gmt_changed>2016-10-08 02:50:17</gmt_changed>      </item>          <item>          <nid>307721</nid>          <type>image</type>          <title><![CDATA[Robot Plays Angry Birds 4]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[hae_with_robot065.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/hae_with_robot065_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/hae_with_robot065_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/hae_with_robot065_0.jpg?itok=rz2J5Erh]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Robot Plays Angry Birds 4]]></image_alt>                    <created>1449244708</created>          <gmt_created>2015-12-04 15:58:28</gmt_created>          <changed>1475895017</changed>          <gmt_changed>2016-10-08 02:50:17</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.ece.gatech.edu/]]></url>        <title><![CDATA[School of Electrical and Computer Engineering]]></title>      </link>          <link>        <url><![CDATA[http://www.ece.gatech.edu/faculty-staff/fac_profiles/bio.php?id=135]]></url>        <title><![CDATA[Profile]]></title>      </link>          <link>        <url><![CDATA[http://robotics.gatech.edu/]]></url>        <title><![CDATA[Center for Robotics & Intelligent Machines]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category 
tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="97601"><![CDATA[Angry Birds]]></keyword>          <keyword tid="825"><![CDATA[Ayanna Howard]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>          <keyword tid="166855"><![CDATA[School of Electrical and Computer Engineering]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="266451">  <title><![CDATA[IRI Intros: 5 Questions with Henrik Christensen]]></title>  <uid>27268</uid>  <body><![CDATA[<p><em>You’ve probably heard that Georgia Tech has a number of&nbsp;<a href="http://www.gatech.edu/research/institutes">Interdisciplinary Research Institutes</a>&nbsp;(IRIs) – but do you know much about them?</em></p><p><em>This article is one in a series of Q&amp;As to introduce the Tech community to the 10 IRIs and their leaders. In this installment, Executive Director of the&nbsp;<a href="http://www.robotics.gatech.edu">Institute for Robotics and Intelligent Machines&nbsp;(IRIM)</a>&nbsp;Henrik Christensen answers questions about IRIM and also talks about&nbsp;<em>its efforts to support Georgia Tech faculty and students.&nbsp;</em></em></p><p>&nbsp;</p><p><strong>Q: What is the</strong><strong> Institute for Robotics and Intelligent Machines (</strong><strong>IRIM), and what are its core research areas?</strong></p><p><strong>A: </strong>The <a href="http://robotics.gatech.edu/">Institute for Robotics and Intelligent Machines</a> is a new IRI that integrates robotics research, education and outreach, and industry engagement across the College of Engineering, the College of Computing, the College of Sciences, and the Georgia Tech Research Institute (GTRI). Our work often involves labs and individual researchers in other Georgia Tech colleges and centers, as well.</p><p>We conduct research in mechanisms, control, perception, artificial intelligence (AI), and human–robot interaction (HRI) with a particular emphasis on human-centered robotics. The question, “How can we build robots that empower people in their daily lives, whether for service in the workplace or in the home, or for enjoyment in a leisure setting?” is central to our work.</p><p>Using robots makes it possible to compete with low-wage manual labor in other countries. It also creates new positions that replace the dirty, dull, and dangerous jobs in U.S. factories. Additionally, robotics technologies have made it possible to improve the quality of life in an aging society by providing services that allow people to remain autonomous as they lose various functions such as mobility and memory. 
Finally, our research leads to new types of autonomous systems to assist first responders and soldiers during interventions by increasing the distance between responders and the immediate danger, including fires, earthquakes, and explosives.</p><p>IRIM has three objectives: 1) to be the world leader in human-centered robotics, 2) to educate the best people to serve in academia and industry for next-generation robotic systems, and 3) to create new opportunities in robotics for industry and society at large, in both Georgia and beyond.</p><p><strong>Q: A lot seems to be going on in robotics these days. Can you summarize the big trends and Georgia Tech’s role with regard to those trends?</strong></p><p><strong>A:&nbsp;</strong>Robotics has seen tremendous growth in the past few years. Today, robots are used to re-shore jobs to the U.S. in industries such as automotive, aerospace, and electronics manufacturing. We have also seen the development of major new services for the home – from robot vacuum cleaners to autonomous transportation and personal assistance devices. And, of course, we have seen numerous robots used in Iraq and Afghanistan to make life a little safer for our soldiers.</p><p>Overall, we are seeing major growth in manufacturing, e-commerce, health care, and service industries.</p><p>The U.S. recently initiated a number of big programs in robotics, such as the National Robotics Initiative (NRI), which is sponsored jointly by the National Science Foundation, National Institutes of Health, the U.S. Department of Agriculture, and NASA. The NRI was launched on the basis of the <a href="http://robotics.gatech.edu/outreach/roadmap"><em>Roadmap for U.S. Robotics</em>,</a> a report initially published in 2009 and revised in 2013. Georgia Tech served as the coordinator of the development of both editions of this report. To support the NRI, a national network, the <a href="http://robotics.gatech.edu/outreach/VO">Robotics Virtual Organization</a>, was founded and is managed by Tech. Consequently, Tech is seen, in many respects, as the leader of the push for new robotics initiatives in the U.S. across research, education, and the translation of results.</p><p><strong>Q:</strong>&nbsp;<strong>How does IRIM support research?</strong></p><p><strong>A:</strong> IRIM supports the research of more than 60 faculty members and 140 graduate students across various colleges and GTRI in a number of ways.</p><p>First, we proactively identify major new funding areas and launch seed projects that allow Georgia Tech to be competitive when calls for proposals are issued. There are remarkably few opportunities for faculty to conduct exploratory research without funding constraints, so we try to identify these new opportunities early and build up results to ensure we can successfully compete for funds.</p><p>Additionally, we are developing an infrastructure that matches researchers with similar interests so, together, they have a more competitive edge when applying for major funding awards. Although our researchers are very good at pursuing grants, it is challenging, as a single applicant, to generate adequate support to build a successful proposal for major funding awards such as NSF’s Engineering Research Centers (ERCs) or Science and Technology Centers (STCs) grants. For example, it is difficult for one faculty member to build a complete manufacturing facility for new robotics research in the automotive industry. 
However, IRIM can provide a shared infrastructure that allows multiple researchers to pursue a larger research effort in a shared space.</p><p>IRIM is also committed to providing support to faculty pursuing major research opportunities through all phases of the process, from early research efforts and proposal writing to grant management and evaluation of broader impact and outreach. We would rather see our robotics faculty win a smaller number of major grants than a larger number of smaller grants because, comparatively, the smaller grants have too much overhead.</p><p>Additionally, IRIM facilitates opportunities for engagement in interdisciplinary activities through events such as weekly seminars and topical workshops throughout the fall and spring semesters.</p><p>Finally, our One Georgia Tech approach allows external stakeholders, especially our industry partners, the chance to work with IRIM to identify the individual or lab on campus that best matches their research needs.</p><p><strong>Q: How is IRIM furthering Georgia Tech’s academic mission?</strong></p><p><strong>A:&nbsp;</strong>&nbsp;Over the past few years, we have built a strong <a href="http://phdrobotics.gatech.edu/">Ph.D. program in robotics</a> in which we currently have close to 50 graduate students enrolled. These students are required to have an interdisciplinary focus and must choose coursework that involves three of five core robotics areas: mechanics, controls, perception, HRI, and AI and autonomy. Our interdisciplinary approach has proven to be very popular with students, as well as with employers.</p><p>Additionally, IRIM is working on the development of a professional master’s program in robotics. Georgia has a strong industry base related to robotics, and many of these companies would welcome the opportunity to have a continuing education program available locally for their employees. A professional master’s program would not only allow us to attract more students to Georgia Tech, it would also build new links to industrial companies from across the state.</p><p>IRIM also actively engages with undergraduate students enrolled in participating units (Interactive Computing, Electrical &amp; Computer Engineering, Mechanical Engineering, Biomedical Engineering, and Aerospace Engineering) through coursework and undergraduate research opportunities. This summer, we are launching an NSF-sponsored Summer Undergraduate Research in Engineering (SURE) Program for students to spend the summer on campus to conduct research with robotics faculty and graduate students. We see this program as a strong recruiting mechanism to attract the best students to Georgia Tech for graduate studies.</p><p><strong>Q:&nbsp;How does IRIM support industry engagement and community outreach?</strong></p><p><strong>A: &nbsp;</strong>IRIM has a proven track record of cultivating successful industry partnerships, including those with KUKA, Boeing, General Motors, BMW, PSA Peugeot Citroën, Google, Microsoft, iRobot, and Lockheed Martin.</p><p>Through a strong collaboration across academic units and GTRI, IRIM offers industry partners access to a broad research portfolio, as well as an abundance of beneficial services that span from basic research opportunities to full-product development solutions. Too often, innovations are lost in the abyss between basic research and applications. IRIM has the faculty, processes, and experience to ensure these innovative projects can be successful. Few other academic or research institutions in the U.S. 
have a comparable scope of expertise and options available to industry.</p><p>For broader community outreach, IRIM works closely with organizations across Georgia and the nation, such as high schools, to provide education on the impact of robotics with regard to everyday living. We do this through initiatives such as the <a href="http://www.robojackets.org/first-kickoff/">FIRST Robotics Competition</a>. The undergraduate robotics club, <a href="http://www.robojackets.org/">RoboJackets</a>, with support from IRIM, organizes the annual kickoff for this competition. In 2013, more than 1,000 high school students attended the event at Ferst Center for the Arts, and quite a few Georgia Tech students and faculty members are mentors for the FIRST team.</p><p>Additionally, in an effort to stimulate general interest in STEM subjects, as well as a specific interest in robotics, IRIM organizes regular school visits across Georgia during the year. Since the launch of <a href="http://robotics.gatech.edu/outreach/NRW">National Robotics Week</a> in 2010, IRIM has participated annually by sponsoring an open house at Tech and conducting lab tours and demonstrations for middle and high school students. More than 400 students participated in Tech's 2013 event held on April 11, with one group traveling from Tennessee to attend. Tours offered participants a chance to learn more about 46 different research projects in 16 different robotics labs on campus. We anticipate the 2014 event will be even bigger and better than last year!</p><p><strong>&nbsp;</strong></p>]]></body>  <author>Kirk Englehardt</author>  <status>1</status>  <created>1389628152</created>  <gmt_created>2014-01-13 15:49:12</gmt_created>  <changed>1475896540</changed>  <gmt_changed>2016-10-08 03:15:40</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Executive Director of the Institute for Robotics and Intelligent Machines (IRIM) Henrik Christensen answers questions about IRIM and also talks about its efforts to support Georgia Tech faculty and students.]]></teaser>  <type>news</type>  <sentence><![CDATA[Executive Director of the Institute for Robotics and Intelligent Machines (IRIM) Henrik Christensen answers questions about IRIM and also talks about its efforts to support Georgia Tech faculty and students.]]></sentence>  <summary><![CDATA[<p>IRI Intros Q&amp;A: Institute for Robotics and Intelligent Machines (IRIM)</p><p><em>You’ve probably heard that Georgia Tech has a number of <a href="http://www.gatech.edu/research/institutes">Interdisciplinary Research Institutes</a> (IRIs) – but do you know much about them? </em></p><p><em>This article is one in a series of Q&amp;As to introduce the Tech community to the 10 IRIs and their leaders. 
In this installment, Executive Director of the <a href="http://www.robotics.gatech.edu">Institute for Robotics and Intelligent Machines&nbsp;(IRIM)</a>&nbsp;Henrik Christensen answers questions about IRIM and also talks about&nbsp;<em>its efforts to support Georgia Tech faculty and students.&nbsp;</em></em></p>]]></summary>  <dateline>2013-01-13T00:00:00-05:00</dateline>  <iso_dateline>2013-01-13T00:00:00-05:00</iso_dateline>  <gmt_dateline>2013-01-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[kirkeng@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:kirkeng@gatech.edu">Kirk Englehardt</a></p><p>Director, Research Communications</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>266461</item>      </media>  <hg_media>          <item>          <nid>266461</nid>          <type>image</type>          <title><![CDATA[Henrik Christensen]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[christensen-henrik_1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/christensen-henrik_1_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/christensen-henrik_1_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/christensen-henrik_1_0.jpg?itok=zMLblMJG]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Henrik Christensen]]></image_alt>                    <created>1449244039</created>          <gmt_created>2015-12-04 15:47:19</gmt_created>          <changed>1475894953</changed>          <gmt_changed>2016-10-08 02:49:13</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.robotics.gatech.edu/]]></url>        <title><![CDATA[Robotics at Georgia Tech]]></title>      </link>          <link>        <url><![CDATA[http://www.robotics.gatech.edu/team/faculty]]></url>        <title><![CDATA[IRIM Faculty]]></title>      </link>          <link>        <url><![CDATA[http://www.gatech.edu/research/institutes]]></url>        <title><![CDATA[Interdisciplinary Research Institutes]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="42941"><![CDATA[Art Research]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="42941"><![CDATA[Art Research]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="11890"><![CDATA[henrik christensen]]></keyword>          <keyword tid="78811"><![CDATA[Institute for Robotics and Intelligent Machines]]></keyword>          <keyword tid="78271"><![CDATA[IRIM]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>          <topic tid="71881"><![CDATA[Science 
and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="246291">  <title><![CDATA[Humanoid Conference Gives Campus a Look at Robotic Future]]></title>  <uid>27560</uid>  <body><![CDATA[<p>Some of the most sophisticated and advanced robots in the world have arrived on campus for the IEEE-RAS International Conference on Humanoid Robots (<a href="http://www.humanoids2013.com/">Humanoids 2013</a>) at the Historic Academy of Medicine at Georgia Tech. The international event is focused on trends and technology for humanoids in the real world. The three-day conference, from October 15-17, features demonstrations, lectures and tours of Georgia Tech robot labs.</p><p>Georgia Tech Assistant Professor Mike Stilman is the general chair for the conference.</p><p>“This is a very exciting event both for the history of robotics worldwide and for education in engineering for all kids excited about new technology,” he said.</p><p>The demonstrations include Rethink Robotics’ Baxter robot, NAO from Aldebaran and South Korea’s Robotis.</p><p>Georgia Tech’s Ronald Arkin, a Regents Professor in the College of Computing, is hosting one of the conference’s three plenary sessions. He will focus on the ethical questions surrounding the potential creation of robotic platforms with lethal autonomy during a presentation titled “How to Not Build a Terminator.”</p><p>“Given the present pace, direction and funding of humanoid technological development, it seems that the science fiction vision of a Terminator robot is becoming more of a reality,” Arkin said. “Many researchers, perhaps unknowingly or unwittingly, are providing the capabilities to achieve such a platform.”</p><p>Other plenary sessions will discuss how to transfer human skills to robots and how to structure robotic thought and action through language in a new form of dialogue.</p><p>This is the first time in three years the annual event has been held in the United States, and the first time ever in Atlanta. The week will conclude with DARPA’s Robotics&nbsp;Challenge (DRC) Trials Preview Meeting on Friday, October 18, which will provide further details on the DRC Trials in December.</p><p>“This is a very special year for humanoid robotics across the world,” Stilman said. “The Robotics Challenge is leading robots that function as first responders to enter dangerous situations, such as Hurricane Katrina and Japan’s Fukushima Daiichi nuclear disaster. 
Humanoids with human and super-human capabilities will assist in future rescue missions, saving the lives of both disaster victims and rescue workers.”</p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1381938530</created>  <gmt_created>2013-10-16 15:48:50</gmt_created>  <changed>1475896509</changed>  <gmt_changed>2016-10-08 03:15:09</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Humanoids 2013 conference features some of the world's most sophisticated robots.]]></teaser>  <type>news</type>  <sentence><![CDATA[Humanoids 2013 conference features some of the world's most sophisticated robots.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2013-10-16T00:00:00-04:00</dateline>  <iso_dateline>2013-10-16T00:00:00-04:00</iso_dateline>  <gmt_dateline>2013-10-16 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />Media Relations<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-385-2966</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>246481</item>          <item>246421</item>          <item>246431</item>          <item>246411</item>          <item>246441</item>          <item>246391</item>          <item>246381</item>          <item>246541</item>      </media>  <hg_media>          <item>          <nid>246481</nid>          <type>image</type>          <title><![CDATA[Humanoids 2013 NAO]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[humanoids-011_0.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/humanoids-011_0_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/humanoids-011_0_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/humanoids-011_0_0.jpg?itok=0nqwM0NJ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Humanoids 2013 NAO]]></image_alt>                    <created>1449243758</created>          <gmt_created>2015-12-04 15:42:38</gmt_created>          <changed>1475894924</changed>          <gmt_changed>2016-10-08 02:48:44</gmt_changed>      </item>          <item>          <nid>246421</nid>          <type>image</type>          <title><![CDATA[Humanoids 2013 Robothespian]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[humanoids-008.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/humanoids-008_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/humanoids-008_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/humanoids-008_0.jpg?itok=O_QYDAjy]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Humanoids 2013 Robothespian]]></image_alt>                    <created>1449243758</created>          <gmt_created>2015-12-04 15:42:38</gmt_created>          <changed>1475894924</changed>          <gmt_changed>2016-10-08 02:48:44</gmt_changed>      </item>          <item>          <nid>246431</nid>          <type>image</type>          <title><![CDATA[Humanoids 2013 
Baxter]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[humanoids-007.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/humanoids-007_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/humanoids-007_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/humanoids-007_0.jpg?itok=1ttBmbkV]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Humanoids 2013 Baxter]]></image_alt>                    <created>1449243758</created>          <gmt_created>2015-12-04 15:42:38</gmt_created>          <changed>1475894924</changed>          <gmt_changed>2016-10-08 02:48:44</gmt_changed>      </item>          <item>          <nid>246411</nid>          <type>image</type>          <title><![CDATA[Humanoids 2013 Zeno]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[humanoids-006.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/humanoids-006_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/humanoids-006_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/humanoids-006_0.jpg?itok=tBa5n7cB]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Humanoids 2013 Zeno]]></image_alt>                    <created>1449243758</created>          <gmt_created>2015-12-04 15:42:38</gmt_created>          <changed>1475894924</changed>          <gmt_changed>2016-10-08 02:48:44</gmt_changed>      </item>          <item>          <nid>246441</nid>          <type>image</type>          <title><![CDATA[Humanoids 2013 Socibot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[humanoids-010.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/humanoids-010_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/humanoids-010_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/humanoids-010_0.jpg?itok=7ftL9Ue1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Humanoids 2013 Socibot]]></image_alt>                    <created>1449243758</created>          <gmt_created>2015-12-04 15:42:38</gmt_created>          <changed>1475894924</changed>          <gmt_changed>2016-10-08 02:48:44</gmt_changed>      </item>          <item>          <nid>246391</nid>          <type>image</type>          <title><![CDATA[Humanoids 2013 Lab Tour 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[humanoids-001.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/humanoids-001_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/humanoids-001_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/humanoids-001_0.jpg?itok=4ln6aW7G]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Humanoids 2013 Lab Tour 2]]></image_alt>              
      <created>1449243758</created>          <gmt_created>2015-12-04 15:42:38</gmt_created>          <changed>1475894924</changed>          <gmt_changed>2016-10-08 02:48:44</gmt_changed>      </item>          <item>          <nid>246381</nid>          <type>image</type>          <title><![CDATA[Humanoids 2013 Lab Tour 1]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[humanoids-003.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/humanoids-003_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/humanoids-003_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/humanoids-003_0.jpg?itok=owFBr6Ye]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Humanoids 2013 Lab Tour 1]]></image_alt>                    <created>1449243758</created>          <gmt_created>2015-12-04 15:42:38</gmt_created>          <changed>1475894924</changed>          <gmt_changed>2016-10-08 02:48:44</gmt_changed>      </item>          <item>          <nid>246541</nid>          <type>image</type>          <title><![CDATA[Humanoids 2013 Group]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[humanoids2013-groupphoto1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/humanoids2013-groupphoto1_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/humanoids2013-groupphoto1_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/humanoids2013-groupphoto1_0.jpg?itok=tFEg1oHW]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Humanoids 2013 Group]]></image_alt>                    <created>1449243758</created>          <gmt_created>2015-12-04 15:42:38</gmt_created>          <changed>1475894924</changed>          <gmt_changed>2016-10-08 02:48:44</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.humanoids2013.com/]]></url>        <title><![CDATA[Humanoids 2013 Website]]></title>      </link>          <link>        <url><![CDATA[http://robotics.gatech.edu/]]></url>        <title><![CDATA[Center for Robotics & Intelligent Machines]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="1270"><![CDATA[conference]]></keyword>          <keyword tid="77191"><![CDATA[Humanoids]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  
<userdata><![CDATA[]]></userdata></node><node id="217981">  <title><![CDATA[GTRI Agile Aperture Antenna Technology is Tested on an Autonomous Ocean Vehicle]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Antenna technology originally developed to quickly send and receive information through a software-defined military radio may soon be used to transmit ocean data from a wave-powered autonomous surface vehicle. The technology, the lowest-power method for maintaining a satellite uplink, automatically compensates for the movement of the antenna as the boat bobs around on the ocean surface.</p><p>The Agile Aperture Antenna technology developed by the Georgia Tech Research Institute (GTRI) is expected to provide a more reliable and faster method of transmitting video, audio and environmental data – such as salinity, temperature, fluorescence and dissolved oxygen – from an ocean vehicle to land via satellite.</p><p>In December 2012, the antenna was attached to a Wave Glider vehicle and placed into the ocean off the coast of Hawaii. The Wave Glider, an autonomous marine robot developed by California-based Liquid Robotics, Inc., uses only the ocean’s endless supply of wave energy for propulsion. The Wave Glider can collect ocean data for a wide range of applications, including meteorology, oceanography, national security and offshore energy. Solar panels on the vehicle power the antenna, which requires only 0.25 watts of power and can switch up to 1,000 beams per second.</p><p>During the demonstration, the antenna maintained a satellite link with a sustained data upload rate of 200 kilobits per second (Kbps) for several hours, despite the Wave Glider rolling and yawing back and forth on the waves. The Agile Aperture Antenna required significantly less power and space to achieve these test results than a gimbaled antenna or a phased array solution.</p><p>“Because the antenna autonomously tracked its own position and orientation relative to the satellite and steered itself to stay connected, it maintained a highly directional antenna beam to the satellite as the craft moved around, which enabled data transfers near the maximum expected rate of 240 Kbps,” said Gregory Kiesel, a GTRI senior research engineer. “Antenna integration was also easy because the craft did not need to communicate with the antenna to maintain the connection.”</p><p>The Agile Aperture Antenna requires less power and takes up less space than traditional antenna solutions including mechanical systems and phased-array antennas. The technology also exhibits higher reliability than mechanical systems and is less expensive than phased-array antennas.</p><p>“The combination of the Wave Glider’s long duration and intelligent autonomy capabilities through GTRI’s new Agile Aperture Antenna provides customers with increased communications precision through the roughest of seas,” said Richard “Scoop” Jackson, director of federal business development with Liquid Robotics. 
“The availability of the GTRI Agile Aperture Antenna on the Wave Glider SV Series comes at a perfect time when deployment of autonomous surface vehicles for maritime security is rapidly increasing due to the cost and capability advantages.”</p><p>The antenna’s performance can be optimized because it is reconfigurable, which means the electrical structure of the antenna can be easily changed – even while in operation in the field.</p><p>The antenna consists of a thin dielectric substrate that supports an array of square, metallic patches that can be switched on or off as needed to provide the proper configuration. The researchers measure the antenna patterns to determine which switches should be open and which should be closed to optimize the antenna performance.</p><p>“Our biggest challenge with this project has been to quickly control the switches on the antenna in a low-power fashion without impacting antenna performance,” said Kiesel.</p><p>While the antenna remained in a fixed position for the recent demonstration, for future tests the researchers may add a low-power mechanical system to slowly raise the antenna to an operational angle and then stow it to a position flush with the surface of the Wave Glider when the antenna isn’t needed. This technology would make it harder to visually detect the Wave Glider.</p><p>The original antenna technology was developed by GTRI Advanced Concepts Laboratory director Lon Pringle, principal research engineer Jim Maloney and former principal research engineer Paul Friederich.</p><p>“We anticipate that our agile aperture antenna technology will begin wide deployment on unmanned surface vehicles in the next year and on unmanned air vehicles within two years given its advantages of being low power and lightweight,” noted Maloney. &nbsp;</p><p>In addition to those already mentioned, GTRI researchers Don Davis, Matthew Habib, Bill Hunter and Tim Richardson also contributed to this research.</p><p><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia&nbsp; 30332-0181</strong></p><p><strong>Media Relations Contacts</strong>: Lance Wallace (<a href="mailto:lance.wallace@gtri.gatech.edu">lance.wallace@gtri.gatech.edu</a>) (404-407-7280) or John Toon (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>) (404-894-6986).</p><p><strong>Writer</strong>: Abby Robinson</p><p>&nbsp;</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1371567497</created>  <gmt_created>2013-06-18 14:58:17</gmt_created>  <changed>1475896463</changed>  <gmt_changed>2016-10-08 03:14:23</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[An antenna designed at Georgia Tech is being tested on an autonomous ocean vehicle.]]></teaser>  <type>news</type>  <sentence><![CDATA[An antenna designed at Georgia Tech is being tested on an autonomous ocean vehicle.]]></sentence>  <summary><![CDATA[<p>Antenna technology originally developed to quickly send and receive information through a software-defined military radio may soon be used to transmit ocean data from a wave-powered autonomous surface vehicle. 
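</p><p>The switch-selection idea described in the article (measure the patterns, then choose which patches to switch on so the beam points at the satellite) can be illustrated with a short Python sketch. The beam-table format and all names below are assumptions for illustration, not GTRI’s implementation.</p><pre>
# Hypothetical sketch: pick the switch configuration whose measured beam
# points closest to the satellite, given the craft's current attitude.
import math

def angular_error(beam_dir, sat_dir):
    """Angle in radians between two unit vectors (x, y, z)."""
    dot = sum(b * s for b, s in zip(beam_dir, sat_dir))
    return math.acos(max(-1.0, min(1.0, dot)))

def select_beam(beam_table, sat_dir):
    """beam_table maps a switch configuration (tuple of 0/1 patch states)
    to the measured main-beam direction for that configuration."""
    return min(beam_table, key=lambda cfg: angular_error(beam_table[cfg], sat_dir))

# Example: two candidate configurations, satellite nearly overhead.
beams = {
    (1, 0, 1, 0): (0.0, 0.0, 1.0),      # boresight beam
    (0, 1, 1, 0): (0.5, 0.0, 0.866),    # beam tilted about 30 degrees
}
print(select_beam(beams, (0.05, 0.0, 0.9987)))   # -> (1, 0, 1, 0)
</pre><p>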
The technology, the lowest-power method for maintaining a satellite uplink, automatically compensates for the movement of the antenna as the boat bobs around on the ocean surface.</p>]]></summary>  <dateline>2013-06-18T00:00:00-04:00</dateline>  <iso_dateline>2013-06-18T00:00:00-04:00</iso_dateline>  <gmt_dateline>2013-06-18 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>217921</item>          <item>217931</item>          <item>217941</item>          <item>217901</item>          <item>217911</item>      </media>  <hg_media>          <item>          <nid>217921</nid>          <type>image</type>          <title><![CDATA[Agile Aperture Antenna]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[agile-aperture.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/agile-aperture_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/agile-aperture_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/agile-aperture_0.jpg?itok=0WBZlHQL]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Agile Aperture Antenna]]></image_alt>                    <created>1449180130</created>          <gmt_created>2015-12-03 22:02:10</gmt_created>          <changed>1475894885</changed>          <gmt_changed>2016-10-08 02:48:05</gmt_changed>      </item>          <item>          <nid>217931</nid>          <type>image</type>          <title><![CDATA[Agile Aperture Antenna2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[agile-aperture96.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/agile-aperture96_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/agile-aperture96_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/agile-aperture96_0.jpg?itok=lH1kqUl_]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Agile Aperture Antenna2]]></image_alt>                    <created>1449180130</created>          <gmt_created>2015-12-03 22:02:10</gmt_created>          <changed>1475894885</changed>          <gmt_changed>2016-10-08 02:48:05</gmt_changed>      </item>          <item>          <nid>217941</nid>          <type>image</type>          <title><![CDATA[Agile Aperture Antenna3]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[agile-aperture832.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/agile-aperture832_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/agile-aperture832_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/agile-aperture832_0.jpg?itok=3o2m8NDD]]></image_740>            
<image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Agile Aperture Antenna3]]></image_alt>                    <created>1449180130</created>          <gmt_created>2015-12-03 22:02:10</gmt_created>          <changed>1475894885</changed>          <gmt_changed>2016-10-08 02:48:05</gmt_changed>      </item>          <item>          <nid>217901</nid>          <type>image</type>          <title><![CDATA[Wave Glider]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[agile-aperture444.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/agile-aperture444_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/agile-aperture444_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/agile-aperture444_0.jpg?itok=gJAS4aBW]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Wave Glider]]></image_alt>                    <created>1449180130</created>          <gmt_created>2015-12-03 22:02:10</gmt_created>          <changed>1475894885</changed>          <gmt_changed>2016-10-08 02:48:05</gmt_changed>      </item>          <item>          <nid>217911</nid>          <type>image</type>          <title><![CDATA[Wave Glider2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[agile-aperture705.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/agile-aperture705_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/agile-aperture705_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/agile-aperture705_0.jpg?itok=tgvfORsy]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Wave Glider2]]></image_alt>                    <created>1449180130</created>          <gmt_created>2015-12-03 22:02:10</gmt_created>          <changed>1475894885</changed>          <gmt_changed>2016-10-08 02:48:05</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="68051"><![CDATA[Agile Aperture Antenna]]></keyword>          <keyword tid="7264"><![CDATA[autonomous]]></keyword>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="68041"><![CDATA[wave glider]]></keyword>      </keywords>  <core_research_areas>          <term tid="39481"><![CDATA[National Security]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="216371">  <title><![CDATA[Model Finds Common Muscle Control Patterns Governing the Motion of Swimming Animals]]></title>  
<uid>27303</uid>  <body><![CDATA[<p>What do swimmers like trout, eels and sandfish lizards have in common? According to a new study, the similar timing patterns that these animals use to contract their muscles and produce undulatory swimming motions can be explained using a simple model. Scientists have now applied the new model to understand the connection between electrical signals and body movement in the sandfish.</p><p>Most swimming creatures rely on an undulating pattern of body movement to propel themselves through fluids. Though differences in body flexibility may lead to different swimming styles, scientists have found “neuromechanical phase lags” in nearly all swimmers. These lags are characterized by a wave of muscle activation that travels faster down the body than the wave of body curvature.</p><p>A study of the sandfish lizard – which “swims” through sand – led to development of the new model, which researchers believe could also be used to study other swimming animals. Beyond assisting the study of locomotion in a wide range of animals, the findings could also help researchers design efficient swimming robots.</p><p>“A graduate student in our group, Yang Ding, who is now at the University of Southern California, was able to develop a theory that could explain the kinematics of how this animal swims as well as the timing of the nervous system control signals,” said <a href="https://www.physics.gatech.edu/user/daniel-goldman">Daniel Goldman</a>, an associate professor in the <a href="http://www.physics.gatech.edu/">School of Physics</a> at the Georgia Institute of Technology. “For animals swimming in fluids using an undulating movement, there are basic physical constraints on how they must activate their muscles. We think we have uncovered an important mechanism that governs this kind of swimming.”</p><p>The research was reported June 3 in the early edition of the journal <em>Proceedings of the National Academy of Sciences</em>. It was sponsored by the National Science Foundation’s Physics of Living Systems program, the Micro Autonomous Systems and Technology (MAST) program of the Army Research Office, and the Burroughs Wellcome Fund.</p><p>Undulatory locomotion is a gait in which thrust is produced in the opposite direction from a traveling wave of body bending. Because it is so commonly used by animals, this mode of locomotion has been widely used for studying the neuromechanical principles of movement.</p><p>Sarah Sharpe, the paper’s second author and a graduate student in Georgia Tech’s Interdisciplinary Bioengineering Program, led laboratory experiments studying undulatory swimming in sandfish lizards. She used X-ray imaging to visualize how the animals swam through sand that was composed of tiny glass spheres.</p><p>At the same time their swimming movements were being tracked, a set of four hair-thin electrodes implanted in the lizards’ bodies was providing information on when their muscles were activated. The two information sources allowed the researchers to compare the electrical muscle activity to the lizards’ body motion.</p><p>“The lizards propagate a wave of muscle activations, contracting the muscles close to their heads first, then the muscles at the midpoint of their body, then their tail,” said Sharpe. “They send a wave of muscle contraction down their bodies, which creates a wave of curvature that allows them to swim.
This wave of activation travels faster than the wave of curvature down the body, resulting in different timing relationships, known as phase differences, between muscle contractions and bending along the body.”</p><p>Sand acts like a frictional fluid as the sandfish swims through it. However, a sandfish swimming through sand is simpler to model than a fish swimming through water because the sand lacks the vortices and other complex behavior of water – and the friction of the sand eliminates inertia.</p><p>“Theoretically, it is difficult to calculate all of the forces acting on a fish or an eel swimming in a real fluid,” said Goldman. “But for a sandfish, you can calculate pretty much everything.”</p><p>The relative simplicity of the system allowed the research team – which also included Georgia Tech professor Kurt Wiesenfeld – to develop a simple model showing how the muscle activation relates to motion. The model showed that combining synchronized torques from distant points in the lizards’ bodies with local traveling torques is what creates the neuromechanical phase lag.</p><p>“This is one of the simplest, if not the simplest, models of swimming that reproduces the neuromechanical phase lag phenomenon,” Sharpe said. “All we really had to pay attention to was the external forces acting on an animal’s body. We realized that this timing relationship would emerge for any undulatory animal with distributed forces along its body. Understanding this concept can be used as the foundation to begin understanding timing patterns in all other swimmers.”</p><p>The sandfish swims using a simple single-period sinusoidal wave with constant amplitude. A key finding that facilitated the model’s development was that the sandfish’s body is extremely flexible, allowing internal forces – body stiffness – to be ignored.</p><p>“This animal turns out to be like a little limp noodle,” said Goldman. “Having that result in the theory makes everything else pop out.”</p><p>The model shows that the waveform used by the sandfish should allow it to swim the farthest with the least expenditure of energy. Swimming robots adopting the same waveform should therefore be able to maximize their range.</p><p>Goldman and his colleagues have been studying the sandfish, a native of the northern African desert, for more than six years.</p><p>“Sandfish are among the champions of all sand diggers, swimmers and burrowers,” said Goldman. “This lizard has provided us with an interesting entry point into swimming because its environment is surprisingly simple and behavior is simple. It turns out that this little sand-dweller may be able to tell us things about swimming more generally.”</p><p><em>This research has been supported by the National Science Foundation Physics of Living Systems (PoLS) under grants PHY-0749991 and PHY-1150760, by the U.S. Army Research Laboratory’s (ARL) Micro Autonomous Systems and Technology (MAST) Program under cooperative agreement W911NF-11-1-0514, and by the Burroughs Wellcome Fund Career Award.
Any conclusions are those of the authors and do not necessarily represent the official views of the NSF or ARL.</em></p><p><strong>CITATION</strong>: Yang Ding, Sarah Sharpe, Kurt Wiesenfeld and Daniel Goldman, “Emergence of the advancing neuromechanical phase in resistive force dominated medium,” (Proceedings of the National Academy of Sciences, 2013).</p><p><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia&nbsp; 30332-0181</strong><br /><br /><strong>Media Relations Contact</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>)<br /><strong>Writer</strong>: John Toon<br /><br /></p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1370360213</created>  <gmt_created>2013-06-04 15:36:53</gmt_created>  <changed>1475896460</changed>  <gmt_changed>2016-10-08 03:14:20</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A new study shows that swimming animals use similar timing patterns to contract their muscles]]></teaser>  <type>news</type>  <sentence><![CDATA[A new study shows that swimming animals use similar timing patterns to contract their muscles]]></sentence>  <summary><![CDATA[<p>What do swimmers like trout, eels and sandfish lizards have in common? According to a new study, the similar timing patterns that these animals use to contract their muscles and produce undulatory swimming motions can be explained using a simple model. Scientists have now applied the new model to understand the connection between electrical signals and body movement in the sandfish.</p>]]></summary>  <dateline>2013-06-04T00:00:00-04:00</dateline>  <iso_dateline>2013-06-04T00:00:00-04:00</iso_dateline>  <gmt_dateline>2013-06-04 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>216341</item>          <item>216351</item>          <item>216361</item>      </media>  <hg_media>          <item>          <nid>216341</nid>          <type>image</type>          <title><![CDATA[X-ray of Sandfish Swimming]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[sandfish5.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/sandfish5_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/sandfish5_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/sandfish5_0.jpg?itok=NUqGf2Gk]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[X-ray of Sandfish Swimming]]></image_alt>                    <created>1449180114</created>          <gmt_created>2015-12-03 22:01:54</gmt_created>          <changed>1475894882</changed>          <gmt_changed>2016-10-08 02:48:02</gmt_changed>      </item>          <item>          <nid>216351</nid>          <type>image</type>          <title><![CDATA[Sandfish Lizard]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[sandfish54.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/sandfish54_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/sandfish54_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/sandfish54_1.jpg?itok=Tz15gpUy]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sandfish Lizard]]></image_alt>                    <created>1449180114</created>          <gmt_created>2015-12-03 22:01:54</gmt_created>          <changed>1475894882</changed>          <gmt_changed>2016-10-08 02:48:02</gmt_changed>      </item>          <item>          <nid>216361</nid>          <type>image</type>          <title><![CDATA[Sandfish Lizard]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[sandfish77.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/sandfish77_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/sandfish77_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/sandfish77_0.jpg?itok=4hL_pRy9]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sandfish Lizard]]></image_alt>                    <created>1449180114</created>          <gmt_created>2015-12-03 22:01:54</gmt_created>          <changed>1475894882</changed>          <gmt_changed>2016-10-08 02:48:02</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="12040"><![CDATA[Daniel Goldman]]></keyword>          <keyword tid="169581"><![CDATA[sandfish]]></keyword>          <keyword tid="166937"><![CDATA[School of Physics]]></keyword>          <keyword tid="167350"><![CDATA[swimming]]></keyword>          <keyword tid="67541"><![CDATA[undulatory swimming]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="213701">  <title><![CDATA[Principles of Ant Locomotion Could Help Future Robot Teams Work Underground]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Future teams of subterranean search and rescue robots may owe their success to the lowly fire ant, a much despised insect whose painful bites and extensive networks of underground tunnels are all-too-familiar to people living in the southern United States.</p><ul><li><a href="http://youtu.be/3TQzY_HRAgE">Watch</a> a YouTube video of this project.</li></ul><p>By studying fire ants in the laboratory using video tracking equipment and X-ray computed tomography, researchers have uncovered fundamental principles of locomotion that robot teams could one day use to travel quickly and easily through 
underground tunnels. Among the principles is building tunnel environments that aid movement by limiting slips and falls and by reducing the need for complex neural processing.</p><p>Among the study’s surprises was the first observation that ants in confined spaces use their antennae for locomotion as well as for sensing the environment.</p><p>“Our hypothesis is that the ants are creating their environment in just the right way to allow them to move up and down rapidly with a minimal amount of neural control,” said <a href="https://www.physics.gatech.edu/user/daniel-goldman">Daniel Goldman</a>, an associate professor in the <a href="http://www.physics.gatech.edu/">School of Physics</a> at the Georgia Institute of Technology, and one of the paper’s co-authors. “The environment allows the ants to make missteps and not suffer for them. These ants can teach us some remarkably effective tricks for maneuvering in subterranean environments.”</p><p>The research was reported May 20 in the early edition of the journal <em>Proceedings of the National Academy of Sciences</em>. The work was sponsored by the National Science Foundation’s Physics of Living Systems program.</p><p>In a series of studies carried out by graduate research assistant Nick Gravish, groups of fire ants (<em>Solenopsis invicta</em>) were placed into tubes of soil and allowed to dig tunnels for 20 hours. To simulate a range of environmental conditions, Gravish and postdoctoral fellow Daria Monaenkova varied the size of the soil particles from 50 microns on up to 600 microns, and also altered the moisture content from 1 to 20 percent.</p><p>While the variations in particle size and moisture content did produce changes in the volume of tunnels produced and the depth that the ants dug, the diameters of the tunnels remained constant – and comparable to the length of the creatures’ own bodies: about 3.5 millimeters.</p><p>“Independent of whether the soil particles were as large as the animals’ heads or whether they were fine powder, or whether the soil was damp or contained very little moisture, the tunnel size was always the same within a tight range,” said Goldman. “The size of the tunnels appears to be a design principle used by the ants, something that they were controlling for.”</p><p>Gravish believes such a scaling effect allows the ants to make best use of their antennae, limbs and body to rapidly ascend and descend in the tunnels by interacting with the walls and limiting the range of possible missteps.</p><p>“In these subterranean environments where their leg motions are certainly hindered, we see that the speeds at which these ants can run are the same,” he said. “The tunnel size seems to have little, if any, effect on locomotion as defined by speed.”</p><p>The researchers used X-ray computed tomography to study tunnels the ants built in the test chambers, gathering 168 observations. They also used video tracking equipment to collect data on ants moving through tunnels made between two clear plates – much like “ant farms” sold for children – and through a maze of glass tubes of differing diameters.</p><p>The maze was mounted on an air piston that was periodically fired, dropping the maze at an acceleration of as much as 27 times that of gravity. The sudden movement caused about half of the ants in the tubes to lose their footing and begin to fall.
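</p><p>A rough calculation suggests why a body-scale tube makes falls so recoverable. The short Python sketch below is an illustrative toy, not the researchers’ model: it assumes a falling ant swings an appendage in a random direction at regular intervals and arrests its fall once an appendage can span the gap to the wall. The appendage reach, attempt rate and trial count are invented parameters – only the roughly body-scale, 3.5-millimeter tunnel diameter comes from the measurements described above.</p><pre><code>
import math
import random

G = 9.81           # gravity (m/s^2)
REACH = 2.0e-3     # assumed appendage reach from the body axis (m)
ATTEMPT_DT = 0.01  # assumed interval between grab attempts (s)
TRIALS = 10_000

def mean_fall(diameter):
    """Average free-fall distance before a random grab spans the gap."""
    gap = diameter / 2.0  # body axis assumed on the tunnel centerline
    total = 0.0
    for _ in range(TRIALS):
        t = 0.0
        while True:
            t += ATTEMPT_DT
            # Random appendage direction; its radial component must
            # cover the gap between the body axis and the tunnel wall.
            if REACH * math.cos(random.uniform(0.0, math.pi / 2)) >= gap:
                break
        total += 0.5 * G * t * t  # distance fallen by the time of the catch
    return total / TRIALS

for d_mm in (3.5, 3.8, 7.0):
    if d_mm * 1e-3 / 2.0 >= REACH:
        print(f"{d_mm} mm tube: wall out of reach -- no arrest in this model")
    else:
        print(f"{d_mm} mm tube: mean fall ~{mean_fall(d_mm * 1e-3) * 1000:.1f} mm")
</code></pre><p>In this caricature, a tube of roughly body scale is caught within a few grab attempts, a slightly wider one takes markedly longer, and a tube twice as wide cannot be caught at all.</p><p>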
That led to one of the study’s most surprising findings: the creatures used their antennae to help grab onto the tube walls as they fell.</p><p>“A lot of us who have studied social insects for a long time have never seen antennae used in that way,” said <a href="http://www.biology.gatech.edu/people/michael-goodisman">Michael Goodisman</a>, a professor in the Georgia Tech <a href="http://www.biology.gatech.edu/">School of Biology</a> and one of the paper’s other co-authors. “It’s incredible that they catch themselves with their antennae. This is an adaptive behavior that we never would have expected.”</p><p>By analyzing ants falling in the glass tubes, the researchers determined that the tube diameter played a key role in whether the animals could arrest their fall.</p><p>In future studies, the researchers plan to explore how the ants excavate their tunnel networks, which involves moving massive amounts of soil. That soil is the source of the large mounds for which fire ants are known.</p><p>While the research focused on understanding the principles behind how ants move in confined spaces, the results could have implications for future teams of small robots.</p><p>“The problems that the ants face are the same kinds of problems that a digging robot working in a confined space would potentially face – the need for rapid movement, stability and safety – all with limited sensing and brain power,” said Goodisman. “If we want to build machines that dig, we can build in controls like these ants have.” &nbsp;</p><p>Why use fire ants for studying underground locomotion?</p><p>“These animals dig virtually non-stop, and they are good, repeatable study subjects,” Goodisman explained. “And they are very convenient for us to study. We can go outside the laboratory door and collect them virtually anywhere.”<br /><br /><em>The research described here has been sponsored by the National Science Foundation (NSF) under grant POLS 095765, and by the Burroughs Wellcome Fund. 
The findings and conclusions are those of the authors and do not necessarily represent the official views of the NSF.</em></p><p><strong>CITATION</strong>: Nick Gravish, et al., “Climbing, falling and jamming during ant locomotion in confined environments,” (Proceedings of the National Academy of Sciences, 2013).<br /><br /><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia&nbsp; 30332-0181</strong><br /><br /><strong>Media Relations Contact</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>)<br /><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1368996773</created>  <gmt_created>2013-05-19 20:52:53</gmt_created>  <changed>1475896456</changed>  <gmt_changed>2016-10-08 03:14:16</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Future teams of subterranean robots could benefit from research into how ants move in confined spaces.]]></teaser>  <type>news</type>  <sentence><![CDATA[Future teams of subterranean robots could benefit from research into how ants move in confined spaces.]]></sentence>  <summary><![CDATA[<p>Future teams of subterranean search and rescue robots may owe their success to the lowly fire ant, a much despised insect whose painful bites and extensive networks of underground tunnels are all-too-familiar to people living in the southern United States.</p>]]></summary>  <dateline>2013-05-20T00:00:00-04:00</dateline>  <iso_dateline>2013-05-20T00:00:00-04:00</iso_dateline>  <gmt_dateline>2013-05-20 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>213651</item>          <item>213671</item>          <item>213681</item>          <item>213661</item>          <item>213641</item>          <item>213631</item>      </media>  <hg_media>          <item>          <nid>213651</nid>          <type>image</type>          <title><![CDATA[Confined Spaces Locomotion - Researchers]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ant-locomotion142.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ant-locomotion142_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ant-locomotion142_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ant-locomotion142_0.jpg?itok=8paXoraL]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Confined Spaces Locomotion - Researchers]]></image_alt>                    <created>1449180076</created>          <gmt_created>2015-12-03 22:01:16</gmt_created>          <changed>1475894876</changed>          <gmt_changed>2016-10-08 02:47:56</gmt_changed>      </item>          <item>          <nid>213671</nid>          <type>image</type>          <title><![CDATA[Confined Spaces Locomotion - Tubes]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ant-locomotion198.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/ant-locomotion198_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ant-locomotion198_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ant-locomotion198_0.jpg?itok=ekyvB9aJ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Confined Spaces Locomotion - Tubes]]></image_alt>                    <created>1449180076</created>          <gmt_created>2015-12-03 22:01:16</gmt_created>          <changed>1475894876</changed>          <gmt_changed>2016-10-08 02:47:56</gmt_changed>      </item>          <item>          <nid>213681</nid>          <type>image</type>          <title><![CDATA[Confined Spaces Locomotion - Ants]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tunneling-ants.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tunneling-ants_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tunneling-ants_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tunneling-ants_0.jpg?itok=Zy4NCzqk]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Confined Spaces Locomotion - Ants]]></image_alt>                    <created>1449180096</created>          <gmt_created>2015-12-03 22:01:36</gmt_created>          <changed>1475894876</changed>          <gmt_changed>2016-10-08 02:47:56</gmt_changed>      </item>          <item>          <nid>213661</nid>          <type>image</type>          <title><![CDATA[Confined Spaces Locomotion - Nests]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ant-locomotion184.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ant-locomotion184_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ant-locomotion184_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ant-locomotion184_0.jpg?itok=H3inm99F]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Confined Spaces Locomotion - Nests]]></image_alt>                    <created>1449180076</created>          <gmt_created>2015-12-03 22:01:16</gmt_created>          <changed>1475894876</changed>          <gmt_changed>2016-10-08 02:47:56</gmt_changed>      </item>          <item>          <nid>213641</nid>          <type>image</type>          <title><![CDATA[Confined Spaces Locomotion - Team2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ant-locomotion104.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ant-locomotion104_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ant-locomotion104_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ant-locomotion104_0.jpg?itok=Ch90cY8m]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Confined Spaces Locomotion - Team2]]></image_alt>                    
<created>1449180076</created>          <gmt_created>2015-12-03 22:01:16</gmt_created>          <changed>1475894876</changed>          <gmt_changed>2016-10-08 02:47:56</gmt_changed>      </item>          <item>          <nid>213631</nid>          <type>image</type>          <title><![CDATA[Confined Spaces Locomotion - Team]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ant-locomotion21.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ant-locomotion21_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ant-locomotion21_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ant-locomotion21_0.jpg?itok=7TwxjWjt]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Confined Spaces Locomotion - Team]]></image_alt>                    <created>1449180076</created>          <gmt_created>2015-12-03 22:01:16</gmt_created>          <changed>1475894876</changed>          <gmt_changed>2016-10-08 02:47:56</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="66521"><![CDATA[ant]]></keyword>          <keyword tid="66511"><![CDATA[confined spaces]]></keyword>          <keyword tid="12040"><![CDATA[Daniel Goldman]]></keyword>          <keyword tid="377"><![CDATA[locomotion]]></keyword>          <keyword tid="11811"><![CDATA[Michael Goodisman]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="166937"><![CDATA[School of Physics]]></keyword>          <keyword tid="168894"><![CDATA[search and rescue]]></keyword>          <keyword tid="66531"><![CDATA[underground]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="208861">  <title><![CDATA[Sea Turtles and FlipperBot Show How to Walk on Granular Surfaces like Sand]]></title>  <uid>27303</uid>  <body><![CDATA[<p>For sea turtle hatchlings struggling to reach the ocean, success may depend on having flexible wrists that allow them to move without disturbing too much sand. A similar wrist also helps a robot known as “FlipperBot” move through a test bed, demonstrating how animals and bio-inspired robots can together provide new information on the principles governing locomotion on granular surfaces.</p><p>Both the baby turtles and FlipperBot run into trouble under the same conditions: traversing granular media disturbed by previous steps. 
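</p><p>That shared failure mode is easy to caricature in code. The Python sketch below is a deliberately crude illustration, not the researchers’ model: each stroke is assumed to weaken the patch of “sand” it touches, and thrust is assumed to scale with the strength of the material the flipper pushes against, so a single stroke into pre-disturbed ground can trap a walker in its own churned-up track. All numbers are invented for illustration.</p><pre><code>
REACH = 3.0   # flipper lands this far ahead of the body (arbitrary units)
WIDTH = 2.5   # a stroke disturbs material within this distance of its landing
GAIN = 3.0    # progress per stroke on firm, undisturbed ground
WEAK = 0.3    # assumed fraction of thrust left on already-yielded material

def run(strokes=12, pre_disturbed=False):
    """Distance covered when thrust drops on previously disturbed ground."""
    disturbed = [REACH] if pre_disturbed else []  # optionally pre-churn the first landing spot
    pos = 0.0
    for _ in range(strokes):
        landing = pos + REACH
        weakened = any(abs(landing - c) < WIDTH for c in disturbed)
        pos += GAIN * (WEAK if weakened else 1.0)  # weak material yields, cutting thrust
        disturbed.append(landing)
    return pos

print(f"firm sand the whole way:  {run():.1f}")                    # steady progress
print(f"one pre-disturbed patch:  {run(pre_disturbed=True):.1f}")  # progress collapses
</code></pre><p>Each weakened stroke advances the body so little that the next stroke lands back on churned material, locking in the failure – consistent with the trouble on disturbed ground described above.</p><p>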
Information from the robot research helped scientists understand why some of the hatchlings they studied experienced trouble, creating a unique feedback loop from animal to robot – and back to animal.</p><p>The research could help robot designers better understand locomotion on complex surfaces and lead biologists to a clearer picture of how sea turtles and other animals like mudskippers use their flippers. The research could also help explain how animals evolved limbs – including flippers – for walking on land.</p><p>The research was published April 24 in the journal <em>Bioinspiration &amp; Biomimetics</em>. The work was supported by the National Science Foundation, the U.S. Army Research Laboratory’s Micro Autonomous Systems and Technology (MAST) Program, the U.S. Army Research Office, and the Burroughs Wellcome Fund.</p><p>“We are looking at different ways that robots can move about on sand,” said <a href="https://www.physics.gatech.edu/user/daniel-goldman">Daniel Goldman</a>, an associate professor in the <a href="http://www.physics.gatech.edu/">School of Physics</a> at the Georgia Institute of Technology. “We wanted to make a systematic study of what makes flippers useful or effective. We’ve learned that the flow of the materials plays a large role in the strategy that can be used by either animals or robots.”</p><p>The research began in 2010 with a six-week study of hatchling loggerhead sea turtles emerging at night from nests on Jekyll Island, one of Georgia’s coastal islands. The research was done in collaboration with the Georgia Sea Turtle Center.</p><p>Nicole Mazouchova, then a graduate student in the Georgia Tech <a href="http://www.biology.gatech.edu/">School of Biology</a>, studied the baby turtles using a trackway filled with beach sand and housed in a truck parked near the beach. She recorded kinematic and biomechanical data as the turtles moved in darkness toward an LED light that simulated the moon.</p><p>Mazouchova and Goldman studied data from the 25 hatchlings, and were surprised to learn that they managed to maintain their speed regardless of the surface on which they were running.</p><p>“On soft sand, the animals move their limbs in such a way that they don’t create a yielding of the material on which they’re walking,” said Goldman. “That means the material doesn’t flow around the limbs and they don’t slip. The surprising thing to us was that the turtles had comparable performance when they were running on hard ground or soft sand.”</p><p>The key to maintaining performance seemed to be the ability of the hatchlings to control their wrists, allowing them to change how they used their flippers under different sand conditions.</p><p>“On hard ground, their wrists locked in place, and they pivoted about a fixed arm,” Goldman explained. “On soft sand, they put their flippers into the sand and the wrist would bend as they moved forward. We decided to investigate this using a robot model.”</p><p>That led to development of FlipperBot, with assistance from Paul Umbanhowar, a research associate professor at Northwestern University. The robot measures about 19 centimeters in length, weighs about 970 grams, and has two flippers driven by servo-motors. Like the turtles, the robot has flexible wrists that allow variations in its movement. To move through a track bed filled with poppy seeds that simulate sand, the robot lifts its flippers up, drops them into the seeds, then moves the flippers backward to propel itself.</p><p>Mazouchova, now a Ph.D. 
student at Temple University, studied many variations of gait and wrist position and found that the free-moving mechanical wrist also provided an advantage to the robot.</p><p>“In the robot, the free wrist does provide some advantage,” said Goldman. “For the most part, the wrist confers advantage for moving forward without slipping. The wrist flexibility minimizes material yielding, which disturbs less ground. The flexible wrist also allows both the robot and turtles to maintain a high angle of attack for their bodies, which reduces performance-impeding drag from belly friction.”</p><p>The researchers also noted that the robot often failed when limbs encountered material that the same limbs had already disturbed. That led them to re-examine the data collected on the hatchling turtles, some of which had also experienced difficulty walking across the soft sand.</p><p>“When we saw the turtles moving poorly, they appeared to be suffering from the same failure mode that we saw in the robot,” Goldman explained. “When they interacted with materials that had been previously disturbed, they tended to lose performance.”</p><p>Mazouchova and Goldman then worked with Umbanhowar to model the robot’s performance in an effort to predict how the turtle hatchlings should respond to different conditions. The predictions closely matched what was actually observed, closing the loop between robot and animal.</p><p>“The robot study allowed us to test how principles applied to the animals,” Goldman said.</p><p>While the results may not directly improve robot designs, what the researchers learned should contribute to a better understanding of the principles governing movement using flippers. That would be useful to the designers of robots that must swim through water and walk on land.</p><p>“A multi-modal robot might need to use paddles for swimming in water, but it might also need to walk in an effective way on the beach,” Goldman said. “This work can provide fundamental information on what makes flippers good or bad. This information could give robot designers clues to appendage designs and control techniques for robots moving in these environments.”</p><p>The research could ultimately provide clues to how turtles evolved to walk on land with appendages designed for swimming.</p><p>“To understand the mechanics of how the first terrestrial animals moved, you have to understand how their flipper-like limbs interacted with complex, yielding substrates like mud flats,” said Goldman. “We don’t have solid results on the evolutionary questions yet, but this certainly points to a way that we could address these issues.”</p><p><em>This research has been supported by the National Science Foundation under grant CMMI-0825480 and the Physics of Living Systems PoLS program, the U.S. Army Research Laboratory’s (ARL) Micro Autonomous Systems and Technology (MAST) Program under cooperative agreement W911NF-08-2-0004, the U.S. Army Research Office (ARO) and the Burroughs Wellcome Fund Career Award. Any conclusions are those of the authors and do not necessarily represent the official views of the NSF, ARL or ARO.</em></p><p><strong>CITATION</strong>: Nicole Mazouchova, Paul B. Umbanhowar and Daniel I. 
Goldman, “Flipper-driven terrestrial locomotion of a sea turtle-inspired robot,” (Bioinspiration &amp; Biomimetics, 2013).</p><p><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia&nbsp; 30332-0181&nbsp; USA</strong><br /><br /><strong>Media Relations Contact</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).</p><p><strong>Writer</strong>: John Toon<br /><br /></p><p>&nbsp;</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1366735945</created>  <gmt_created>2013-04-23 16:52:25</gmt_created>  <changed>1475896448</changed>  <gmt_changed>2016-10-08 03:14:08</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have learned principles for how both robots and turtles move on granular surfaces.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have learned principles for how both robots and turtles move on granular surfaces.]]></sentence>  <summary><![CDATA[<p>Based on a study of both hatchling sea turtles and "FlipperBot" -- a robot with flippers -- researchers have learned principles for how both robots and turtles move on granular surfaces such as sand.</p>]]></summary>  <dateline>2013-04-23T00:00:00-04:00</dateline>  <iso_dateline>2013-04-23T00:00:00-04:00</iso_dateline>  <gmt_dateline>2013-04-23 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>208811</item>          <item>208801</item>          <item>208791</item>          <item>208821</item>          <item>208781</item>          <item>208831</item>      </media>  <hg_media>          <item>          <nid>208811</nid>          <type>image</type>          <title><![CDATA[FlipperBot testing4]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[flipper-bot136.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/flipper-bot136_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/flipper-bot136_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/flipper-bot136_0.jpg?itok=6SO1foRr]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[FlipperBot testing4]]></image_alt>                    <created>1449180001</created>          <gmt_created>2015-12-03 22:00:01</gmt_created>          <changed>1475894869</changed>          <gmt_changed>2016-10-08 02:47:49</gmt_changed>      </item>          <item>          <nid>208801</nid>          <type>image</type>          <title><![CDATA[FlipperBot testing3]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[flipper-bot80.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/flipper-bot80_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/flipper-bot80_0.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/flipper-bot80_0.jpg?itok=EQzrjxDs]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[FlipperBot testing3]]></image_alt>                    <created>1449180001</created>          <gmt_created>2015-12-03 22:00:01</gmt_created>          <changed>1475894869</changed>          <gmt_changed>2016-10-08 02:47:49</gmt_changed>      </item>          <item>          <nid>208791</nid>          <type>image</type>          <title><![CDATA[FlipperBot testing2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[flipper-bot66.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/flipper-bot66_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/flipper-bot66_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/flipper-bot66_0.jpg?itok=A3HR7G0J]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[FlipperBot testing2]]></image_alt>                    <created>1449180001</created>          <gmt_created>2015-12-03 22:00:01</gmt_created>          <changed>1475894866</changed>          <gmt_changed>2016-10-08 02:47:46</gmt_changed>      </item>          <item>          <nid>208821</nid>          <type>image</type>          <title><![CDATA[FlipperBot testing5]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[flipper-bot218.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/flipper-bot218_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/flipper-bot218_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/flipper-bot218_0.jpg?itok=EBFoCy8b]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[FlipperBot testing5]]></image_alt>                    <created>1449180001</created>          <gmt_created>2015-12-03 22:00:01</gmt_created>          <changed>1475894869</changed>          <gmt_changed>2016-10-08 02:47:49</gmt_changed>      </item>          <item>          <nid>208781</nid>          <type>image</type>          <title><![CDATA[FlipperBot testing]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[flipper-bot48.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/flipper-bot48_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/flipper-bot48_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/flipper-bot48_0.jpg?itok=b6L1GpTE]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[FlipperBot testing]]></image_alt>                    <created>1449180001</created>          <gmt_created>2015-12-03 22:00:01</gmt_created>          <changed>1475894866</changed>          <gmt_changed>2016-10-08 02:47:46</gmt_changed>      </item>          <item>          <nid>208831</nid>          <type>image</type>          <title><![CDATA[Sea turtle]]></title>          <body><![CDATA[]]></body>                      
<image_name><![CDATA[sea-turtle3801.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/sea-turtle3801_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/sea-turtle3801_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/sea-turtle3801_0.jpg?itok=6rKyAgix]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sea turtle]]></image_alt>                    <created>1449180001</created>          <gmt_created>2015-12-03 22:00:01</gmt_created>          <changed>1475894869</changed>          <gmt_changed>2016-10-08 02:47:49</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="59331"><![CDATA[bio-inspired]]></keyword>          <keyword tid="47881"><![CDATA[Dan Goldman]]></keyword>          <keyword tid="64831"><![CDATA[flipper]]></keyword>          <keyword tid="64821"><![CDATA[FlipperBot]]></keyword>          <keyword tid="1357"><![CDATA[granular]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="166937"><![CDATA[School of Physics]]></keyword>          <keyword tid="169569"><![CDATA[sea turtle]]></keyword>      </keywords>  <core_research_areas>          <term tid="39481"><![CDATA[National Security]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="209461">  <title><![CDATA[Piezoelectric “Taxels” Convert Motion to Electronic Signals for Tactile Imaging]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Using bundles of vertical zinc oxide nanowires, researchers have fabricated arrays of piezotronic transistors capable of converting mechanical motion directly into electronic controlling signals. The arrays could help give robots a more adaptive sense of touch, provide better security in handwritten signatures and offer new ways for humans to interact with electronic devices.</p><p>The arrays include more than 8,000 functioning piezotronic transistors, each of which can independently produce an electronic controlling signal when placed under mechanical strain. These touch-sensitive transistors – dubbed “taxels” – could provide significant improvements in resolution, sensitivity and active/adaptive operations compared to existing techniques for tactile sensing. Their sensitivity is comparable to that of a human fingertip.</p><p>The vertically-aligned taxels operate with two-terminal transistors. 
Instead of a third gate terminal used by conventional transistors to control the flow of current passing through them, taxels control the current with a technique called “strain-gating.” Strain-gating based on the piezotronic effect uses the electrical charges generated at the Schottky contact interface by the piezoelectric effect when the nanowires are placed under strain by the application of mechanical force.</p><p>The research was reported April 25 in the journal <em>Science</em> online, at the Science Express website, and will be published in a later version of the print journal. The research has been sponsored by the Defense Advanced Research Projects Agency (DARPA), the National Science Foundation (NSF), the U.S. Air Force (USAF), the U.S. Department of Energy (DOE) and the Knowledge Innovation Program of the Chinese Academy of Sciences.</p><p>“Any mechanical motion, such as the movement of arms or the fingers of a robot, could be translated to control signals,” explained <a href="http://www.mse.gatech.edu/faculty-staff/faculty/zhong-lin-wang">Zhong Lin Wang</a>, a Regents’ professor and Hightower Chair in the <a href="http://www.mse.gatech.edu/">School of Materials Science and Engineering</a> at the Georgia Institute of Technology. “This could make artificial skin smarter and more like the human skin. It would allow the skin to feel activity on the surface.”</p><p>Mimicking the sense of touch electronically has been challenging, and is now done by measuring changes in resistance prompted by mechanical touch. The devices developed by the Georgia Tech researchers rely on a different physical phenomenon – tiny polarization charges formed when piezoelectric materials such as zinc oxide are moved or placed under strain. In the piezotronic transistors, the piezoelectric charges control the flow of current through the wires just as gate voltages do in conventional three-terminal transistors.</p><p>The technique only works in materials that have both piezoelectric and semiconducting properties. These properties are seen in nanowires and thin films created from the wurtzite and zinc blende families of materials, which include zinc oxide, gallium nitride and cadmium sulfide.</p><p>In their laboratory, Wang and his co-authors – postdoctoral fellow Wenzhuo Wu and graduate research assistant Xiaonan Wen – fabricated arrays of 92 by 92 transistors. The researchers used a chemical growth technique at approximately 85 to 90 degrees Celsius, which allowed them to fabricate arrays of strain-gated vertical piezotronic transistors on substrates that are suitable for microelectronics applications. The transistors are made up of bundles of approximately 1,500 individual nanowires, each nanowire between 500 and 600 nanometers in diameter.</p><p>In the array devices, the active strain-gated vertical piezotronic transistors are sandwiched between top and bottom electrodes made of indium tin oxide aligned in orthogonal cross-bar configurations. A thin layer of gold is deposited between the top and bottom surfaces of the zinc oxide nanowires and the top and bottom electrodes, forming Schottky contacts. A thin layer of the polymer Parylene is then coated onto the device as a moisture and corrosion barrier.</p><p>The array density is 234 pixels per inch, the resolution is better than 100 microns, and the sensors are capable of detecting pressure changes as low as 10 kilopascals – resolution comparable to that of the human skin, Wang said.
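</p><p>Because each taxel sits at the crossing of one top and one bottom electrode, the array can be read out like a bitmap: address one electrode pair at a time and record the current, which the strain-gating effect described above modulates wherever pressure is applied. The short Python sketch below simulates such a raster scan. It is an illustration under stated assumptions, not the researchers’ code: the read_taxel() function is a hypothetical stand-in for instrument I/O, and the saturation-current and barrier-shift constants are invented rather than the paper’s measured values.</p><pre><code>
import math

N = 92             # the reported arrays are 92 x 92 taxels
V_T = 0.0259       # thermal voltage kT/q at room temperature (V)
I_SAT = 1e-9       # assumed per-taxel saturation current (A)
EV_PER_KPA = 2e-4  # assumed Schottky barrier shift per kPa of pressure (eV)

def read_taxel(row, col, pressure):
    """Stand-in for instrument I/O: current through one strain-gated taxel.

    Mimics the mechanism described above: pressure-induced polarization
    charge shifts the Schottky barrier height, which modulates the
    thermionic-emission current exponentially.
    """
    shift = EV_PER_KPA * pressure.get((row, col), 0.0)
    return I_SAT * math.exp(shift / V_T)

def scan(pressure):
    """Raster scan the crossbar: select one top electrode (row) at a time
    and read every bottom electrode (column) to form a tactile image."""
    return [[read_taxel(r, c, pressure) for c in range(N)] for r in range(N)]

# A 10-kilopascal touch (the reported detection floor) on a single taxel:
image = scan({(46, 46): 10.0})
print(f"untouched taxel: {image[0][0]:.3e} A")
print(f"touched taxel:   {image[46][46]:.3e} A")
</code></pre><p>Even the small assumed barrier shift produces a measurable change in current (here, several percent), which is why an exponential transduction mechanism lends itself to high sensitivity.</p><p>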
The Georgia Tech researchers fabricated several hundred of the arrays during a research project that lasted nearly three years.</p><p>The arrays are transparent, which could allow them to be used on touch-pads or other devices for fingerprinting. They are also flexible and foldable, expanding the range of potential uses.</p><p>Among the potential applications:</p><ul><li>Multidimensional signature recording, in which not only the graphics of the signature would be included, but also the pressure exerted at each location during the creation of the signature, and the speed at which the signature is created.</li><li>Shape-adaptive sensing in which a change in the shape of the device is measured. This would be useful in applications such as artificial/prosthetic skin, smart biomedical treatments and intelligent robotics in which the arrays would sense what was in contact with them.</li><li>Active tactile sensing in which the physiological operations of mechanoreceptors of biological entities such as hair follicles or the hairs in the cochlea are emulated.</li></ul><p>Because the arrays would be used in real-world applications, the researchers evaluated their durability. The devices still operated after 24 hours immersed in both saline and distilled water.</p><p>Future work will include producing the taxel arrays from single nanowires instead of bundles, and integrating the arrays onto CMOS silicon devices. Using single wires could improve the sensitivity of the arrays by at least three orders of magnitude, Wang said.</p><p>“This is a fundamentally new technology that allows us to control electronic devices directly using mechanical agitation,” Wang added. “This could be used in a broad range of areas, including robotics, MEMS, human-computer interfaces and other areas that involve mechanical deformation.”</p><p><em>This research was supported by the Defense Advanced Research Projects Agency (DARPA), the National Science Foundation (NSF) under grant CMMI-0946418, the U.S. Air Force (USAF) under grant FA2386-10-1-4070, the U.S. Department of Energy (DOE) Office of Basic Energy Sciences under award DE-FG02-07ER46394 and the Knowledge Innovation Program of the Chinese Academy of Sciences under grant KJCX2-YW-M13. 
The content is solely the responsibility of the authors and does not necessarily represent the official views of DARPA, the NSF, the USAF or the DOE.</em></p><p><strong>CITATION</strong>: Wenzhuo Wu, Xiaonan Wen, Zhong Lin Wang, “Taxel-addressable matrix of vertical-nanowire piezotronic transistors for active/adaptive tactile imaging,” (Science 2013).</p><p><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia&nbsp; 30332-0181</strong></p><p><strong>Media Relations Contact</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>)</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1366911829</created>  <gmt_created>2013-04-25 17:43:49</gmt_created>  <changed>1475896448</changed>  <gmt_changed>2016-10-08 03:14:08</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have fabricated arrays of piezotronic transistors capable of converting mechanical motion directly into electronic controlling signals.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have fabricated arrays of piezotronic transistors capable of converting mechanical motion directly into electronic controlling signals.]]></sentence>  <summary><![CDATA[<p>Using bundles of vertical zinc oxide nanowires, researchers have fabricated arrays of piezotronic transistors capable of converting mechanical motion directly into electronic controlling signals. The arrays could help give robots a more adaptive sense of touch, provide better security in handwritten signatures and offer new ways for humans to interact with electronic devices.</p>]]></summary>  <dateline>2013-04-25T00:00:00-04:00</dateline>  <iso_dateline>2013-04-25T00:00:00-04:00</iso_dateline>  <gmt_dateline>2013-04-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-8986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>209431</item>          <item>209441</item>          <item>209451</item>      </media>  <hg_media>          <item>          <nid>209431</nid>          <type>image</type>          <title><![CDATA[Piezotronic transistor array]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[piezotronic-arrays31.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/piezotronic-arrays31_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/piezotronic-arrays31_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/piezotronic-arrays31_0.jpg?itok=VNcgI1xY]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Piezotronic transistor array]]></image_alt>                    <created>1449180001</created>          <gmt_created>2015-12-03 22:00:01</gmt_created>          <changed>1475894869</changed>          <gmt_changed>2016-10-08 02:47:49</gmt_changed>      </item>          <item>          <nid>209441</nid>          <type>image</type>          <title><![CDATA[Piezotronic transistor 
array2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[peizotronic-arrays148.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/peizotronic-arrays148_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/peizotronic-arrays148_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/peizotronic-arrays148_0.jpg?itok=0RY7UB_a]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Piezotronic transistor array2]]></image_alt>                    <created>1449180001</created>          <gmt_created>2015-12-03 22:00:01</gmt_created>          <changed>1475894869</changed>          <gmt_changed>2016-10-08 02:47:49</gmt_changed>      </item>          <item>          <nid>209451</nid>          <type>image</type>          <title><![CDATA[Piezotronic transistor array]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[figure2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/figure2_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/figure2_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/figure2_0.jpg?itok=40-ofDB0]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Piezotronic transistor array]]></image_alt>                    <created>1449180001</created>          <gmt_created>2015-12-03 22:00:01</gmt_created>          <changed>1475894869</changed>          <gmt_changed>2016-10-08 02:47:49</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="149"><![CDATA[Nanotechnology and Nanoscience]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="149"><![CDATA[Nanotechnology and Nanoscience]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="7576"><![CDATA[Piezotronic]]></keyword>          <keyword tid="65011"><![CDATA[piezotronic array]]></keyword>          <keyword tid="167535"><![CDATA[School of Materials Science and Engineering]]></keyword>          <keyword tid="64991"><![CDATA[taxel]]></keyword>          <keyword tid="13751"><![CDATA[Zhong Lin Wang]]></keyword>      </keywords>  <core_research_areas>          <term tid="39471"><![CDATA[Materials]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="210251">  <title><![CDATA[Robots Able to Reach through Clutter with Whole-Arm Tactile Sensing]]></title>  <uid>27462</uid>  <body><![CDATA[<p>Whether reaching for a book out of a cluttered cabinet or pruning a bush in the backyard, a person’s arm frequently makes contact with objects during everyday tasks. Animals do it too, when foraging for food, for example.</p><p>Much in the same way, robots are now able to intelligently maneuver within clutter, gently making contact with objects while accomplishing a task. 
This new control method has wide applications, ranging from robots for search-and-rescue operations to assistive robotics for people with disabilities.</p><p>“Up until now, the dominant strategies for robot manipulation have discouraged contact between the robot’s arm and the world,” said Charlie Kemp, lead researcher and associate professor in the <a href="http://www.bme.gatech.edu/">Coulter Department of Biomedical Engineering at Georgia Tech and Emory University</a>. “Instead of avoiding contact, our approach enables the arm to make contact with objects, people and the rest of the robot while keeping forces low.”</p><p>Kemp, director of Georgia Tech’s Healthcare Robotics Lab, along with his graduate students and researchers at <a href="http://mekabot.com/">Meka Robotics</a>, has&nbsp;developed a control method that works in tandem with compliant robotic joints and whole-arm tactile sensing. This technology keeps the robot’s arm flexible and gives the robot a sense of touch across its entire arm.</p><p>With their control method, Kemp’s robots have performed numerous tasks, such as reaching through dense artificial foliage and into a cinder block, settings representative of the environments that search-and-rescue robots can encounter.</p><p>A publication describing the research, <a href="http://intl-ijr.sagepub.com/content/32/4/458">“Reaching in Clutter with Whole-arm Tactile Sensing</a>,” appears in this month’s edition of the <em>International Journal of Robotics Research</em>.&nbsp;</p><p>Kemp's lab also has promising results that could impact the future of assistive robotics. They have developed tactile sensors made out of stretchable fabric that covers the entire arm of a robot. In a preliminary trial with the new control method and sensors, Henry Evans, a person with quadriplegia, used the robot to perform tasks for himself. He was able to pull a blanket over himself and grab a cloth to wipe his face, all while he was in bed at his home.</p><p>This trial was conducted as part of the Robots for Humanity project with Willow Garage. In order to ensure safety, researchers from Kemp’s lab closely monitored the activities. This research has been accepted and will be presented at the <a href="http://depts.washington.edu/uwconf/icorr2013/">International Conference on Rehabilitation Robotics</a> in June.&nbsp; </p><p>“I think it’s a good safety feature because it hardly presses against me even when I tell it to,” Evans said after the trial. “It really feels safe to be close to the robot.”</p><p>Evans was also impressed by how the robot’s arm “just wriggles around obstacles.”</p><p>Kemp’s research team has also released the designs and code for the sensors and controller as <a href="http://www.hsi.gatech.edu/hrl/project_open_source_whole_arm_tactile_sensing.shtml">open source hardware and software</a> so that researchers and hobbyists can build on the work.</p><p>The research is part of an ongoing effort to create a new foundation for robotics, where contact between the robot’s arm and the world is encouraged.</p><p>“Our belief is that this approach is the way of the future for robots,” said Kemp, who is also a member of Georgia Tech’s <a href="http://robotics.gatech.edu/">Center for Robotics and Intelligent Machines</a>. “It is going to allow robots to better operate in our homes, our workplaces and other complex environments.”</p>
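<p>The published controller, which works in tandem with compliant joints and whole-arm tactile sensing, is considerably richer than any toy example, but the keep-forces-low principle can be sketched in a few lines: scale back commanded joint velocities whenever any taxel on the arm reports a contact force above a limit. The 5-newton threshold and the interface below are illustrative assumptions, not details from the paper.</p><pre>
# Illustrative sketch only: the keep-forces-low principle reduced to its
# simplest form. Joint velocity commands are scaled down in proportion to
# the largest contact force reported by the arm's taxels.

FORCE_LIMIT_N = 5.0  # assumed per-taxel contact force limit

def limit_velocities(joint_vels, taxel_forces):
    """Scale joint velocities so motion backs off as contact grows firm."""
    worst = max(taxel_forces, default=0.0)
    if worst <= FORCE_LIMIT_N:
        return list(joint_vels)  # no taxel over the limit: command unchanged
    scale = FORCE_LIMIT_N / worst  # the larger the violation, the slower
    return [v * scale for v in joint_vels]

# Example: a three-joint arm brushing an obstacle at 8 N slows to 5/8 speed.
print(limit_velocities([0.20, -0.10, 0.05], [1.2, 8.0, 0.3]))
</pre>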
<p><em>This research is funded by the DARPA Maximum Mobility and Manipulation (M3) Contract W911NF-11-1-603. The assistive technology research is also funded in part by NSF CAREER award IIS-1150157, NSF grant CNS-0958545, an NSF GRFP and Willow Garage.</em></p><p><strong>CITATIONS</strong>: Advait Jain, Marc D. Killpack, Aaron Edsinger, and Charles C. Kemp. Reaching in Clutter with Whole-arm Tactile Sensing. The International Journal of Robotics Research, April 2013, 32: 458-482, doi:10.1177/0278364912471865</p><p>Phillip M. Grice, Marc D. Killpack, Advait Jain, Sarvagya Vaish, Jeffrey Hawke, and Charles C. Kemp. Whole-arm Tactile Sensing for Beneficial and Acceptable Contact During Robotic Assistance. Accepted to the 13th International Conference on Rehabilitation Robotics (ICORR), June 2013.&nbsp;</p>]]></body>  <author>Liz Klipp</author>  <status>1</status>  <created>1367250499</created>  <gmt_created>2013-04-29 15:48:19</gmt_created>  <changed>1475896448</changed>  <gmt_changed>2016-10-08 03:14:08</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Robots are now able to intelligently maneuver within clutter, gently making contact with objects while accomplishing a task, thanks to technology developed by Dr. Charlie Kemp and the Healthcare Robotics Lab.]]></teaser>  <type>news</type>  <sentence><![CDATA[Robots are now able to intelligently maneuver within clutter, gently making contact with objects while accomplishing a task, thanks to technology developed by Dr. Charlie Kemp and the Healthcare Robotics Lab.]]></sentence>  <summary><![CDATA[<p>Robots are now able to intelligently maneuver within clutter, gently making contact with objects while accomplishing a task, thanks to technology developed by Dr. Charlie Kemp and the Healthcare Robotics Lab.</p><p>&nbsp;</p><p>&nbsp;</p>]]></summary>  <dateline>2013-04-29T00:00:00-04:00</dateline>  <iso_dateline>2013-04-29T00:00:00-04:00</iso_dateline>  <gmt_dateline>2013-04-29 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[klipp@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Georgia Tech Media Relations</strong><br />Laura Diamond<br /><a href="mailto:laura.diamond@comm.gatech.edu">laura.diamond@comm.gatech.edu</a><br />404-894-6016<br />Jason Maderer<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-660-2926</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>210121</item>          <item>210131</item>          <item>210141</item>          <item>210151</item>      </media>  <hg_media>          <item>          <nid>210121</nid>          <type>image</type>          <title><![CDATA[Robots Reaching Through Clutter]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[kemp_robot3.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/kemp_robot3_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/kemp_robot3_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/kemp_robot3_0.jpg?itok=-gD53m6I]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Robots Reaching Through Clutter]]></image_alt>                    <created>1449180018</created>          <gmt_created>2015-12-03 22:00:18</gmt_created>          <changed>1475894869</changed>          <gmt_changed>2016-10-08 02:47:49</gmt_changed>      </item>
    <item>          <nid>210131</nid>          <type>image</type>          <title><![CDATA[Robots Reaching Through Clutter - 1]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[kemp_robot4.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/kemp_robot4_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/kemp_robot4_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/kemp_robot4_0.jpg?itok=0posZeHL]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Robots Reaching Through Clutter - 1]]></image_alt>                    <created>1449180018</created>          <gmt_created>2015-12-03 22:00:18</gmt_created>          <changed>1475894869</changed>          <gmt_changed>2016-10-08 02:47:49</gmt_changed>      </item>          <item>          <nid>210141</nid>          <type>image</type>          <title><![CDATA[Robots Reaching Through Clutter - 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[kemp_robot2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/kemp_robot2_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/kemp_robot2_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/kemp_robot2_0.jpg?itok=nsxuCstX]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Robots Reaching Through Clutter - 2]]></image_alt>                    <created>1449180018</created>          <gmt_created>2015-12-03 22:00:18</gmt_created>          <changed>1475894869</changed>          <gmt_changed>2016-10-08 02:47:49</gmt_changed>      </item>          <item>          <nid>210151</nid>          <type>image</type>          <title><![CDATA[Robots Reaching Through Clutter - 3]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[kemp_robot1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/kemp_robot1_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/kemp_robot1_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/kemp_robot1_0.jpg?itok=bsAC3j8s]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Robots Reaching Through Clutter - 3]]></image_alt>                    <created>1449180018</created>          <gmt_created>2015-12-03 22:00:18</gmt_created>          <changed>1475894869</changed>          <gmt_changed>2016-10-08 02:47:49</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://healthcare-robotics.com/]]></url>        <title><![CDATA[Healthcare Robotics Lab]]></title>      </link>          <link>        <url><![CDATA[http://charliekemp.com/]]></url>        <title><![CDATA[Website of Dr. 
Charlie Kemp]]></title>      </link>          <link>        <url><![CDATA[http://www.youtube.com/user/HealthcareRobotics]]></url>        <title><![CDATA[Additional Videos on YouTube]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="2157"><![CDATA[Charlie Kemp]]></keyword>          <keyword tid="594"><![CDATA[college of engineering]]></keyword>          <keyword tid="36141"><![CDATA[Coulter Department of Biomedical Engineering at Georgia Tech and Emory University]]></keyword>          <keyword tid="12319"><![CDATA[Healthcare Robotics Lab]]></keyword>          <keyword tid="65291"><![CDATA[Henry Evans]]></keyword>          <keyword tid="65331"><![CDATA[Meka Robotics]]></keyword>          <keyword tid="65321"><![CDATA[robots reaching in clutter]]></keyword>          <keyword tid="65251"><![CDATA[tactile sensing]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="203421">  <title><![CDATA[Engineering Style of Dance for Robots and People]]></title>  <uid>27462</uid>  <body><![CDATA[<p>A dancing robot is nothing new. A quick search on YouTube will yield videos of robots dancing to Michael Jackson’s Thriller, Gangnam Style, the Macarena and more.</p><p>But at the Georgia Institute of Technology, researchers are taking robots and dance to a higher level.</p><p>Instead of programming a robot to copy an existing dance such as those in the online videos, Amy LaViers, a Ph.D. candidate in electrical and computer engineering, is defining the various styles of human movement and creating algorithms to reproduce them on a humanoid robot.</p><p>What’s more, LaViers has produced a robotic dance performance based on her research, called <a href="http://www.youtube.com/watch?v=_6LqL3S4lDk&amp;feature=youtu.be">“Automaton</a>,” in which a Nao robot and professional dancers explore the notion of “automatic style.”</p><p>The show debuts at 8 p.m. on April 6 in the lower atrium of the G. Wayne Clough Undergraduate Learning Commons. A second showing will be held at 5 p.m. on April 13, also in Clough Commons’ lower atrium, as part of the 2013 TechArts Festival.</p><p>“We are working with such a highly articulated robot that can do so many cool things, yet there are many ways he is limited too,” <a href="http://www.prism.gatech.edu/~alaviers3/">LaViers</a> said. “I do play with that idea of: What can the robot do, and what can the people do? 
Where are the differences and where are the similarities?”</p><p>LaViers' work exemplifies the intersection of engineering and dance, and could be applied to make robots more useful in everyday life, said <a href="http://users.ece.gatech.edu/~magnus/">Magnus Egerstedt</a>, professor of electrical and computer engineering and LaViers' faculty advisor.</p><p>“When robots are transitioning out of the manufacturing floor and into homes, becoming co-workers instead of tools, they need to understand to a certain degree what it means to be human,” Egerstedt said. “They need to move in a style that makes sense to people, so that’s why we started thinking about how you quantify style.”</p><p>A dancer for most of her life, LaViers thought to combine dance with engineering during her undergraduate senior project at Princeton University. She saw a natural overlap between choreography, an arrangement of steps, and robotic algorithms, an engineering tool that plans robotic movement.&nbsp;</p><p>Robotic movements tend to be stiff and unnatural, but LaViers believes robots should have a range of quality of movement. To achieve this, she is developing quantitative tools that explain what differentiates movements using dance theorist Rudolf Laban’s notion of quality.</p><p>LaViers also examines the basic poses and movements that define a style to quantify differences between genres of movement. What is the difference, for instance, between doing a disco dance and performing ballet? Using a computer program she developed for her thesis, she encodes that information so it can be reproduced on robots. &nbsp;</p><p>“Understanding how humans move is key to developing better techniques and applications to make robots move in a way that humans can relate to. ‘Style’ is part of this – particularly in the arts,” LaViers said.</p><p>LaViers’ research fits into the overall objective of Egerstedt’s lab, the <a href="http://gritslab.gatech.edu/home/">Georgia Robotics and Intelligent Systems (GRITS) </a>lab.&nbsp; The lab aims to produce robotic algorithms that endow robots of all kinds with desirable behavior.</p><p>Having algorithms that mimic human movement in a high-level way could advance the use of robots in real-world settings.&nbsp; For example, it may enable caregiving robots to have more comforting movement that is less intimidating to patients. Style-based measurements may also provide better feedback to patients recovering from physical disabilities or injuries.</p><p>In the “Automaton” piece, LaViers presents choreography generated from the framework in her thesis that is performed by human dancers and automated on the humanoid robot. After the performance, audience members will have a chance to give feedback on their impressions of the movement.&nbsp;</p><p>“I hope the audience thinks of movement and programmed objects a little bit differently after seeing the show,” LaViers said. “I also hope it brings up ideas of technology in our lives today and in the future, when robots may be more commonplace.”&nbsp;</p>]]></body>  <author>Liz Klipp</author>  <status>1</status>  <created>1364824482</created>  <gmt_created>2013-04-01 13:54:42</gmt_created>  <changed>1475896439</changed>  <gmt_changed>2016-10-08 03:13:59</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Amy LaViers, a Ph.D. 
candidate in electrical and computer engineering, is defining the various styles of human movement and creating algorithms to reproduce them on a humanoid robot.]]></teaser>  <type>news</type>  <sentence><![CDATA[Amy LaViers, a Ph.D. candidate in electrical and computer engineering, is defining the various styles of human movement and creating algorithms to reproduce them on a humanoid robot.]]></sentence>  <summary><![CDATA[<p>Instead of programming a robot to copy an existing dance such as those in the online videos, Amy LaViers, a Ph.D. candidate in electrical and computer engineering, is defining the various styles of human movement and creating algorithms to reproduce them on a humanoid robot.&nbsp;</p>]]></summary>  <dateline>2013-04-01T00:00:00-04:00</dateline>  <iso_dateline>2013-04-01T00:00:00-04:00</iso_dateline>  <gmt_dateline>2013-04-01 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[klipp@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Georgia Tech Media Relations</strong><br />Laura Diamond<br /><a href="mailto:laura.diamond@comm.gatech.edu">laura.diamond@comm.gatech.edu</a><br />404-894-6016<br />Jason Maderer<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-660-2926</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>203441</item>          <item>203451</item>          <item>203481</item>          <item>203511</item>          <item>203491</item>      </media>  <hg_media>          <item>          <nid>203441</nid>          <type>image</type>          <title><![CDATA[Amy LaViers]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[automaton3.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/automaton3_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/automaton3_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/automaton3_0.jpg?itok=nnnSHXRM]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Amy LaViers]]></image_alt>                    <created>1449179952</created>          <gmt_created>2015-12-03 21:59:12</gmt_created>          <changed>1475894859</changed>          <gmt_changed>2016-10-08 02:47:39</gmt_changed>      </item>          <item>          <nid>203451</nid>          <type>image</type>          <title><![CDATA[Automaton - rehearsal]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[automaton4.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/automaton4_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/automaton4_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/automaton4_0.jpg?itok=T8oGFkmv]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Automaton - rehearsal]]></image_alt>                    <created>1449179952</created>          <gmt_created>2015-12-03 21:59:12</gmt_created>          <changed>1475894859</changed>          <gmt_changed>2016-10-08 02:47:39</gmt_changed>      </item>          <item>          <nid>203481</nid>          <type>image</type>   
       <title><![CDATA[Automaton - rehearsal 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[automaton5.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/automaton5_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/automaton5_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/automaton5_0.jpg?itok=iNHpotn_]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Automaton - rehearsal 2]]></image_alt>                    <created>1449179952</created>          <gmt_created>2015-12-03 21:59:12</gmt_created>          <changed>1475894859</changed>          <gmt_changed>2016-10-08 02:47:39</gmt_changed>      </item>          <item>          <nid>203511</nid>          <type>image</type>          <title><![CDATA[Automaton - rehearsal 3]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[13c10317-p1-009.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/13c10317-p1-009_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/13c10317-p1-009_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/13c10317-p1-009_0.jpg?itok=nmSVvydC]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Automaton - rehearsal 3]]></image_alt>                    <created>1449179952</created>          <gmt_created>2015-12-03 21:59:12</gmt_created>          <changed>1475894859</changed>          <gmt_changed>2016-10-08 02:47:39</gmt_changed>      </item>          <item>          <nid>203491</nid>          <type>image</type>          <title><![CDATA[Aldebaran Robotics' Nao]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[13c10317-p1-002.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/13c10317-p1-002_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/13c10317-p1-002_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/13c10317-p1-002_0.jpg?itok=V66Zh-YB]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Aldebaran Robotics' Nao]]></image_alt>                    <created>1449179952</created>          <gmt_created>2015-12-03 21:59:12</gmt_created>          <changed>1475894859</changed>          <gmt_changed>2016-10-08 02:47:39</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.flickr.com/photos/georgiatech/sets/72157633139722835/]]></url>        <title><![CDATA[Automaton - flickr gallery]]></title>      </link>          <link>        <url><![CDATA[http://www.arts.gatech.edu/connect/news/techarts-festival-2013-schedule]]></url>        <title><![CDATA[TechArts Festival 2013]]></title>      </link>          <link>        <url><![CDATA[https://www.facebook.com/events/437936852954665/]]></url>        <title><![CDATA[Automaton Facebook page]]></title>      </link>          <link>        <url><![CDATA[http://clough.gatech.edu/]]></url>        <title><![CDATA[Clough Commons]]></title>      </link>          <link> 
       <url><![CDATA[http://www.coe.gatech.edu/]]></url>        <title><![CDATA[College of Engineering]]></title>      </link>          <link>        <url><![CDATA[http://www.youtube.com/watch?feature=youtu.be&amp;v=_6LqL3S4lDk]]></url>        <title><![CDATA[Automaton - Video]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="63011"><![CDATA[Amy LaViers]]></keyword>          <keyword tid="594"><![CDATA[college of engineering]]></keyword>          <keyword tid="4251"><![CDATA[dance]]></keyword>          <keyword tid="59441"><![CDATA[GRITS Lab]]></keyword>          <keyword tid="11528"><![CDATA[Magnus Egerstedt]]></keyword>          <keyword tid="63021"><![CDATA[Nao]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="166855"><![CDATA[School of Electrical and Computer Engineering]]></keyword>          <keyword tid="167979"><![CDATA[Style]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="201371">  <title><![CDATA["Terradynamics" Could Help Designers Predict How Legged Robots Will Move on Granular Media]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Using a combination of theory and experiment, researchers have developed a new approach for understanding and predicting how small legged robots – and potentially also animals – move on and interact with complex granular materials such as sand.</p><p>The research could help create and advance the field of “terradynamics” – a name the researchers have given to the science of legged animals and vehicles moving on granular and other complex surfaces. Providing equations to describe and predict this type of movement – comparable to what has been done to predict the motion of animals and vehicles through the air or water – could allow designers to optimize legged robots operating in complex environments for search-and-rescue missions, space exploration or other tasks.</p><p>“We now have the tools to understand the movement of legged vehicles over loose sand in the same way that scientists and engineers have had tools to understand aerodynamics and hydrodynamics,” said Daniel Goldman, a professor in the School of Physics at the Georgia Institute of Technology. “We are at the beginning of tools that will allow us to do the design and simulation of legged robots to not only predict their performance, but also to optimize designs and allow us to create new concepts.”</p><p>The research behind “terradynamics” was described in the March 22 issue of the journal <em>Science</em>. 
The research was supported by the National Science Foundation Physics of Living Systems program, the Army Research Office, the Army Research Laboratory, the Burroughs Wellcome Fund and the Miller Institute for Basic Research in Science of the University of California, Berkeley.</p><p>Robots such as the Mars Rover have depended on wheels for moving in complex environments such as sand and rocky terrain. Robots envisioned for autonomous search-and-rescue missions also rely on wheels, but as the vehicles become smaller, designers may need to examine alternative means of locomotion, Goldman said.</p><p>Existing techniques for describing locomotion on surfaces are complex and can’t take into account the intrusion of legs into a granular surface. To improve and simplify the understanding, Goldman and collaborators Chen Li and Tingnan Zhang examined the motion of a small legged robot as it moved on granular media. Using a 3-D printer, they created legs in a variety of shapes and used them to study how different configurations affected the robot’s speed along a track bed. They then measured granular force laws from experiments to predict forces on legs, and created a simulation to predict the robot’s motion.</p><p>The key insight, according to Goldman, was that the forces applied to independent elements of the robot legs could be simply summed together to provide a reasonably accurate measure of the net force on a robot moving through granular media. That technique, known as linear superposition, worked surprisingly well for legs moving in diverse kinds of granular media.</p>
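<p>As a rough sketch of that bookkeeping only, and not of the empirically measured force laws themselves, the example below splits a leg into small plate elements, assigns each an invented depth- and orientation-dependent stress, and sums the element forces. The stress constant and the element geometry are placeholders chosen for illustration.</p><pre>
# Illustrative sketch only: the linear-superposition bookkeeping of
# terradynamics. A leg is split into small plate elements; each element's
# stress comes from a granular force law (here an invented placeholder,
# not the measured laws); the net force is the simple sum over elements.
import math

K_STRESS = 2.0e5  # assumed stiffness-like constant, N per cubic meter

def element_stress(depth_m, attack_angle_rad):
    """Placeholder stress law: zero above the surface, grows with depth,
    larger when the element faces the flow (sin of the attack angle)."""
    if depth_m <= 0.0:
        return 0.0
    return K_STRESS * depth_m * abs(math.sin(attack_angle_rad))

def net_force(elements):
    """Linear superposition: sum stress * area over all leg elements."""
    return sum(element_stress(d, a) * area for d, a, area in elements)

# A C-shaped leg approximated by three elements (depth, angle, area):
leg = [(0.010, math.radians(80), 1e-4),
       (0.020, math.radians(60), 1e-4),
       (0.005, math.radians(30), 1e-4)]
print(f"net granular force on the leg: {net_force(leg):.2f} N")
</pre><p>The appeal of the approach is exactly this additivity: once the per-element stress law has been measured for a given medium, any leg shape can be evaluated by summation rather than by simulating millions of individual grains.</p>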
“We discovered that the force laws affecting this motion are generic in a diversity of granular media, including poppy seeds, glass beads and natural sand,” said Li, who is now a Miller postdoctoral fellow at the University of California at Berkeley. “Based on this generalization, we developed a practical procedure for non-specialists to easily apply terradynamics in their own studies using just a single force measurement made with simple equipment they can buy off the shelf, such as a penetrometer.”</p><p>For more complicated granular materials, although the terradynamics approach still worked well, an additional factor – perhaps the degree to which particles resemble a sphere – may be required to describe the forces with equivalent accuracy.</p><p>Beyond understanding the basic physics principles involved, the researchers also learned that convex legs made in the shape of the letter “C” worked better than other variations.</p><p>“As long as the legs are convex, the robot generates large lift and small body drag, and thus can run fast,” Goldman said. “When the limb shape was changed to flat or concave, the performance dropped. This information is important for optimizing the energy efficiency of legged robots.”</p><p>Aerodynamic designers have long used the Navier-Stokes equations to describe the movement of vehicles through the air. Similarly, those equations allow hydrodynamics designers to know how submarines and other vehicles move through water.</p><p>“Terradynamics” could provide designers with an efficient technique for understanding motion through media that flows around legs of terrestrial animals and robots.</p><p>“Using terradynamics, our simulation is not only as accurate as the established discrete element method (DEM) simulation, but also much more computationally efficient,” said Zhang, who is a graduate student in Goldman’s laboratory. “For example, to simulate one second of robot locomotion on a granular bed of five million poppy seeds takes the DEM simulation a month using computers in our lab. Using terradynamics, the simulation takes only 10 seconds.”</p><p>The six-legged experimental robot was just 13 centimeters long and weighed about 150 grams. Robots of that size could be used in the future for search-and-rescue missions, or to scout out unknown environments such as the surface of Mars. They could also provide biologists with a better understanding of how animals such as sand lizards run and kangaroo rats hop on granular media.</p><p>“From a biological perspective, this opens up a new area,” said Goldman, who has studied a variety of animals to learn how their locomotion may assist robot designers. “These are the kinds of tools that can help understand why lizards have feet and bodies of certain shapes. The problems associated with movement in sandy environments are as important to many animals as they are to robots.”</p><p>Beyond optimizing the design of future small robots, the work could also lead to a better understanding of the complex environment through which they will have to move.</p><p>“We think that the kind of approach we are taking allows us to ask questions about the physics of granular materials that no one has asked before,” Goldman added. “This may reveal new features of granular materials to help us create more comprehensive models and theories of motion. We are now beginning to get the rules of how vehicles move through these materials.”</p><p><em>This research was supported by the Burroughs Wellcome Fund, the Army Research Laboratory Micro Autonomous Systems and Technology Collaborative Technology Alliance (CTA W911NF-08-2-004), the Army Research Office (W911NF-11-1-0514), the National Science Foundation (NSF) Physics of Living Systems program (PHY-1150760) and the Miller Institute for Basic Research in Science at the University of California, Berkeley. Any conclusions are those of the principal investigators, and do not necessarily represent the official position of the Army Research Laboratory, the Army Research Office or the NSF.</em></p><p><strong>CITATION</strong>: Chen Li, Tingnan Zhang, Daniel I. Goldman.
“A Terradynamics of Legged Locomotion on Granular Media,” Science (2013): <a href="http://dx.doi.org/10.1126/science.1229163" title="http://dx.doi.org/10.1126/science.1229163">http://dx.doi.org/10.1126/science.1229163</a>.<br /><br /><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia&nbsp; 30332-0181</strong><br /><br /><strong>Media Relations Contact</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1363872017</created>  <gmt_created>2013-03-21 13:20:17</gmt_created>  <changed>1475896435</changed>  <gmt_changed>2016-10-08 03:13:55</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have developed a new technique for predicting how robots will move on granular media.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have developed a new technique for predicting how robots will move on granular media.]]></sentence>  <summary><![CDATA[<p>Using a combination of theory and experiment, researchers have developed a new approach for understanding and predicting how small legged robots – and potentially also animals – move on and interact with complex granular materials such as sand.</p>]]></summary>  <dateline>2013-03-21T00:00:00-04:00</dateline>  <iso_dateline>2013-03-21T00:00:00-04:00</iso_dateline>  <gmt_dateline>2013-03-21 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>201321</item>          <item>201311</item>          <item>201331</item>          <item>201341</item>      </media>  <hg_media>          <item>          <nid>201321</nid>          <type>image</type>          <title><![CDATA[Terradynamics robots running]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[terradynamics111.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/terradynamics111_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/terradynamics111_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/terradynamics111_0.jpg?itok=J8aoEKq1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Terradynamics robots running]]></image_alt>                    <created>1449179943</created>          <gmt_created>2015-12-03 21:59:03</gmt_created>          <changed>1475894856</changed>          <gmt_changed>2016-10-08 02:47:36</gmt_changed>      </item>          <item>          <nid>201311</nid>          <type>image</type>          <title><![CDATA[Terradynamics experimental data]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[terradynamics82.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/terradynamics82_0.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/terradynamics82_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/terradynamics82_0.jpg?itok=l5yW0bW8]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Terradynamics experimental data]]></image_alt>                    <created>1449179943</created>          <gmt_created>2015-12-03 21:59:03</gmt_created>          <changed>1475894856</changed>          <gmt_changed>2016-10-08 02:47:36</gmt_changed>      </item>          <item>          <nid>201331</nid>          <type>image</type>          <title><![CDATA[terradynamics force testing]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[terradynamics247.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/terradynamics247_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/terradynamics247_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/terradynamics247_0.jpg?itok=1gPWNEwa]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[terradynamics force testing]]></image_alt>                    <created>1449179943</created>          <gmt_created>2015-12-03 21:59:03</gmt_created>          <changed>1475894856</changed>          <gmt_changed>2016-10-08 02:47:36</gmt_changed>      </item>          <item>          <nid>201341</nid>          <type>image</type>          <title><![CDATA[Terradyamics simulated robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[robotsimulation_mars03.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/robotsimulation_mars03_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/robotsimulation_mars03_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/robotsimulation_mars03_0.jpg?itok=65n5Yv5k]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Terradyamics simulated robot]]></image_alt>                    <created>1449179943</created>          <gmt_created>2015-12-03 21:59:03</gmt_created>          <changed>1475894856</changed>          <gmt_changed>2016-10-08 02:47:36</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="12040"><![CDATA[Daniel Goldman]]></keyword>          <keyword tid="62231"><![CDATA[granular media]]></keyword>          <keyword tid="62251"><![CDATA[legged robot]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="169242"><![CDATA[sand]]></keyword>          <keyword tid="166937"><![CDATA[School of Physics]]></keyword>          <keyword 
tid="62221"><![CDATA[terradynamics]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="200741">  <title><![CDATA[Robots to Spur Economy, Improve Quality of Life, Keep Responders Safe]]></title>  <uid>27462</uid>  <body><![CDATA[<p>Robots are being used more widely than expected in a variety of sectors, and the trend is likely to continue with robotics becoming as ubiquitous as computer technology over the next 15 years.&nbsp;</p><p>That is the message Henrik Christensen, Georgia Tech’s KUKA Chair of Robotics in the College of Computing, will bring to the Congressional Robotics Caucus on March 20 as he presents “<a href="http://robotics-vo.us/sites/default/files/2013%20Robotics%20Roadmap-rs.pdf">A Roadmap for U.S. Robotics: From Internet to Robotics - 2013 Edition</a>.”</p><p>The report, which outlines the progress of robots in multiple industries over the last five years and identifies goals for the coming decade, highlights robotics as a key economic enabler with the potential to transform U.S. society.</p><p>“Robots have the potential to bring manufacturing jobs back to the U.S., to improve our quality of life and to make sure our first responders and warfighters stay safe,” said Christensen, who is also the coordinator of Robotics Virtual Organization (VO), sponsor of the report. “We need to address the technical and educational needs so we can continue to be leaders in developing and using robotic technology.”</p><p>A group of more than 160 experts from universities, industry and government came together for five workshops over the last year to fully evaluate the use of robotics across various applications and create a roadmap to the future. Christensen is presenting that report to lawmakers as a guide on how to allocate resources to maximize progress.</p><p>Most notably, the group found using robots in manufacturing could help generate production systems that are economically competitive to outsourcing to countries with lower wages.</p><p>Companies such as Apple, Lenovo, Samsung and Foxconn already have begun to “reshore” manufacturing by using robotics in production systems. The sale of robotics in manufacturing grew by 44 percent in 2011 as robots have become cheaper and safer. The use of robots is shifting from big companies such as General Motors, Ford, Boeing and Lockheed Martin to small and medium-sized enterprises to enable burst manufacturing for one-off products, the report found.</p><p>Christensen notes that automation in manufacturing will not lead to job losses for U.S. workers, but will create new high-value jobs. &nbsp;</p><p>“Some jobs will be eliminated, but they are the ‘dirty, dull and dangerous’ jobs,” Christensen said. “Those jobs will be replaced with skilled labor positions. That’s why one of the goals in the roadmap is to educate the workforce.”</p><p>In addition to manufacturing, robots are helping businesses such as Amazon improve logistics and reduce delivery costs, a savings that could be passed on to the consumer. In agriculture, robots are being used to precisely deliver pesticide onto crops, reducing unnecessary exposure of chemicals on produce. 
The report recommends continued progress in both areas.</p><p>With advances in human-like manipulation, robots are increasingly assisting individuals with disabilities in tasks such as getting out and preparing meals. They are also being used in 40 percent more medical procedures than a few years ago and in a greater number of surgical areas such as cardiothoracic, gynecology, urology, orthopedics and neurology. The use of robots for surgery can reduce complications by 80 percent, the report found.</p><p>Robots have proven their value in removing first-responders and soldiers from immediate danger. More than 25,000 robotic systems were deployed in Iraq and Afghanistan for ground and aerial missions. More than 50 percent of pilots in the U.S. Air Force operate remotely piloted systems and never leave the ground.</p><p>Robots are also becoming an integral part of space exploration, as with the Opportunity and Curiosity rovers on Mars. A “robonaut” is on the International Space Station helping with menial but important research tasks.</p><p>As impressive as the progress in robotics has been, the report outlines five-, 10- and 15-year goals to take robotics to the next level. Critical capabilities that should be developed for robotics include 3-D perception, intuitive human-robot interaction and safe robot behavior.</p><p>The report is an update of the initial robotics roadmap, which was published and presented to Congress in May 2009. That roadmap led to the creation of the National Robotics Initiative, an effort jointly sponsored by the National Science Foundation, the U.S. Department of Agriculture, the National Aeronautics and Space Administration and the National Institutes of Health. It also established Robotics VO, a community networking platform that brings all robotics players together to focus on joint initiatives including research, STEM outreach and technology transfer.</p><p>“Robotics is one of a few technologies capable of building new companies, creating new jobs and addressing a number of issues of national importance,” said Christensen. “We hope this report will help foster the discussion on how we can build partnerships and allocate resources to move the robotics industry forward.”&nbsp;</p>]]></body>  <author>Liz Klipp</author>  <status>1</status>  <created>1363768260</created>  <gmt_created>2013-03-20 08:31:00</gmt_created>  <changed>1475896431</changed>  <gmt_changed>2016-10-08 03:13:51</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Henrik Christensen, Georgia Tech’s KUKA Chair of Robotics, presents “A Roadmap for U.S. Robotics: From Internet to Robotics - 2013 Edition” to Congress.]]></teaser>  <type>news</type>  <sentence><![CDATA[Henrik Christensen, Georgia Tech’s KUKA Chair of Robotics, presents “A Roadmap for U.S.
Robotics: From Internet to Robotics - 2013 Edition” to Congress.]]></sentence>  <summary><![CDATA[<p>Robots are being used more widely than expected in a variety of sectors, and the trend is likely to continue with robotics becoming as ubiquitous as computer technology over the next 15 years, according to the new report.</p>]]></summary>  <dateline>2013-03-20T00:00:00-04:00</dateline>  <iso_dateline>2013-03-20T00:00:00-04:00</iso_dateline>  <gmt_dateline>2013-03-20 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[klipp@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Georgia Tech Media Relations</strong><br />Laura Diamond<br /><a href="mailto:laura.diamond@comm.gatech.edu">laura.diamond@comm.gatech.edu</a><br />404-894-6016<br />Jason Maderer<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-660-2926</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>200761</item>      </media>  <hg_media>          <item>          <nid>200761</nid>          <type>image</type>          <title><![CDATA[Henrik Christensen, KUKA Chair of Robotics]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[10p1000-p71-032.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/10p1000-p71-032_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/10p1000-p71-032_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/10p1000-p71-032_1.jpg?itok=h9WAtXzh]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Henrik Christensen, KUKA Chair of Robotics]]></image_alt>                    <created>1449179943</created>          <gmt_created>2015-12-03 21:59:03</gmt_created>          <changed>1475894853</changed>          <gmt_changed>2016-10-08 02:47:33</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.robotics-vo.us/node/332]]></url>        <title><![CDATA[Robotics VO]]></title>      </link>          <link>        <url><![CDATA[http://www.roboticscaucus.org/members/default.asp]]></url>        <title><![CDATA[Congressional Robotics Caucus]]></title>      </link>          <link>        <url><![CDATA[http://robotics-vo.us/sites/default/files/2013%20Robotics%20Roadmap-rs.pdf]]></url>        <title><![CDATA[A Roadmap for U.S. 
Robotics: From Internet to Robotics - 2013 Edition]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/people/henrik-christensen]]></url>        <title><![CDATA[Henrik Christensen, Georgia Tech's KUKA Chair of Robotics]]></title>      </link>          <link>        <url><![CDATA[http://www.whitehouse.gov/blog/2013/03/20/road-cutting-edge-robots]]></url>        <title><![CDATA[White House Office of Science and Technology Policy blog post]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="155"><![CDATA[Congressional Testimony]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="155"><![CDATA[Congressional Testimony]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="346"><![CDATA[congress]]></keyword>          <keyword tid="11890"><![CDATA[henrik christensen]]></keyword>          <keyword tid="12239"><![CDATA[RIM]]></keyword>          <keyword tid="62031"><![CDATA[Robotics Roadmap]]></keyword>          <keyword tid="62041"><![CDATA[Robotics VO]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="172391">  <title><![CDATA[Georgia Tech Launches Manufacturing Institute]]></title>  <uid>27462</uid>  <body><![CDATA[<p>To support a new industry-friendly research strategy, the Georgia Institute of Technology announces the launch of an interdisciplinary research institute to promote a technologically advanced and globally competitive manufacturing base in the United States.</p><p>The Georgia Tech Manufacturing Institute (GTMI) creates a campus-wide community of investigators and thought leaders capable of using innovation in manufacturing to create more high-value jobs in the U.S., ensure the nation’s global competitiveness and advance economic and environmental sustainability.</p><p>“Manufacturing is important to the development of a variety of products, from medical devices to alternative energy solutions to cars, on the large and nano scale,” said Ben Wang, Georgia Tech’s chief manufacturing officer and executive director of the Georgia Tech Manufacturing Institute. “It’s critical to the economic viability and competitiveness of our nation to efficiently move leading-edge research from the lab to the real world.”</p><p>Since Georgia Tech was founded in 1888, manufacturing has been ingrained in the curriculum. Also for the last 20 years, the Georgia Tech Manufacturing Research Center has been focusing on developing next-generation technologies.</p><p>Under this new initiative, the Manufacturing Research Center has been renamed the Georgia Tech Manufacturing Institute and has expanded to engage researchers from all of Georgia Tech’s colleges, the Enterprise Innovation Institute (EI²) and the Georgia Tech Research Institute. 
The researchers have joined forces with industry and government experts to help define and solve some of the greatest challenges facing the manufacturing industry today, such as the need for translational research.</p><p>“We aspire to be known globally as the collaborative hub for manufacturing technologies and as the recognized leader in crossing the ‘valley of death,’” Wang said. &nbsp;“By that, we mean to transform the research results by faculty and students into competitive products and services to be made in the U.S. Our success is defined by how fast we can translate these discoveries and innovations into products for our stakeholders, accelerating our readiness and providing translational leadership.”</p><p>GTMI will focus on the complete innovation value chain – from raw and recycled resources to prototypes and finished products. It will develop materials, systems, processes, educational offerings and policies that impact manufacturers’ performance in the marketplace.</p><p>“GTMI is industry-focused and customer-centric, amplifying Georgia Tech’s reputation globally as the world’s leader in innovation-driven manufacturing,” Wang said.</p><p>With roughly 400,000 square feet of space and state-of-the-art core facilities for manufacturing research, GTMI will target specific industry needs in manufacturing by forming “collaboratories” – co-located pilot plants or prototype shops where Georgia Tech scientists and engineers work side-by-side with their counterparts from industry, government and other universities.</p><p>“By implementing best practices to develop outward-facing, collaboration-based programs of the highest impact, we are focusing on understanding and achieving the value propositions of all stakeholders to better define and deliver offerings to companies, government, other universities and colleges, and non-profits,” Wang said.&nbsp; “By doing so, we will maximize U.S. global competitiveness through accelerated innovation and technology deployment.”</p><p>Education is also a priority of the new manufacturing research institute. With top-quality researchers, facilities and technological equipment, GTMI aims to educate and train the workforce of the future to investigate, collaborate and compete successfully through both its on-site programs and via collaborative, manufacturing-based instructional programs in technical colleges.
In addition to providing real-world research opportunities to undergraduate and graduate students, GTMI offers a manufacturing certificate program, manufacturing scholarships and student assistantships, and it conducts Science, Technology, Engineering, and Math (STEM) outreach activities.</p><p>GTMI brings together many of Georgia Tech’s world-class innovation activities including:</p><ul><li><a href="http://ddm.me.gatech.edu/"><strong>Additive Manufacturing</strong></a>: Using innovative direct digital manufacturing to improve cost structure and delivery lead-time in creating mechanical parts and electronic devices.</li><li><a href="http://www.fis.marc.gatech.edu/"><strong>Factory Information Systems</strong></a>: Developing, testing and launching innovative software and technology that boosts manufacturing efficiency.</li><li><a href="http://www.mbse.gatech.edu/"><strong>Model-based Systems Engineering</strong></a>: Applying software and electronics innovations to create analytic models that predict system performance, optimize system parameters and create knowledge repositories for future systems development.</li><li><strong>Policy</strong>: Understanding industry needs and promoting supportive policy to ensure the strength and viability of U.S. manufacturing competitiveness in the global marketplace. Using a multi-scale, multi-disciplinary approach enables Georgia Tech experts to see beyond traditional boundaries and to better understand where policy interventions can develop, support and sustain a resilient manufacturing base.</li><li><a href="http://pmrc.marc.gatech.edu/"><strong>Precision Machining</strong></a>: Researching and applying technologies for enhanced productivity, part quality, difficult-to-machine features and machine tool utilization of precision finishing processes.</li><li><a href="http://www.scl.gatech.edu/"><strong>Supply Chain and Logistics</strong></a>: Applying scientific principles to optimize the design and integration of supply chain processes, infrastructure, technology and strategy including developing new analysis, design and management tools, and concepts and strategies.</li><li><a href="http://www.sdm.gatech.edu/"><strong>Sustainable Design</strong></a>: Developing materials, processes and systems for implementing and operationalizing sustainability.</li><li><strong>Ultra-lightweight, Energy Efficient Materials and Structures</strong>: Using rigorous experimental and modeling R&amp;D to advance and mature technology in aerospace, biomedical, defense, energy and industrial equipment.</li></ul><p>The launch of GTMI complements Georgia Tech’s presence in the national discussion on manufacturing. Georgia Tech President G. P. 
“Bud” Peterson is a member of the White House’s Advanced Manufacturing Partnership steering committee and is a member of the Secretary of Commerce’s National Advisory Council on Innovation and Entrepreneurship.</p><p>The Georgia Tech Manufacturing Institute is one of several interdisciplinary research institutes at Georgia Tech that bring together a mix of researchers – spanning colleges, departments and individual labs – around a single core research area.</p>]]></body>  <author>Liz Klipp</author>  <status>1</status>  <created>1353422161</created>  <gmt_created>2012-11-20 14:36:01</gmt_created>  <changed>1475896394</changed>  <gmt_changed>2016-10-08 03:13:14</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The interdisciplinary research institute will help promote a technologically advanced and globally competitive manufacturing base in the U.S.]]></teaser>  <type>news</type>  <sentence><![CDATA[The interdisciplinary research institute will help promote a technologically advanced and globally competitive manufacturing base in the U.S.]]></sentence>  <summary><![CDATA[<p>To support a new industry-friendly research strategy, the Georgia Institute of Technology announces the launch of an interdisciplinary research institute to promote a technologically advanced and globally competitive manufacturing base in the United States.</p>]]></summary>  <dateline>2012-11-20T00:00:00-05:00</dateline>  <iso_dateline>2012-11-20T00:00:00-05:00</iso_dateline>  <gmt_dateline>2012-11-20 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[klipp@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Georgia Tech Media Relations</strong><br />Laura Diamond<br /><a href="mailto:laura.diamond@comm.gatech.edu">laura.diamond@comm.gatech.edu</a><br />404-894-6016<br />Jason Maderer<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-660-2926</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>172701</item>          <item>70794</item>      </media>  <hg_media>          <item>          <nid>172701</nid>          <type>image</type>          <title><![CDATA[Georgia Tech Manufacturing Institute]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[13c3000-p1-126.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/13c3000-p1-126_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/13c3000-p1-126_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/13c3000-p1-126_0.jpg?itok=OZbxIyQ5]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Georgia Tech Manufacturing Institute]]></image_alt>                    <created>1449178999</created>          <gmt_created>2015-12-03 21:43:19</gmt_created>          <changed>1475894814</changed>          <gmt_changed>2016-10-08 02:46:54</gmt_changed>      </item>          <item>          <nid>70794</nid>          <type>image</type>          <title><![CDATA[Ben Wang]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[meyer_20110630_1750.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/meyer_20110630_1750_0.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/meyer_20110630_1750_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/meyer_20110630_1750_0.jpg?itok=Q9tcy_vK]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ben Wang]]></image_alt>                    <created>1449177314</created>          <gmt_created>2015-12-03 21:15:14</gmt_created>          <changed>1475894623</changed>          <gmt_changed>2016-10-08 02:43:43</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.manufacturing.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech Manufacturing Institute]]></title>      </link>          <link>        <url><![CDATA[http://www.gatech.edu/research/]]></url>        <title><![CDATA[Georgia Tech Office of Research & Graduate Studies]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="139"><![CDATA[Business]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="139"><![CDATA[Business]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="51021"><![CDATA[Georgia Tech Manufacturing Institute; Ben Wang; Interdisciplinary research institute]]></keyword>          <keyword tid="51031"><![CDATA[research strategy]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="160721">  <title><![CDATA[Robots Using Tools: With New Grant, Researchers Aim to Create ‘MacGyver’ Robot]]></title>  <uid>27560</uid>  <body><![CDATA[<p>Robots are increasingly being used in place of humans to explore hazardous and difficult-to-access environments, but they aren’t yet able to interact with their environments as well as humans. If today’s most sophisticated robot was trapped in a burning room by a jammed door, it would probably not know how to locate and use objects in the room to climb over any debris, pry open the door, and escape the building.</p><p>A research team led by Professor Mike Stilman at the Georgia Institute of Technology hopes to change that by giving robots the ability to use objects in their environments to accomplish high-level tasks. The team recently received a three-year, $900,000 grant from the Office of Naval Research to work on this project.</p><p>“Our goal is to develop a robot that behaves like MacGyver, the television character from the 1980s who solved complex problems and escaped dangerous situations by using everyday objects and materials he found at hand,” said Stilman, an assistant professor in the School of Interactive Computing at Georgia Tech. “We want to understand the basic cognitive processes that allow humans to take advantage of arbitrary objects in their environments as tools. 
We will achieve this by designing algorithms for robots that make tasks that are impossible for a robot alone possible for a robot with tools.”</p><p>The research will build on Stilman’s previous work on navigation among movable obstacles that enabled robots to autonomously recognize and move obstacles that were in the way of their getting from point A to point B.</p><p>“This project is challenging because there is a critical difference between moving objects out of the way and using objects to make a way,” explained Stilman. “Researchers in the robot motion planning field have traditionally used computerized vision systems to locate objects in a cluttered environment to plan collision-free paths, but these systems have not provided any information about the objects’ functions.”</p><p>To create a robot capable of using objects in its environment to accomplish a task, Stilman plans to develop an algorithm that will allow a robot to identify an arbitrary object in a room, determine the object’s potential function, and turn that object into a simple machine that can be used to complete an action. Actions could include using a chair to reach something high, bracing a ladder against a bookshelf, stacking boxes to climb over something, and building levers or bridges from random debris.</p><p>By providing the robot with basic knowledge of rigid body mechanics and simple machines, the robot should be able to autonomously determine the mechanical force properties of an object and construct motion plans for using the object to perform high-level tasks.</p><p>For example, exiting a burning room with a jammed door would require a robot to travel around any fire, use an object in the room to apply sufficient force to open the stuck door, and locate an object in the room that will support its weight while it moves to get out of the room.</p><p>Such skills could be extremely valuable in the future as robots work side-by-side with military personnel to accomplish challenging missions.</p><p>“The Navy prides itself on recruiting, training and deploying our country’s most resourceful and intelligent men and women,” said Paul Bello, director of the cognitive science program in the Office of Naval Research (ONR). “Now that robotic systems are becoming more pervasive as teammates for warfighters in military operations, we must ensure that they are both intelligent and resourceful. Professor Stilman’s work on the ‘MacGyver-bot’ is the first of its kind, and is already beginning to deliver on the promise of mechanical teammates able to creatively perform in high-stakes situations.”</p><p>To address the complexity of the human-like reasoning required for this type of scenario, Stilman is collaborating with researchers Pat Langley and Dongkyu Choi. Langley is the director of the Institute for the Study of Learning and Expertise (ISLE), and is recognized as a co-founder of the field of machine learning, where he championed both experimental studies of learning algorithms and their application to real-world problems. 
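</p><p>The pipeline Stilman describes, identify an object, hypothesize its mechanical function, and then plan motions that exploit it, can be made concrete with a toy sketch. The Python below is purely illustrative: every class, rule and number in it is a hypothetical stand-in, not the team’s algorithm, which reasons over rigid-body mechanics rather than simple category tags.</p><pre><code># Minimal, hypothetical sketch of the object-as-tool planning idea described
# above. Classes, rules and thresholds are invented for illustration; the real
# system reasons over rigid-body mechanics, not category tags.
from dataclasses import dataclass

@dataclass
class ObjectHypothesis:
    name: str                # what perception thinks the object is
    length_m: float          # rough size from the vision system
    supports_load_kg: float  # estimated load-bearing capacity

def candidate_functions(obj):
    """Hypothesize which simple-machine roles an object could play."""
    roles = []
    if obj.length_m > 1.0:
        roles.append("lever")  # long and rigid enough to pry a door
    if obj.supports_load_kg > 50:
        roles.append("step")   # strong enough to stand on
    return roles

def plan_with_tools(goal, objects):
    """Return a crude action sequence, or None if no tool makes it possible."""
    for obj in objects:
        for role in candidate_functions(obj):
            if goal == "open_jammed_door" and role == "lever":
                return ["grasp " + obj.name, "pry door with " + obj.name]
            if goal == "reach_high_object" and role == "step":
                return ["push " + obj.name + " below target", "climb " + obj.name]
    return None  # impossible alone and with the objects at hand

objects = [ObjectHypothesis("broom", 1.4, 5.0), ObjectHypothesis("chair", 0.9, 80.0)]
print(plan_with_tools("open_jammed_door", objects))   # broom becomes a lever
print(plan_with_tools("reach_high_object", objects))  # chair becomes a step
</code></pre><p>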
Choi is an assistant professor in the Department of Aerospace Engineering at the University of Kansas.</p><p>Langley and Choi will expand the cognitive architecture they developed, called ICARUS, which provides an infrastructure for modeling various human capabilities like perception, inference, performance and learning in robots.</p><p>“We believe a hybrid reasoning system that embeds our physics-based algorithms within a cognitive architecture will create a more general, efficient and structured control system for our robot that will accrue more benefits than if we used one approach alone,” said Stilman.</p><p>After the researchers develop and optimize the hybrid reasoning system using computer simulations, they plan to test the software using Golem Krang, a humanoid robot designed and built in Stilman’s laboratory to study whole-body robotic planning and control.</p><p>&nbsp;<em>This research is sponsored by the Department of the Navy, Office of Naval Research, through grant number N00014-12-1-0143. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the Office of Naval Research.</em></p><p>&nbsp;</p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1349770940</created>  <gmt_created>2012-10-09 08:22:20</gmt_created>  <changed>1475896378</changed>  <gmt_changed>2016-10-08 03:12:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[New project is designed to teach robots how to use objects in the environment to accomplish high-level tasks]]></teaser>  <type>news</type>  <sentence><![CDATA[New project is designed to teach robots how to use objects in the environment to accomplish high-level tasks]]></sentence>  <summary><![CDATA[<p>A Georgia Tech research team has received a grant from the Office of Naval Research to work on a project that intends to teach robots how to use objects in their environment to accomplish high-level tasks.</p>]]></summary>  <dateline>2012-10-09T00:00:00-04:00</dateline>  <iso_dateline>2012-10-09T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-10-09 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon<br />Research News &amp; Publications Office<br /> <a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a> <br /> 404-894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>160691</item>          <item>160701</item>          <item>160711</item>      </media>  <hg_media>          <item>          <nid>160691</nid>          <type>image</type>          <title><![CDATA[MacGyver Grant, Photo 1]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[macgyver-1-cropped_0.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/macgyver-1-cropped_0_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/macgyver-1-cropped_0_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/macgyver-1-cropped_0_0.jpg?itok=sKGUmi4R]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[MacGyver Grant, Photo 1]]></image_alt>                    <created>1449178896</created>          
<gmt_created>2015-12-03 21:41:36</gmt_created>          <changed>1475894796</changed>          <gmt_changed>2016-10-08 02:46:36</gmt_changed>      </item>          <item>          <nid>160701</nid>          <type>image</type>          <title><![CDATA[MacGyver Grant, Photo 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[macgyver-robot-9680.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/macgyver-robot-9680_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/macgyver-robot-9680_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/macgyver-robot-9680_0.jpg?itok=x0f2jlXB]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[MacGyver Grant, Photo 2]]></image_alt>                    <created>1449178896</created>          <gmt_created>2015-12-03 21:41:36</gmt_created>          <changed>1475894796</changed>          <gmt_changed>2016-10-08 02:46:36</gmt_changed>      </item>          <item>          <nid>160711</nid>          <type>image</type>          <title><![CDATA[MacGyver Grant, Photo 3]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[macgyver-robot-9651.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/macgyver-robot-9651_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/macgyver-robot-9651_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/macgyver-robot-9651_0.jpg?itok=3xFB1YPs]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[MacGyver Grant, Photo 3]]></image_alt>                    <created>1449178896</created>          <gmt_created>2015-12-03 21:41:36</gmt_created>          <changed>1475894796</changed>          <gmt_changed>2016-10-08 02:46:36</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.cc.gatech.edu/~mstilman/]]></url>        <title><![CDATA[Mike Stillman Website]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/]]></url>        <title><![CDATA[College of Computing]]></title>      </link>          <link>        <url><![CDATA[http://www.ic.gatech.edu/about]]></url>        <title><![CDATA[School of Interactive Computing]]></title>      </link>          <link>        <url><![CDATA[http://robotics.gatech.edu/]]></url>        <title><![CDATA[Center for Robotics & Intelligent Machines]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="45961"><![CDATA[Golem Krang]]></keyword>          <keyword tid="45951"><![CDATA[MacGyver]]></keyword>          <keyword tid="11527"><![CDATA[Mike Stillman]]></keyword>      </keywords>  <core_research_areas>          <term 
tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="156851">  <title><![CDATA[Easy Guider: Intuitive Visual Control Provides Faster Remote Operation of Robots]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Using a novel method of integrating video technology and familiar control devices, a research team from the Georgia Institute of Technology is developing a technique to simplify remote control of robotic devices.</p><p>The researchers' aim is to enhance a human operator's ability to perform precise tasks using a multi-jointed robotic device such as an articulated mechanical arm. The new approach has been shown to be easier and faster than older methods, especially when the robot is controlled by an operator who is watching it in a video monitor.</p><p>Known as Uncalibrated Visual Servoing for Intuitive Human Guidance of Robots, the new method uses a special implementation of an existing vision-guided control method called visual servoing (VS). By applying visual-servoing technology in innovative ways, the researchers have constructed a robotic system that responds to human commands more directly and intuitively than older techniques.</p><p>"Our approach exploits 3-D video technology to let an operator guide a robotic device in ways that are more natural and time-saving, yet are still very precise," said Ai-Ping Hu, a senior research engineer with the Georgia Tech Research Institute (GTRI). "This capability could have numerous applications – especially in situations where directly observing the robot's operation is hazardous or not possible – including bomb disposal, handling of hazardous materials and search-and-rescue missions."</p><p>A paper on this technology was presented at the 2012 IEEE International Conference on Robotics and Automation held in St. Paul, Minn.</p><p>For decades, articulated robots have been used by industry to perform precision tasks such as welding vehicle seams or assembling electronics, Hu explained. The user develops a software program that enables the device to cycle through the required series of motions, using feedback from sensors built into the robot.</p><p>But such programming can be complex and time-consuming. The robot must typically be maneuvered joint by joint through the numerous actions required to complete a task. Moreover, such technology works only in a structured and unchanging environment, such as a factory assembly line, where spatial relationships are constant.</p><p><strong>The Human Operator</strong></p><p>In recent years, new techniques have enabled human operators to freely guide remote robots through unstructured and unfamiliar environments, to perform such challenging tasks as bomb disposal, Hu said. Operators have controlled the device in one of two ways: by "line of sight" – direct user observation – or by means of a conventional two-dimensional camera that is mounted on the robot to send back an image of both the robot and its target.</p><p>But humans guiding robots via either method face some of the same complexities that challenge those who program industrial robots, he added. Manipulating a remote robot into place is generally slow and laborious.</p><p>That's especially true when the operator must depend on the imprecise images provided by 2-D video feedback. 
Manipulating separate controls for each of the robot's multiple joint axes, users have only limited visual information to help them and must maneuver to the target by trial and error.</p><p>"Essentially, the user is trying to visualize and reconstruct a 3-D scenario from flat 2-D camera images," Hu said. "The process can become particularly confusing when operators are facing in a different direction from the robot and must mentally reorient themselves to try to distinguish right from left. It's somewhat similar to backing up a vehicle with an attached trailer – you have to turn the steering wheel to the left to get the trailer to move right, which is decidedly non-intuitive."</p><p><strong>The Visual Servoing Advantage</strong></p><p>To simplify user control, the Georgia Tech team turned to visual servoing (a term synonymous with visual activation). Visual servoing has been studied for years as a way to use video cameras to help robots re-orient themselves within a structured environment such as an assembly line.</p><p>Traditional visual servoing is calibrated, meaning that position information generated by a video camera can be transformed into data meaningful to the robot. Using these data, the robot can adjust itself to stay in a correct spatial relationship with target objects.</p><p>"Say a conveyor line is accidentally moved a few millimeters," Hu said. "A robot with a calibrated visual servoing capability can automatically detect the movement using the video image and a fixed reference point, and then readjust to compensate."</p><p>But visual servoing offers additional possibilities. The research team – which includes Hu, associate professor Harvey Lipkin of the School of Mechanical Engineering, graduate student Matthew Marshall, GTRI research engineer Michael Matthews and GTRI principal research engineer Gary McMurray – has adapted visual-servoing technology in ways that facilitate human control of remote robots.</p><p>The new approach takes advantage of both calibrated and uncalibrated techniques. A calibrated 3-D "time of flight" camera is mounted on the robot – typically at the end of a robotic arm, in a gripping device called an end-effector. This approach is sometimes called an eye-in-hand system, because of the camera's location in the robot's "hand."</p><p>The camera utilizes an active sensor that detects depth data, allowing it to send back 3-D coordinates that pinpoint the end-effector's spatial location. At the same time, the eye-in-hand camera also supplies a standard, uncalibrated 2-D grayscale video image to the operator's monitor.</p><p>The result is that the operator, without seeing the robot, now has a robot's-eye view of the target. Watching this image in a monitor, an operator can visually guide the robot using a gamepad, in a manner somewhat reminiscent of a first-person 3-D video game.</p><p>In addition, visual-servoing technology now automatically actuates all the joints needed to complete whatever action the user indicates on the gamepad – rather than the user having to manipulate those joints one by one.</p>
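<p>How a single gamepad press can drive all of the joints at once is easiest to see as a resolved-rate control loop built on an estimated image Jacobian. The sketch below is a hypothetical illustration of that general idea, not the GTRI implementation: the toy "camera" model, the gain and the finite-difference Jacobian estimate are all invented for the example.</p><pre><code># Hypothetical sketch of uncalibrated, Jacobian-based velocity mapping.
# Not the GTRI implementation: the robot stand-in, gain and Jacobian
# estimation scheme are simplified inventions for illustration.
import numpy as np

NUM_JOINTS = 6
GAIN = 0.5  # proportional gain on the commanded camera-frame velocity

def estimate_jacobian(probe_fn, q, eps=1e-3):
    """Numerically estimate the 3xN Jacobian relating joint motion to
    camera-frame (x, y, depth) motion, with no calibration model."""
    base = probe_fn(q)
    J = np.zeros((3, NUM_JOINTS))
    for i in range(NUM_JOINTS):
        dq = np.zeros(NUM_JOINTS)
        dq[i] = eps
        J[:, i] = (probe_fn(q + dq) - base) / eps
    return J

def gamepad_to_joint_velocities(stick, J):
    """Map a gamepad command ("left" = camera-frame -x, etc.) to joint
    velocities via the pseudo-inverse of the estimated Jacobian."""
    v_cam = GAIN * np.array(stick)          # desired camera-frame velocity
    J_pinv = np.linalg.pinv(J, rcond=1e-2)  # damped for robustness
    return J_pinv @ v_cam                   # one command moves every joint

# Toy stand-in for the eye-in-hand camera: a fixed linear model of the arm.
A = np.random.default_rng(0).normal(size=(3, NUM_JOINTS))
camera = lambda q: A @ q

q = np.zeros(NUM_JOINTS)
J = estimate_jacobian(camera, q)
print(gamepad_to_joint_velocities([-1.0, 0.0, 0.0], J))  # a "left" press
</code></pre><p>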
In the background, the Georgia Tech system performs the complex computation needed to coordinate the monitor image, the 3-D camera information, the robot's spatial position and the user's gamepad commands.</p><p><strong>Testing System Usability</strong></p><p>"The guidance process is now very intuitive – pressing 'left' on the gamepad will actuate all the requisite robot joints to effect a leftward displacement," Hu said. "What's more, the robot could be upside down and the controls will still respond in the same intuitive way – left is still left and right is still right."</p><p>To judge system usability, the Georgia Tech research team recently conducted trials to test whether the visual-servoing approach enabled faster task-completion times. Using a gamepad that controls an articulated-arm robot with six degrees of freedom, subjects performed four tests: they used visual-servoing guidance as well as conventional joint-based guidance, in both line-of-sight and camera-view modes.</p><p>In the line-of-sight test, volunteer participants using visual-servoing guidance averaged task-completion times that were 15 percent faster than when they used joint-based guidance. However, in camera-view mode, participants using visual-servoing guidance averaged 227 percent faster results than with the joint-based technique.</p><p>Hu noted that the visual-servoing system used in this test scenario was only one of numerous possible applications of the technology.&nbsp; The research team's plans include testing a mobile platform with a VS-guided robotic arm mounted on it. Also underway is a proof-of-concept effort that incorporates visual-servoing control into a low-cost, consumer-level robot.</p><p>"Our ultimate goal is to develop a generic, uncalibrated control framework that is able to use image data to guide many different kinds of robots," he said.&nbsp;&nbsp;&nbsp;&nbsp; <br />&nbsp;&nbsp;&nbsp;&nbsp; &nbsp;<br /><strong>Research News &amp; Publications Office</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>75 Fifth Street, N.W., Suite 309</strong><br /><strong>Atlanta, Georgia&nbsp; 30308&nbsp; USA</strong><br /><br /><strong>Media Relations Contact</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).</p><p><strong>Writer</strong>: Rick Robinson<br /><br /></p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1348578810</created>  <gmt_created>2012-09-25 13:13:30</gmt_created>  <changed>1475896370</changed>  <gmt_changed>2016-10-08 03:12:50</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A technique known as uncalibrated visual servoing could make the remote control of robots more intuitive.]]></teaser>  <type>news</type>  <sentence><![CDATA[A technique known as uncalibrated visual servoing could make the remote control of robots more intuitive.]]></sentence>  <summary><![CDATA[<p>Using a novel method of integrating video technology and familiar control devices, a research team from the Georgia Institute of Technology is developing a technique to simplify remote control of robotic devices.&nbsp;&nbsp;&nbsp;</p>]]></summary>  <dateline>2012-09-25T00:00:00-04:00</dateline>  <iso_dateline>2012-09-25T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-09-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News &amp; Publications 
Office</p><p>(404) 894-6986</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>156811</item>          <item>156821</item>          <item>156831</item>      </media>  <hg_media>          <item>          <nid>156811</nid>          <type>image</type>          <title><![CDATA[Uncalibrated Visual Servoing]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[visual-servoing64.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/visual-servoing64_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/visual-servoing64_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/visual-servoing64_0.jpg?itok=lM_Q6oqi]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Uncalibrated Visual Servoing]]></image_alt>                    <created>1449178872</created>          <gmt_created>2015-12-03 21:41:12</gmt_created>          <changed>1475894792</changed>          <gmt_changed>2016-10-08 02:46:32</gmt_changed>      </item>          <item>          <nid>156821</nid>          <type>image</type>          <title><![CDATA[Uncalibrated Visual Servoing 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[visual-servoing46.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/visual-servoing46_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/visual-servoing46_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/visual-servoing46_0.jpg?itok=rqf7x6Me]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Uncalibrated Visual Servoing 2]]></image_alt>                    <created>1449178872</created>          <gmt_created>2015-12-03 21:41:12</gmt_created>          <changed>1475894792</changed>          <gmt_changed>2016-10-08 02:46:32</gmt_changed>      </item>          <item>          <nid>156831</nid>          <type>image</type>          <title><![CDATA[Uncalibrated Visual Servoing 3]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[visual-servoing126.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/visual-servoing126_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/visual-servoing126_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/visual-servoing126_1.jpg?itok=W1h90XMQ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Uncalibrated Visual Servoing 3]]></image_alt>                    <created>1449178872</created>          <gmt_created>2015-12-03 21:41:12</gmt_created>          <changed>1475894792</changed>          <gmt_changed>2016-10-08 02:46:32</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          
<category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="44451"><![CDATA[Ai-Ping Hu]]></keyword>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="44461"><![CDATA[robot arm]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="44441"><![CDATA[visual servoing]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>          <term tid="39541"><![CDATA[Systems]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="144381">  <title><![CDATA[Micron-Scale Swimming Robots Could Deliver Drugs & Carry Cargo Using Simple Motion]]></title>  <uid>27303</uid>  <body><![CDATA[<p>When you’re just a few microns long, swimming can be difficult. At that size scale, the viscosity of water is more like that of honey, and momentum can’t be relied upon to maintain forward motion.</p><p>Microorganisms, of course, have evolved ways to swim in spite of these challenges, but tiny robots haven’t quite caught up. Now a team of researchers at the Georgia Institute of Technology has used complex computational models to design swimming micro-robots that could overcome these challenges to carry cargo and navigate in response to stimuli such as light.</p><p>When they’re actually built some day, these simple micro-swimmers could rely on volume changes in unique materials known as hydrogels to move tiny flaps that will propel the robots. The micro-devices could be used in drug delivery, lab-on-a-chip microfluidic systems – and even as micro-construction robots working in swarms.</p><p>The simple micro-swimmers were described July 23 in the online advance edition of the journal <em>Soft Matter</em>, published by the Royal Society of Chemistry in the United Kingdom.</p><p>“We believe that our simulations will give experimentalists a reason to pursue development of these micro-swimmers to go beyond what is available now,” said <a href="http://www.me.gatech.edu/faculty/alexeev">Alexander Alexeev</a>, an assistant professor in the <a href="http://www.me.gatech.edu/">George W. Woodruff School of Mechanical Engineering</a> at Georgia Tech. “We wanted to demonstrate the principle of how robots this small could move by determining what is important and what would need to be used to build a real system.”</p><p>The simple swimmer designed by Alexeev and collaborators Hassan Masoud and Benjamin Bingham consists of a responsive gel body about ten microns long with two propulsive flaps attached to opposite sides. A steering flap sensitive to specific stimuli would be located at the front of the swimmer.</p><p>The responsive gel body would undergo periodic expansions and contractions triggered by oscillatory chemical reactions, oscillating magnetic or electric fields, or by cycles of temperature change. These expansions and contractions – the chemical swelling and de-swelling of the material – would create a beating motion in the rigid propulsive flaps attached to each side of the micro-swimmer. 
Combined with the movement of the gel body, the beating motion would move the micro-swimmer forward.</p><p>The trajectory of the micro-swimmer would be controlled by a flexible steering flap on its front. The flap would be made of a material that deforms based on changes in light intensity, temperature or magnetic field.</p><p>“The combination of these flaps and the oscillating body creates a very nice motion that we believe can be used to propel the swimmer,” said Alexeev. “To build a device that is autonomous and self-propelling at the micron-scale, we cannot build a tiny submarine. We have to keep it simple.”</p><p>Key to the operation of the micro-swimmer would be the latest generation of hydrogels, materials whose volume changes in a cyclical way. The hydrogels would serve as “chemical engines” to provide the motion needed to move the device’s propulsive flaps. Such materials currently exist and are being improved upon for other applications.</p><p>“We are using the state-of-the art in materials science, changing the properties of the material,” explained Masoud, a Ph.D. candidate in the School of Mechanical Engineering. “We have combined the materials with the principles of hydrodynamics at the small scale to develop this new swimmer.”</p><p>As part of their modeling, the researchers examined the effects of flaps of different sizes and properties. They also studied how flexible the micro-swimmer’s body needed to be to produce the kind of movement needed for swimming.</p><p>“You can’t swim at the small scale in the same way you swim at the large scale,” Alexeev said. “There is no inertia, which is how you keep moving at the large scale. What happens at the small scale is counterintuitive to what you expect at the large scale.”</p><p>The computational fluid modeling the researchers used allowed them to study a wide range of parameters in materials, oscillation rates and flexibility. What they learned, Alexeev said, will give experimentalists a starting point for actually building prototypes of the flexible gel robots.</p><p>“We have captured the solid mechanics of the periodically-oscillating body, the fluid dynamics of moving through the viscous liquid, and the coupling between the two,” he said. “From a computational fluid dynamics standpoint, it’s not an easy problem to model at this scale.”</p><p>Ultimately, the researchers hope to work with an experimental team to actually build the micro-swimmers. Combining their theoretical work with actual experiments could be a powerful approach to building robots on this size scale.</p><p>“This is a simulation that we hope to see in real life one day,” Alexeev said. “We have learned how experimentalists can pursue fabrication of these devices without extensive trial-and-error. We can use the simulations to look inside what will happen by using the laws of physics to explain it.”</p><p>The researchers envision groups of micro-swimmers carrying cargo through microfluidic chips or other devices. Swarms of them could one day work together as tiny construction robots moving materials to desired locations for assembly.</p><p>But the micro-swimmers won’t win any Olympic competitions. Alexeev estimates that their top speed could be on the order of a few micrometers per second – which should be enough to accomplish their mission.</p><p>“If your body is micrometers in size, that kind of speed is really not too bad,” he said. 
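</p><p>The honey analogy can be checked with a quick Reynolds-number estimate, which compares inertial forces to viscous forces. The snippet below is only a back-of-the-envelope illustration using the rough figures in this article: a body about ten microns long moving at a few micrometers per second through water.</p><pre><code># Back-of-the-envelope Reynolds number for the micro-swimmer, using the
# article's rough figures. Re = rho * U * L / mu compares inertia to viscosity.
rho = 1000.0  # density of water, kg/m^3
mu = 1.0e-3   # dynamic viscosity of water, Pa*s
L = 10e-6     # body length: about ten microns, per the article
U = 3e-6      # speed: a few micrometers per second, per the article

Re = rho * U * L / mu
print(f"Re = {Re:.0e}")  # ~3e-5, far below 1: viscosity dominates completely
</code></pre><p>At a Reynolds number this far below one, the flow is effectively inertia-free, which is why the swimmer cannot coast and must keep generating motion in order to move at all.</p>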
<p>“The swimming speed will be rather slow,” Alexeev added, “but at that size scale, you don’t really need to go very fast since you only need to go short distances.”</p><p><strong>Citation</strong>: Hassan Masoud, Benjamin I. Bingham and Alexander Alexeev, Soft Matter, 2012, Advance Article. DOI: 10.1039/C2SM25898F.<br /><br /><strong>Research News &amp; Publications Office</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>75 Fifth Street, N.W., Suite 309</strong><br /><strong>Atlanta, Georgia&nbsp; 30308&nbsp; USA</strong><br /><br /><strong>Media Relations Contact</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1344205038</created>  <gmt_created>2012-08-05 22:17:18</gmt_created>  <changed>1475896356</changed>  <gmt_changed>2016-10-08 03:12:36</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Computational modeling shows how micro-swimmers could overcome the challenges of swimming at the micron scale.]]></teaser>  <type>news</type>  <sentence><![CDATA[Computational modeling shows how micro-swimmers could overcome the challenges of swimming at the micron scale.]]></sentence>  <summary><![CDATA[<p>Researchers have used complex computational models to design micro-swimmers that could overcome the challenges of swimming at the micron scale. These autonomous micro-robots could carry cargo and navigate in response to stimuli such as light.</p>]]></summary>  <dateline>2012-08-05T00:00:00-04:00</dateline>  <iso_dateline>2012-08-05T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-08-05 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News &amp; Publications Office</p><p>(404) 894-6986</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>144371</item>      </media>  <hg_media>          <item>          <nid>144371</nid>          <type>image</type>          <title><![CDATA[Image of Simulated Micro-Swimmer]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[microswimmer.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/microswimmer_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/microswimmer_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/microswimmer_0.jpg?itok=d9LiTA7Z]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Image of Simulated Micro-Swimmer]]></image_alt>                    <created>1449178739</created>          <gmt_created>2015-12-03 21:38:59</gmt_created>          <changed>1475894777</changed>          <gmt_changed>2016-10-08 02:46:17</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term 
tid="145"><![CDATA[Engineering]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="39581"><![CDATA[Alexander Alexeev]]></keyword>          <keyword tid="39591"><![CDATA[computational modeling]]></keyword>          <keyword tid="3356"><![CDATA[hydrogel]]></keyword>          <keyword tid="39571"><![CDATA[micro-robot]]></keyword>          <keyword tid="39561"><![CDATA[micro-swimmer]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="167377"><![CDATA[School of Mechanical Engineering]]></keyword>      </keywords>  <core_research_areas>          <term tid="39471"><![CDATA[Materials]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="138981">  <title><![CDATA[Robot Vision: Muscle-Like Action Allows Camera to Mimic Human Eye Movement]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Using piezoelectric materials, researchers have replicated the muscle motion of the human eye to control camera systems in a way designed to improve the operation of robots. This new muscle-like action could help make robotic tools safer and more effective for MRI-guided surgery and robotic rehabilitation.</p><p>Key to the new control system is a piezoelectric cellular actuator that uses a novel biologically inspired technology that will allow a robot eye to move more like a real eye. This will be useful for research studies on human eye movement as well as making video feeds from robots more intuitive. The research is being conducted by Ph.D. candidate Joshua Schultz under the direction of assistant professor <a href="http://www.me.gatech.edu/faculty/ueda">Jun Ueda</a>, both from the <a href="http://www.me.gatech.edu/">George W. Woodruff School of Mechanical Engineering</a> at the Georgia Institute of Technology.</p><p>“For a robot to be truly bio-inspired, it should possess actuation, or motion generators, with properties in common with the musculature of biological organisms,” said Schultz. “The actuators developed in our lab embody many properties in common with biological muscle, especially a cellular structure. Essentially, in the human eye, muscles are controlled by neural impulses. Eventually, the actuators we are developing will be used to capture the kinematics and performance of the human eye.”</p><p>Details of the research were presented June 25, 2012, at the IEEE International Conference on Biomedical Robotics and Biomechatronics in Rome, Italy. The research is funded by the National Science Foundation. Schultz also receives partial support from the Achievement Rewards for College Scientists (ARCS) Foundation.</p><p>Ueda, who leads the Georgia Tech Bio-Robotics and Human Modeling Laboratory in the School of Mechanical Engineering, said this novel technology will lay the groundwork for investigating research questions in systems that possess a large number of active units operating together. Applications range from industrial robots to medical and rehabilitation robots to intelligent assistive robots.</p><p>“Robustness against uncertainty of model and environment is crucial for robots physically interacting with humans and environments,” said Ueda. 
“Successful integration relies on the coordinated design of control, structure, actuators and sensors by considering the dynamic interaction among them.”</p><p>Piezoelectric materials expand or contract when electricity is applied to them, providing a way to transform input signals into motion. This principle is the basis for piezoelectric actuators that have been used in numerous applications, but use in robotics applications has been limited due to piezoelectric ceramic's minuscule displacement.</p><p>The cellular actuator concept developed by the research team was inspired by biological muscle structure that connects many small actuator units in series or in parallel.</p><p>The Georgia Tech team has developed a lightweight, high-speed approach that includes a single-degree-of-freedom camera positioner that can be used to illustrate and understand the performance and control of biologically inspired actuator technology. This new technology uses less energy than traditional camera positioning mechanisms and is compliant for more flexibility.</p><p>“Each muscle-like actuator has a piezoelectric material and a nested hierarchical set of strain amplifying mechanisms,” said Ueda. “We are presenting a mathematical concept that can be used to predict the performance as well as select the required geometry of nested structures. We use the design of the camera positioning mechanism’s actuators to demonstrate the concepts.”</p><p>The scientists’ research shows mechanisms that can scale up the displacement of piezoelectric stacks to the range of the ocular positioning system. In the past, the piezoelectric stacks available for this purpose have been too small.</p><p>“Our research shows a two-port network model that describes compliant strain amplification mechanisms that increase the stroke length of the stacks,” said Schultz. “Our findings make a contribution to the use of piezoelectric stack devices in robotics, modeling, design and simulation of compliant mechanisms. It also advances the control of systems using a large number of motor units for a given degree of freedom and control of robotic actuators.”</p><p>In the study, the scientists sought to resolve a previous conundrum. A cable-driven eye could produce the eye’s kinematics, but rigid servomotors would not allow researchers to test the hypothesis for the neurological basis for eye motion.</p><p>Some measure of flexibility could be used in software with traditional actuators, but it depended largely on having a continuously variable control signal and it could not show how flexibility could be maintained with quantized actuation corresponding to neural recruitment phenomena.</p><p>“Unlike traditional actuators, piezoelectric cellular actuators are governed by the working principles of muscles – namely, motion results from discretely activating, or recruiting, sets of active fibers, called motor units,” said Ueda.</p><p>“Motor units are linked by flexible tissue, which serves a two-fold function,” said Ueda. “It combines the action potential of each motor unit, and presents a compliant interface with the world, which is critical in unstructured environments.”</p><p>The Georgia Tech team has presented a camera positioner driven by a novel cellular actuator technology, using a contractile ceramic to generate motion.</p>
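<p>The recruitment principle described above, discrete on/off motor units whose strokes sum through compliant strain amplification, lends itself to a simple quantized model. The sketch below is a hypothetical illustration: the unit count, per-stack stroke and amplification gain are invented numbers, not the lab’s measured values.</p><pre><code># Hypothetical model of quantized, muscle-like actuation: displacement comes
# from switching discrete motor units on, not from a continuous signal.
# Stroke, gain and unit count are invented for illustration only.

STROKE_UM = 0.1      # raw stroke of one piezoelectric stack, micrometers
AMPLIFICATION = 30   # gain of the nested strain-amplifying mechanisms
NUM_UNITS = 16       # discrete motor units available for recruitment

def recruit(target_um):
    """Choose how many motor units to activate for a target displacement."""
    per_unit = STROKE_UM * AMPLIFICATION
    units = round(target_um / per_unit)
    return max(0, min(NUM_UNITS, units))

def displacement(active_units):
    """Total output displacement, in micrometers, at a recruitment level."""
    return active_units * STROKE_UM * AMPLIFICATION

n = recruit(20.0)          # ask for 20 micrometers of motion
print(n, displacement(n))  # 7 units -> 21.0 um: quantized, close to target
</code></pre>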
<p>The team used 16 amplified piezoelectric stacks per side.</p><p>The use of multiple stacks addressed the need for more layers of amplification. The units were placed inside a rhomboidal mechanism. The work offers an analysis of the force-displacement tradeoffs involved in the actuator design and shows how to find geometry that meets the requirement of the camera positioner, said Schultz.</p><p>“Scaling up piezoelectric ceramic stacks holds great potential to replicate human eye motion more accurately than previous actuators,” noted Schultz. “Future work in this area will involve implementation of this technology on a multi-degree-of-freedom device, applying open- and closed-loop control algorithms for positioning and analysis of co-contraction phenomena.”</p><p>Future research by his team will continue to focus on the development of a design framework for highly integrated robotic systems. This ranges from industrial robots to medical and rehabilitation robots to intelligent assistive robots. <br /><br /><strong>Research News &amp; Publications Office</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>75 Fifth Street, N.W., Suite 309</strong><br /><strong>Atlanta, Georgia&nbsp; 30308&nbsp; USA</strong><br /><br /><strong>Media Relations Contact</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).<br /><strong>Writer</strong>: Sarah E. Goodwin</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1341495522</created>  <gmt_created>2012-07-05 13:38:42</gmt_created>  <changed>1475896349</changed>  <gmt_changed>2016-10-08 03:12:29</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Stacks of piezoelectric actuators that simulate the action of real muscles could give robots more human-like eyes.]]></teaser>  <type>news</type>  <sentence><![CDATA[Stacks of piezoelectric actuators that simulate the action of real muscles could give robots more human-like eyes.]]></sentence>  <summary><![CDATA[<p>Using piezoelectric materials, researchers have replicated the muscle motion of the human eye to control camera systems in a way designed to improve the operation of robots. 
This new muscle-like action could help make robotic tools safer and more effective for MRI-guided surgery and robotic rehabilitation.</p>]]></summary>  <dateline>2012-07-05T00:00:00-04:00</dateline>  <iso_dateline>2012-07-05T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-07-05 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News &amp; Publications Office</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>138951</item>          <item>138961</item>          <item>138971</item>      </media>  <hg_media>          <item>          <nid>138951</nid>          <type>image</type>          <title><![CDATA[Piezoelectric-vision1]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[piezoelectric-vision1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/piezoelectric-vision1_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/piezoelectric-vision1_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/piezoelectric-vision1_0.jpg?itok=CsxatwbM]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Piezoelectric-vision1]]></image_alt>                    <created>1449178698</created>          <gmt_created>2015-12-03 21:38:18</gmt_created>          <changed>1475894769</changed>          <gmt_changed>2016-10-08 02:46:09</gmt_changed>      </item>          <item>          <nid>138961</nid>          <type>image</type>          <title><![CDATA[Piezoelectric-vision2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[piezoelectric-vision2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/piezoelectric-vision2_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/piezoelectric-vision2_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/piezoelectric-vision2_0.jpg?itok=enWm0nXW]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Piezoelectric-vision2]]></image_alt>                    <created>1449178698</created>          <gmt_created>2015-12-03 21:38:18</gmt_created>          <changed>1475894769</changed>          <gmt_changed>2016-10-08 02:46:09</gmt_changed>      </item>          <item>          <nid>138971</nid>          <type>image</type>          <title><![CDATA[Piezoelectric-vision4]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[piezoelectric-vision4.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/piezoelectric-vision4_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/piezoelectric-vision4_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/piezoelectric-vision4_0.jpg?itok=1rKi98zT]]></image_740>            <image_mime>image/jpeg</image_mime>            
<image_alt><![CDATA[Piezoelectric-vision4]]></image_alt>                    <created>1449178698</created>          <gmt_created>2015-12-03 21:38:18</gmt_created>          <changed>1475894769</changed>          <gmt_changed>2016-10-08 02:46:09</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="13887"><![CDATA[Jun Ueda]]></keyword>          <keyword tid="7699"><![CDATA[piezoelectric]]></keyword>          <keyword tid="37861"><![CDATA[piezoelectric actuator]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="167377"><![CDATA[School of Mechanical Engineering]]></keyword>          <keyword tid="820"><![CDATA[vision]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="133161">  <title><![CDATA[Robot Uses 3-D Imaging and Sensor-based Cutting Technology to Debone Poultry]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Researchers at the Georgia Tech Research Institute (GTRI) have developed a prototype system that uses advanced imaging technology and a robotic cutting arm to automatically debone chicken and other poultry products.</p><p>The Intelligent Cutting and Deboning System employs a 3-D vision system that determines where to cut a particular bird. The device automatically performs precision cuts that optimize yield, while also greatly reducing the risk of bone fragments in the finished product.</p><p>“Each bird is unique in its size and shape,” said Gary McMurray, chief of GTRI's Food Processing Technology Division. “So we have developed the sensing and actuation needed to allow an automated deboning system to adapt to the individual bird, as opposed to forcing the bird to conform to the machine.”</p><p>Poultry is Georgia's top agricultural product, with an estimated annual economic impact of nearly $20 billion statewide. Helping the poultry industry maximize its return on every flock can translate to important dividends. The research is funded by the state of Georgia through the Agricultural Technology Research Program at GTRI.</p><p>In the Intelligent Cutting and Deboning System, a bird is positioned in front of the vision system before each cut, explained GTRI research engineer Michael Matthews. The vision system works by making 3-D measurements of various location points on the outside of the bird. Then, using these points as inputs, custom algorithms define a proper cut by estimating the positions of internal structures such as bones and ligaments.</p><p>“Our statistics research shows that our external measurements correlate very well with the internal structure of the birds, and therefore will translate into ideal cutting paths,” Matthews said. “In our prototype device, everything is registered to calibrated reference frames, allowing us to handle all cut geometries and to precisely align the bird and the cutting robot. 
Being able to test all possible cut geometries should enable us to design a smaller and more simplified final system."</p><p>The prototype uses a fixed two-degree-of-freedom cutting robot for making simple planar cuts. The bird is mounted on a six-degree-of-freedom robot arm that allows alignment of the bird and cutting robot to any desired position. The robot arm places the bird under the vision system, and then it moves the bird with respect to the cutting robot.</p><p>The system employs a force-feedback algorithm that can detect the transition from meat to bone, said research engineer Ai-Ping Hu. That detection capability allows the cutting knife to move along the surface of the bone while maintaining a constant force.</p><p>Since ligaments are attached to bone, maintaining contact with the bone allows the knife to cut all the ligaments around the shoulder joint without cutting into the bone itself.&nbsp; A similar approach can be used for other parts of the bird where meat must be separated from bone.</p><p>Hu explained that the force-feedback algorithm uses a force sensor affixed to the knife handle. During a cutting operation, the sensor enables the robot to detect imminent contact with a bone. Then, instead of cutting straight through the bone, the system directs the cutting tool to take an appropriate detour around the bone.</p><p>"Fine tuning is needed to adjust the force thresholds, to be able to tell the difference between meat, tendon, ligaments and bone, each of which have different material properties,” Hu said.</p><p>McMurray said he expects the Intelligent Deboning System to match or exceed the efficiency of the manual process. Testing of the deboning prototype system, including cutting experiments, has confirmed the system’s ability to recognize bone during a cut and to avoid bone chips – thus demonstrating the validity of GTRI’s approach.</p><p>“There are some very major factors in play in this project,” McMurray said. “Our automated deboning technology can promote food safety, since bone chips are a hazard in boneless breast fillets. 
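<p>Hu’s description amounts to a proportional force-control loop with a detour rule. A minimal sketch of that logic follows; the interface names and all numeric values are illustrative assumptions, not the GTRI implementation:</p><pre><code># Hypothetical sketch of the force-threshold logic Hu describes: hold a
# target contact force so the blade rides the bone surface, and detour
# when force spikes. Interface names (read_force, move_knife) and all
# numeric values are illustrative assumptions, not the GTRI system.

BONE_CONTACT_N = 8.0   # force (N) signaling imminent bone contact (assumed)
TARGET_FORCE_N = 3.0   # constant force to hold along the bone (assumed)
GAIN = 0.5             # proportional gain for force regulation (assumed)

def cutting_step(read_force, move_knife):
    """One loop iteration; positive 'normal' moves away from the bone."""
    force = read_force()                  # newtons, from the handle sensor
    if force >= BONE_CONTACT_N:
        move_knife(normal=1.0)            # detour: back away from the bone
    else:
        # Regulate toward the target force so the knife severs ligaments
        # along the bone surface without cutting into the bone itself.
        error = TARGET_FORCE_N - force
        move_knife(normal=-GAIN * error)  # press in, or ease off, a little
</code></pre><p>As Hu notes above, the hard part in practice is tuning thresholds that distinguish meat, tendon, ligament and bone.</p>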
But it can also increase yield, which is significant because every 1 percent loss of breast meat represents about $2.5 million to each of Georgia’s 20 poultry processing plants.”</p><p><strong>Research News &amp; Publications Office</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>75 Fifth Street, N.W., Suite 314</strong><br /><strong>Atlanta, Georgia&nbsp; 30308&nbsp; USA</strong><br /><br /><strong>Media Relations Contacts</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>) or Abby Robinson (404-385-3364)(<a href="mailto:abby@innovate.gatech.edu">abby@innovate.gatech.edu</a>) or Kirk Englehardt (404-894-6015)(<a href="mailto:kirk.englehardt@comm.gatech.edu">kirk.englehardt@comm.gatech.edu</a>).</p><p><strong>Writer</strong>: Rick Robinson</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1338327826</created>  <gmt_created>2012-05-29 21:43:46</gmt_created>  <changed>1475896338</changed>  <gmt_changed>2016-10-08 03:12:18</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have developed an automated system for deboning poultry.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have developed an automated system for deboning poultry.]]></sentence>  <summary><![CDATA[<p>Researchers at the Georgia Tech Research Institute (GTRI) have developed a prototype system that uses advanced imaging technology and a robotic cutting arm to automatically debone chicken and other poultry products.</p>]]></summary>  <dateline>2012-05-29T00:00:00-04:00</dateline>  <iso_dateline>2012-05-29T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-05-29 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[System could help boost agricultural production, improve safety]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News &amp; Publications Office</p><p>(404) 894-6986</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>133131</item>          <item>133141</item>          <item>133151</item>      </media>  <hg_media>          <item>          <nid>133131</nid>          <type>image</type>          <title><![CDATA[Poultry Deboning System]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[poultry-deboning131.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/poultry-deboning131_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/poultry-deboning131_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/poultry-deboning131_0.jpg?itok=eLy7jW3u]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Poultry Deboning System]]></image_alt>                    <created>1449178659</created>          <gmt_created>2015-12-03 21:37:39</gmt_created>          <changed>1475894759</changed>          <gmt_changed>2016-10-08 02:45:59</gmt_changed>      </item>          <item>          <nid>133141</nid>          <type>image</type>          <title><![CDATA[Poultry Deboning System2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[poultry-deboning121.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/poultry-deboning121_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/poultry-deboning121_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/poultry-deboning121_0.jpg?itok=gIsO4tV3]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Poultry Deboning System2]]></image_alt>                    <created>1449178659</created>          <gmt_created>2015-12-03 21:37:39</gmt_created>          <changed>1475894759</changed>          <gmt_changed>2016-10-08 02:45:59</gmt_changed>      </item>          <item>          <nid>133151</nid>          <type>image</type>          <title><![CDATA[Poultry Deboning System3]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[poultry-deboning58.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/poultry-deboning58_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/poultry-deboning58_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/poultry-deboning58_0.jpg?itok=uNVOcEUL]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Poultry Deboning System3]]></image_alt>                    <created>1449178659</created>          <gmt_created>2015-12-03 21:37:39</gmt_created>          <changed>1475894759</changed>          <gmt_changed>2016-10-08 02:45:59</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="669"><![CDATA[agriculture]]></keyword>          <keyword tid="11470"><![CDATA[Gary McMurray]]></keyword>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="6057"><![CDATA[image]]></keyword>          <keyword tid="215"><![CDATA[manufacturing]]></keyword>          <keyword tid="668"><![CDATA[poultry]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="820"><![CDATA[vision]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="128531">  <title><![CDATA[Robot Reveals the Inner Workings of Brain Cells]]></title>  <uid>27206</uid>  <body><![CDATA[<p>Gaining access to the inner workings of a neuron in the living brain offers a wealth of useful information: its patterns of electrical activity, its shape, even a profile of which genes are turned on at a given moment. 
However, achieving this entry is such a painstaking task that it is considered an art form; it is so difficult to learn that only a small number of labs in the world practice it.</p><p>But that could soon change: Researchers at MIT and the Georgia Institute of Technology have developed a way to automate the process of finding and recording information from neurons in the living brain. The researchers have shown that a robotic arm guided by a cell-detecting computer algorithm can identify and record from neurons in the living mouse brain with better accuracy and speed than a human experimenter.</p><p>The new automated process eliminates the need for months of training and provides long-sought information about living cells’ activities. Using this technique, scientists could classify the thousands of different types of cells in the brain, map how they connect to each other, and figure out how diseased cells differ from normal cells.</p><p>The project is a collaboration between the labs of Ed Boyden, associate professor of biological engineering and brain and cognitive sciences at MIT, and <a href="http://www.me.gatech.edu/faculty/forest.shtml" target="_blank">Craig Forest</a>, an assistant professor in the <a href="http://www.me.gatech.edu" target="_blank">George W. Woodruff School of Mechanical Engineering at Georgia Tech</a>.</p><p>“Our team has been interdisciplinary from the beginning, and this has enabled us to bring the principles of precision machine design to bear upon the study of the living brain,” Forest says. His graduate student, Suhasa Kodandaramaiah, spent the past two years as a visiting student at MIT, and is the lead author of the study, which appears in the May 6 issue of <a href="http://dx.doi.org/10.1038/nmeth.1993" target="_blank"><em>Nature Methods</em></a>.</p><p>The method could be particularly useful in studying brain disorders such as schizophrenia, Parkinson’s disease, autism and epilepsy, Boyden says. “In all these cases, a molecular description of a cell that is integrated with [its] electrical and circuit properties … has remained elusive,” says Boyden, who is a member of MIT’s Media Lab and McGovern Institute for Brain Research. “If we could really describe how diseases change molecules in specific cells within the living brain, it might enable better drug targets to be found.”</p><p><strong>Automation</strong></p><p>Kodandaramaiah, Boyden and Forest set out to automate a 30-year-old technique known as whole-cell patch clamping, which involves bringing a tiny hollow glass pipette in contact with the cell membrane of a neuron, then opening up a small pore in the membrane to record the electrical activity within the cell. This skill usually takes a graduate student or postdoc several months to learn.</p><p>Kodandaramaiah spent about four months learning the manual patch-clamp technique, giving him an appreciation for its difficulty. “When I got reasonably good at it, I could sense that even though it is an art form, it can be reduced to a set of stereotyped tasks and decisions that could be executed by a robot,” he says.</p><p>To that end, Kodandaramaiah and his colleagues built a robotic arm that lowers a glass pipette into the brain of an anesthetized mouse with micrometer accuracy. As it moves, the pipette monitors a property called electrical impedance — a measure of how difficult it is for electricity to flow out of the pipette. If there are no cells around, electricity flows and impedance is low. 
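<p>The detection logic reduces to a threshold test on impedance as the pipette descends. A minimal sketch, assuming generic hardware interfaces and a relative jump threshold; the step size comes from the description of the robot, everything else is illustrative:</p><pre><code># Illustrative sketch of the stopping rule: step the pipette down and
# halt when impedance jumps, signaling membrane contact. The interface
# names and the 15 percent jump threshold are assumptions; the step
# size comes from the description of the robot.

STEP_UM = 2.0          # descent step, micrometers
JUMP_FACTOR = 1.15     # relative impedance rise treated as contact (assumed)

def descend_until_cell(measure_impedance, step_down, max_steps=5000):
    """Lower the pipette until impedance rises sharply, then stop."""
    baseline = measure_impedance()            # impedance with no cell near
    for _ in range(max_steps):
        step_down(STEP_UM)                    # advance two micrometers
        z = measure_impedance()               # sampled ~10 times per second
        if z > JUMP_FACTOR * baseline:        # flow blocked: cell contact
            return True                       # stop before piercing the cell
        baseline = 0.9 * baseline + 0.1 * z   # track slow drift (assumed)
    return False
</code></pre>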
When the tip hits a cell, electricity can’t flow as well and impedance goes up.</p><p>The pipette takes two-micrometer steps, measuring impedance 10 times per second. Once it detects a cell, it can stop instantly, preventing it from poking through the membrane. “This is something a robot can do that a human can’t,” Boyden says.</p><p>Once the pipette finds a cell, it applies suction to form a seal with the cell’s membrane. Then, the electrode can break through the membrane to record the cell’s internal electrical activity. The robotic system can detect cells with 90 percent accuracy, and establish a connection with the detected cells about 40 percent of the time.</p><p>The researchers also showed that their method can be used to determine the shape of the cell by injecting a dye; they are now working on extracting a cell’s contents to read its genetic profile.</p><p>Development of the new technology was funded primarily by the National Institutes of Health, the National Science Foundation and the MIT Media Lab.</p><p><strong>New era for robotics</strong></p><p>The researchers recently created a startup company, Neuromatic Devices, to commercialize the device.</p><p>The researchers are now working on scaling up the number of electrodes so they can record from multiple neurons at a time, potentially allowing them to determine how different parts of the brain are connected.</p><p>They are also working with collaborators to start classifying the thousands of types of neurons found in the brain. This “parts list” for the brain would identify neurons not only by their shape — which is the most common means of classification — but also by their electrical activity and genetic profile.</p><p>“If you really want to know what a neuron is, you can look at the shape, and you can look at how it fires. Then, if you pull out the genetic information, you can really know what’s going on,” Forest says. “Now you know everything. That’s the whole picture.”</p><p>Boyden says he believes this is just the beginning of using robotics in neuroscience to study living animals. A robot like this could potentially be used to infuse drugs at targeted points in the brain, or to deliver gene therapy vectors. He hopes it will also inspire neuroscientists to pursue other kinds of robotic automation — such as in optogenetics, the use of light to perturb targeted neural circuits and determine the causal role that neurons play in brain functions.</p><p>Neuroscience is one of the few areas of biology in which robots have yet to make a big impact, Boyden says. “The genome project was done by humans and a giant set of robots that would do all the genome sequencing. In directed evolution or in synthetic biology, robots do a lot of the molecular biology,” he says. “In other parts of biology, robots are essential.”</p><p>Other co-authors include MIT grad student Giovanni Talei Franzesi and MIT postdoc Brian Y. 
Chow.&nbsp;</p><p><strong>Research News &amp; Publications Office<br /> Georgia Institute of Technology<br /> 75 Fifth Street, N.W., Suite 314<br /> Atlanta, Georgia 30308 USA</strong></p><p><strong>Media Relations Contacts:</strong> Abby Robinson (abby@innovate.gatech.edu; 404-385-3364) or Caroline McCall (cmccall5@mit.edu; 617-253-1682)</p><p><strong>Writer: </strong>Anne Trafton, MIT News</p>]]></body>  <author>Abby Vogel Robinson</author>  <status>1</status>  <created>1336328111</created>  <gmt_created>2012-05-06 18:15:11</gmt_created>  <changed>1475896329</changed>  <gmt_changed>2016-10-08 03:12:09</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have automated the process of finding and recording information from neurons in the living brain.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have automated the process of finding and recording information from neurons in the living brain.]]></sentence>  <summary><![CDATA[<p>Researchers have automated the process of finding and recording information from neurons in the living brain. A robotic arm guided by a cell-detecting computer algorithm can identify and record from neurons in the living mouse brain with better accuracy and speed than a human experimenter.</p>]]></summary>  <dateline>2012-05-06T00:00:00-04:00</dateline>  <iso_dateline>2012-05-06T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-05-06 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Abby Robinson<br /> Research News and Publications<br /> <a href="mailto:abby@innovate.gatech.edu">abby@innovate.gatech.edu</a><br /> 404-385-3364</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>128501</item>          <item>128521</item>          <item>128511</item>      </media>  <hg_media>          <item>          <nid>128501</nid>          <type>image</type>          <title><![CDATA[Craig Forest robotic neural recordings]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[forest_autopatching_hires.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/forest_autopatching_hires_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/forest_autopatching_hires_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/forest_autopatching_hires_0.jpg?itok=g41Ushq_]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Craig Forest robotic neural recordings]]></image_alt>                    <created>1449178622</created>          <gmt_created>2015-12-03 21:37:02</gmt_created>          <changed>1475894751</changed>          <gmt_changed>2016-10-08 02:45:51</gmt_changed>      </item>          <item>          <nid>128521</nid>          <type>image</type>          <title><![CDATA[Whole-cell patching robot schematic]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autopatching_schematic_hires.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autopatching_schematic_hires_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/autopatching_schematic_hires_0.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autopatching_schematic_hires_0.jpg?itok=CrkDQ7zV]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Whole-cell patching robot schematic]]></image_alt>                    <created>1449178622</created>          <gmt_created>2015-12-03 21:37:02</gmt_created>          <changed>1475894751</changed>          <gmt_changed>2016-10-08 02:45:51</gmt_changed>      </item>          <item>          <nid>128511</nid>          <type>image</type>          <title><![CDATA[Neuromatic Devices research team]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autopatching_team_hires.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autopatching_team_hires_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/autopatching_team_hires_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autopatching_team_hires_0.jpg?itok=YpLCoG_m]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Neuromatic Devices research team]]></image_alt>                    <created>1449178622</created>          <gmt_created>2015-12-03 21:37:02</gmt_created>          <changed>1475894751</changed>          <gmt_changed>2016-10-08 02:45:51</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="1912"><![CDATA[brain]]></keyword>          <keyword tid="32681"><![CDATA[brain cell]]></keyword>          <keyword tid="594"><![CDATA[college of engineering]]></keyword>          <keyword tid="12333"><![CDATA[Craig Forest]]></keyword>          <keyword tid="32711"><![CDATA[electrical activity]]></keyword>          <keyword tid="7276"><![CDATA[neuron]]></keyword>          <keyword tid="1304"><![CDATA[neuroscience]]></keyword>          <keyword tid="32691"><![CDATA[patch clamp]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="167377"><![CDATA[School of Mechanical Engineering]]></keyword>          <keyword tid="32701"><![CDATA[whole-cell patch clamping]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="120151">  <title><![CDATA[Georgia Tech Innovations Help Expand U.S. Industrial Capabilities and Enhance Competitiveness]]></title>  <uid>27303</uid>  <body><![CDATA[<p>In a bustling laboratory at the Fuller E. Callaway Jr. 
Manufacturing Research Center, a researcher from the Georgia Tech School of Mechanical Engineering is using novel digital technology to cast complex metal parts directly from computer designs, dramatically reducing both development and manufacturing time.</p><p>Nearby, at the School of Industrial and Systems Engineering, researchers are working with a large U.S. avionics maker to speed new product production using specialized software that automatically generates simulations of the manufacturing process. And across campus in the College of Architecture, a team is working with an international corporation on digital techniques that allow entire concrete walls to be custom-manufactured to architectural specifications.</p><p>The Georgia Institute of Technology was founded in 1885 with a mandate to develop manufacturing capabilities in the state of Georgia. Today, researchers whose work directly supports manufacturers can be found throughout Georgia Tech’s academic colleges; in the Georgia Tech Research Institute, which focuses on applied research; and in the Enterprise Innovation Institute, which assists business and industry.</p><p>Georgia Tech’s role in supporting industry was highlighted in June 2011 when President Barack Obama named Georgia Tech President G.P. “Bud” Peterson to the steering committee of the Advanced Manufacturing Partnership (AMP). Georgia Tech joined five other leading universities – the Massachusetts Institute of Technology, Carnegie Mellon University, Stanford University, the University of California Berkeley and the University of Michigan – in the AMP’s $500 million push to guide investment in emerging technologies, increase overall U.S. global competitiveness and boost the supply of high-quality manufacturing jobs.</p><p>“We applaud this initiative, and Georgia Tech is honored to collaborate to identify ways to strengthen the manufacturing sector to help create jobs in Georgia and across the United States,” Peterson said. “Many of our challenges can be solved through innovation and fostering an entrepreneurial environment, as well as collaboration between industry, education and government to create a healthy economic environment and an educated workforce.”</p><p>Advanced manufacturing involves not only new ways to manufacture existing products, but also the development of new products emerging from advanced technologies, observed Stephen E. Cross, Georgia Tech’s executive vice president for research.</p><p>“Georgia Tech’s mandate has always been to support manufacturing and technology development in the state and in the nation – to conduct research with relevance – so supporting industry comes very naturally to us,” Cross said. “The leading-edge research across the Institute combines thought leadership with a focus on real-world problems and opportunities. Through this we will help lead a renaissance in advanced manufacturing in the United States.”</p><p>The university’s research initiatives on behalf of manufacturers are many and varied. These efforts include multiple areas of manufacturing-related research and involve collaboration across a variety of disciplines.</p><p><strong>Developing Novel Manufacturing Technologies</strong></p><p><em>Advancing Digital Manufacturing</em> --&nbsp;Suman Das, a professor in the George W. Woodruff School of Mechanical Engineering, has developed a technology that could transform how industry creates and produces complex metal parts through “lost wax” investment casting. 
In an ambitious project sponsored by the Defense Advanced Research Projects Agency (DARPA), he has created an all-digital approach that automates how part designs are turned into the real thing.</p><p>Currently, such metal parts are devised on computers using computer-aided design (CAD) software. But the next step – creating the ceramic mold with which the part is cast – involves a complex 12-step process that uses hundreds of tooling pieces and extensive manual labor. The result is a lengthy, costly and low-yield process that typically produces many scrap parts along with a few usable ones, said Das, who directs the Direct Digital Manufacturing Laboratory in Georgia Tech’s Manufacturing Research Center (MaRC).</p><p>By contrast, the approach used by Das involves building ceramic molds directly from a CAD design. Called large area maskless photopolymerization (LAMP), this high-resolution, direct digital manufacturing technology builds the molds, layer by layer, by projecting patterns of ultraviolet light onto a mixture of photosensitive resins and ceramic particles.</p><p>After a mold is formed, it is thermally post-processed at high temperatures to burn away the polymer and sinter the ceramic particles. That process forms a structure into which molten metal can be poured for casting.</p><p>“The LAMP process can reduce the time required to turn a CAD design into a test-worthy part from several months to about a week, and it can produce parts of a complexity that designers could only dream of before,” Das said. “It also can reduce costs by 25 percent and the number of unusable waste parts by more than 90 percent, while eliminating 100 percent of the tooling.”</p><p>Das is currently working with turbine-engine airfoils – complex parts used in aircraft jet engines – in collaboration with the University of Michigan, PCC Airfoils and Honeywell International Inc. He believes LAMP technology will become pervasive and will be effective in the production of many other types of metal parts.</p><p>Das said that LAMP can create not only testable prototypes, but could also be used in the actual manufacturing process, facilitating the mass production of complex metal parts at lower costs in a variety of industries.</p><p>A prototype LAMP alpha machine is currently building six typical airfoil molds in six hours. Das predicts that a larger beta machine – currently being built at Georgia Tech and scheduled for installation at a PCC Airfoils facility in Ohio in 2012 – will produce 100 molds in about 24 hours.</p><p>“When you can achieve those volumes, you have gone beyond rapid prototyping to true rapid manufacturing,” he said.</p><p><em>Customizing Building Components</em> --&nbsp;Researchers at the College of Architecture are also helping to automate the process of turning CAD designs into manufactured products. A team in the Digital Building Laboratory is collaborating with Lafarge North America to develop ways to manufacture customized wall structures directly from parametric digital models.</p><p>The new process involves custom-molding entire curtain walls from rubber negatives to produce a unitized system called the “Liquid Wall,” constructed with Ductal®, Lafarge’s ultra-high-performance concrete (UHPC), and stainless steel. 
The Liquid Wall, created by Peter Arbour of RFR Consulting Engineers and collaborator Coreslab Structures Inc., won the 2010 AIANY Open Call for Innovative Curtain-Wall Design.</p><p>“We don’t want to just pick standardized products out of catalogs anymore,” said Tristan Al-Haddad, an assistant professor in the College of Architecture who is involved in the collaboration with Lafarge, along with assistant professor Minjung Maing and others. “We’re developing the protocols and research to manufacture high-end customized architectural products economically, safely and with environmental responsibility.”</p><p>The Liquid Wall approach is challenging, explained professor Charles Eastman, who is director of the Digital Building Laboratory and has a joint appointment in the College of Computing. The process involves creating rubber negatives using wall-form designs created with parametric modeling software, then planning production procedures and mapping out ways to install the completed, full-size walls on actual buildings.</p><p>“When you’re creating a completely new process like the Liquid Wall, you’re faced with developing a whole new manufacturing process for this kind of material,” Eastman said.</p><p><em>Individualizing Mass Production</em> --&nbsp;Industrial designer Kevin Shankwiler, an associate professor in the College of Architecture, creates objects that can be both customized and mass-produced. By utilizing advances in flexible manufacturing technology, Shankwiler and his students develop furniture designs that can be changed to meet individual needs – such as those of persons with disabilities – while being built cost-effectively using mass production methods.</p><p>Today’s designers can build responsiveness to individual needs into the computer models used in production, Shankwiler said. Current manufacturing methods – such as computer-numerically-controlled (CNC) and 3-D printing techniques – are capable of creating furniture and other goods that can meet users’ specific requirements without resorting to an institutional look.</p><p>“In one research effort, we took a dining room chair in the Craftsman style, and we designed and built a model that could accommodate both wheelchair users of differing abilities and fully ambulatory people,” Shankwiler said. “We have to ask – how should the human need affect the manufactured output and what are the best methods for achieving that?”</p><p><em>Pursuing Micro-scale Machining</em> --&nbsp;J. Rhett Mayor, an associate professor in the School of Mechanical Engineering, is investigating techniques that allow effective machining of metal surfaces at 50 microns – about two-thousandths of an inch – or less. He is also developing unique applications based on advanced micro-machining, such as tiny channels in metal that enhance heat transfer between surfaces.</p><p>At present, Mayor explained, the ability to cut micro-features into surfaces is limited to metal sections about 1 centimeter square, a size that offers little cooling capability. Research being conducted by Mayor and his group focuses on scaling up micro-machining capabilities so that micro-features can be cut in larger metal sheets.</p><p>“We can currently make hundreds of features on a square centimeter,” Mayor said.
“What we need are millions of features on a square foot.”</p><p>One type of micro-scale feature – micro-channel heat exchangers – could play an important role in cooling factory-floor devices, as well as in the development of closed-loop systems that could generate power using recycled heat. For example, today’s factories typically use large electrical motors that vent their heat inside the plant, wasting energy.</p><p>In related work, Mayor and his team are developing optimization routines and thermal models that could enhance electrical machine design through the application of micro-machining and other technologies. The aim is to create machines that are smaller, yet offer high energy outputs thanks to more efficient cooling and to energy recycling.</p><p>Another application of large scale micro-machining could involve the development of lightweight electric actuators that would take the place of hydraulics in aircraft. Such electric actuators would need plenty of power to replicate the high torque provided by hydraulics; those power requirements would demand effective cooling strategies.</p><p><strong>Tackling Issues on the Factory Floor</strong></p><p><em>Promoting Factory Robotics</em> --&nbsp;Henrik Christensen, a professor in the College of Computing, is working with the Boeing Company to advance robotic manufacturing in the aircraft maker’s facilities.</p><p>In one project, Christensen and his team are working on an initiative that makes fundamental changes to how pieces are handled on the factory floor. In this approach, robots reverse the standard procedure by moving processing machines to a given part, rather than moving the part through an assembly line.</p><p>“Think of a large airplane structure,” Christensen said. “Having a machine move along the body of the aircraft, rather than moving the body itself, could result in much more efficient use of the machine.”</p><p>The team is employing a movable platform in the MaRC building that supports a robotic processing machine. Tests have already been performed using mobile painting and drilling capabilities that could lead to similar implementations at Boeing facilities.</p><p>Christensen has also developed automation technology that helps Boeing inspect parts and sub-assemblies that arrive from suppliers. The mobile robotic system scans each arriving piece to confirm that it is the correct item and conforms to the stipulated dimensions.</p><p>The technology allows Boeing to identify shipping errors almost immediately, before the mistake can delay production. It also saves on labor costs and allows workers to be assigned to less routine tasks.</p><p>The Boeing projects are part of the Aerospace Manufacturing Initiative (AMI), which was established in 2008 when Boeing identified Georgia Tech as a strategic university partner and agreed to collaborate on innovative manufacturing technologies for aerospace products. The AMI, which involves multiple research projects across Georgia Tech, is led by Steven Danyluk, who is the Morris M. Bryan Jr. Chair in Mechanical Engineering for Advanced Manufacturing Systems. Since 2008, Siemens USA and CAMotion Inc. have also become AMI participants.</p><p>In another project just getting launched with a major French manufacturing company, Christensen is pursuing novel technology that would allow a factory-floor robot to learn tasks via direct human demonstration. 
Rather than having each robotic operation mapped out laboriously on a control computer, a worker would demonstrate the optimal way to perform a job and the robot would then mimic the human.</p><p>This human-model approach to robotic learning could have applications across a number of industries, he added; both Boeing and General Motors have expressed interest in the technology. Other application areas for this technique include health care and biotechnology, where it could help automate both manufacturing procedures and laboratory testing.</p><p><em>Improving Online Production</em> --&nbsp;Jianjun (Jan) Shi, a professor in the H. Milton Stewart School of Industrial and Systems Engineering (ISyE), conducts research that addresses system informatics and control. He uses his training in mechanical and electrical engineering to integrate system data – comprising design, manufacturing, automation and performance information – into models that seek to reduce process variability.</p><p>In one effort, Shi is working with nGimat Co., a Norcross, Ga.-based company that is currently evaluating ways to mass-produce a type of nanopowder used in high-energy, high-density batteries for electric cars. With sponsorship from the Department of Energy (DOE), Shi is supporting nGimat as it works to increase nanopowder output by several orders of magnitude.</p><p>“This product has very good characteristics, and the task here is to scale up production while maintaining the quality,” said Shi, who holds the Carolyn J. Stewart Chair in ISyE. “We must identify the parameters – what to monitor, what to control – to reduce any variability, and do so in an environmentally friendly way.”</p><p>In work focusing on the steel industry, Shi is pursuing multiple projects including the investigation of sensing technologies used to monitor very high-temperature environments in steel manufacturing. With DOE support, he is working with OG Technologies Inc. to develop methods that use optical sensors to provide continuous high-speed images of very hot surfaces – between 1,000 and 1,450 degrees Celsius.</p><p>“We want to catch defect formation in the very early stages of manufacturing,” Shi said. “By using imaging data of the product effectively with other process data to eliminate defects, we can help optimize the casting process.”</p><p>In another project, sponsored by the National Science Foundation (NSF), Shi is investigating ways to use process measurements and online adjustments to improve quality control in the manufacturing of the silicon wafers used in semiconductors. He is working with several manufacturers to examine the root causes of undesirable geometric defects in wafer surfaces.</p><p><em>Anticipating System Failure</em> --&nbsp;Nagi Gebraeel, an associate professor in the School of Industrial and Systems Engineering, conducts research in detecting and preventing failure in engineering systems as they degrade over time. The goal is to avoid both expensive downtime and unnecessary maintenance costs.</p><p>“We could be talking about a fleet of aircraft, trucks, trains, ships – or a manufacturing system,” Gebraeel said. “In any of these cases, it’s extremely useful for numerous reasons to be able to accurately estimate the remaining useful lifetime of a system or its components.”</p><p>With National Science Foundation (NSF) funding, Gebraeel has examined some of the key challenges in accurately predicting failures of complex engineering systems.
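<p>At its simplest, that prediction task can be framed as extrapolating a degradation signal to a failure threshold. A toy sketch under strong simplifying assumptions – a linear trend and a known threshold, neither of which reflects Gebraeel’s actual models:</p><pre><code># A toy version of the remaining-useful-life question Gebraeel poses:
# fit a trend to a sensor-derived degradation signal (e.g., vibration
# amplitude) and extrapolate to a failure threshold. The linear trend
# and the threshold value are illustrative assumptions only.
import numpy as np

def remaining_life(times, signal, failure_level):
    """Estimate time until the degradation signal reaches failure_level."""
    slope, intercept = np.polyfit(times, signal, 1)  # linear trend fit
    if slope <= 0:
        return float("inf")                          # no measurable decay
    t_fail = (failure_level - intercept) / slope     # threshold crossing
    return max(t_fail - times[-1], 0.0)              # time left from now

# Example: vibration amplitude drifting upward over 5 inspection epochs.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
vib = np.array([0.10, 0.14, 0.17, 0.22, 0.25])
print(remaining_life(t, vib, failure_level=0.60))    # ~9.2 epochs remain
</code></pre>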
Specific challenges include the ability to account for the uncertainty associated with degradation processes of these systems and their components, the effects of future environmental/operational conditions, and the dependencies and interactions that exist in multi-component systems.</p><p>In one project, Gebraeel and his team worked with Rockwell Collins, a maker of avionics and electronics, to monitor and diagnose the performance of circuit boards that control vital aircraft communications systems.</p><p>With equipment funding provided by Georgia Tech, Gebraeel has developed an adaptive prognostics system (APS), a custom research tool that allows him to investigate how quickly components degrade under stresses, using sensor-detected signals such as vibration.</p><p>“There’s a real need for information about the remaining life of components, so that users can find the economical middle ground between the cost of scheduled replacements and the cost of failure,” he said.</p><p><em>Maximizing Throughput with Software</em> --&nbsp;Three faculty members in the School of Industrial and Systems Engineering – Shabbir Ahmed, George Nemhauser and Joel Sokol – recently completed a project supporting a major maker of float glass. The manufacturer was automating a process in which finished glass plates are packed for shipment.</p><p>The company was concerned that new machines – which pick up and remove glass from the production line – might fall behind, allowing valuable plates to be damaged. They wanted the capability to carefully schedule production sequences so the machines could function at maximum capacity without wasting plates.</p><p>The team tackled development of new software that could minimize production problems. They devised algorithms that allowed the machines to run at maximum capacity while handling input data with more than 99 percent efficiency.</p><p>“The algorithms we delivered can also be used strategically, to determine how many machines of each type should be installed on a production line,” Sokol said.</p><p>Sokol, Nemhauser and Ahmed are also collaborating on a project with a large international corporation to support production throughput at a semiconductor manufacturing facility.</p><p>The challenge involves the physical movement of semiconductors from one processing station to another throughout the factory. Because the routing of semiconductors between processing machines can differ from item to item, there’s no linear assembly line procedure; instead, hundreds of automated vehicles pick up items from one processing point and move them to the next step.</p><p>Due to the facility’s layout, these automated vehicles often encounter congestion that can delay the production schedule, said Nemhauser, who is the A. Russell Chandler III Chair and Institute Professor. The team is developing methods to best route and schedule the vehicles to minimize congestion and to move items between machines in ways that don’t delay production.</p><p><em>Increasing Manufacturing Precision</em> --&nbsp;Shreyes Melkote, who is the Morris M. Bryan Jr. professor in mechanical engineering, directs the Precision Machining Research Center, one of numerous centers based in MaRC.
Melkote researches precision manufacturing issues in several areas, including the production of precision metal parts and photovoltaic substrates.</p><p>In a project sponsored by The Timken Company, Melkote is investigating methods for faster and more efficient machining of hardened steel materials using a hybrid process called “Laser Assisted Hard Machining.” Results from successful machining trials have demonstrated that this hybrid process has the potential to reduce machining time as well as cutting tool cost by prolonging tool life.</p><p>In a Boeing-sponsored project, Melkote is developing thin-film sensors capable of monitoring high-speed machining operations. The goal is to give operators in-depth feedback for more effective control of high-speed rotating machines used to produce aerospace parts.</p><p>Traditional piezoelectric sensors are costly and unreliable, Melkote said, and installing them on a given machine can alter its dynamic characteristics. By contrast, sensors made from low-cost piezoelectric polymer film can be attached to a rotating device without affecting its operation. A patent application is being filed on this sensor technology.</p><p>“Thin-film sensors allow us to accurately measure what’s happening between the tool and the work-piece, in terms of forces, vibrations, deflections and other process responses,” he said. “We have demonstrated that the quality of information we are getting from a $200 sensor is as good as from one that costs $30,000.”</p><p><strong>Innovations in Manufacturing Systems and Processes</strong></p><p><em>Automating Manufacturing Simulations</em> --&nbsp;Professor Leon McGinnis of the School of Industrial and Systems Engineering focuses on model-based systems engineering, an approach that uses computational methods to enable capture and reuse of systems knowledge. McGinnis is pursuing several sponsored projects in this area.</p><p>In one effort, McGinnis and his team have been working with Rockwell Collins, a maker of avionics and electronics, to help speed the introduction of new products by automating a process that simulates the requirements of production.</p><p>To optimize the resources needed to make products at the required rate, McGinnis explained, Rockwell Collins creates a computerized simulation of the manufacturing processes. Development of these models has traditionally been the province of experts skilled in taking initial system designs and painstakingly translating them into simulations of actual production.</p><p>“This is not a trivial task – producing a simulation model requires some 100 to 200 hours per product,” said McGinnis, who is associate director of MaRC. “The company was only able to generate a few production models at a time, which created something of a bottleneck.”</p><p>To understand the process of developing simulation models, a team interviewed the Rockwell Collins experts on the methods they used to develop such models. Then the Georgia Tech researchers turned to SysML, a programming language that enables the computerized modeling of complex systems, including multiple related factors such as people, machinery and product flows.</p><p>By using SysML to describe the evolution of a given product, the researchers were able to automate its movement from design to simulation. Even more important, the team created a domain-specific version of SysML that was customized to the Rockwell Collins environment. 
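<p>The generation step can be pictured in miniature: a declarative system description goes in, an executable performance model comes out. The schema and the simple capacity calculation below are invented for illustration – real SysML models capture people, machinery and product flows at far greater fidelity:</p><pre><code># A miniature stand-in for the model-to-simulation generation McGinnis
# describes: compile a declarative description of a production line into
# an executable performance model. The schema (stations with rates and
# staffing) and all numbers are invented for illustration.

model = {  # a toy "domain-specific" product/process description
    "stations": [
        {"name": "kitting",  "minutes_per_unit": 4.0, "servers": 1},
        {"name": "assembly", "minutes_per_unit": 9.0, "servers": 2},
        {"name": "test",     "minutes_per_unit": 6.0, "servers": 1},
    ]
}

def compile_to_performance_model(model):
    """'Generate' an analytic model: per-station capacity and bottleneck."""
    rates = {
        s["name"]: s["servers"] / s["minutes_per_unit"]  # units per minute
        for s in model["stations"]
    }
    bottleneck = min(rates, key=rates.get)
    return rates, bottleneck, rates[bottleneck] * 60     # max units/hour

rates, bottleneck, uph = compile_to_performance_model(model)
print(bottleneck, round(uph, 1))  # "test" limits the line to 10.0 units/hour
</code></pre>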
That achievement allowed any of the company’s new products and systems to be plugged into a SysML-based automation process.</p><p>This new way of doing things appears to reduce the time required to build simulation models by an order of magnitude, said McGinnis, who leads the Model-Based Systems Engineering Center in MaRC.</p><p>In another project, McGinnis and his team are collaborating with the School of Mechanical Engineering and MaRC to develop semantics for manufacturing processes under a DARPA contract. In other work, McGinnis is collaborating with the Tennenbaum Institute – a Georgia Tech organization that supports research for enterprise transformation – to address the challenges of identifying and mitigating risks in global manufacturing enterprise networks.</p><p><em>Developing Future Factories</em> --&nbsp;A research team from the Georgia Tech Research Institute (GTRI) is working with the General Motors Co. to develop novel sensor and computer technologies for manufacturing.</p><p>The project, known as the Factory of the Future, seeks to establish a manufacturing model based on approaches and technologies that are largely new to factory design and processes. Among other things, the researchers are investigating the use of biologically inspired software algorithms to help maximize plant floor efficiency.</p><p>“The future factory is one with an extremely agile environment, allowing the manufacturing plant to be reconfigured in real time to meet the objectives for production,” said Gisele Bennett, director of the Electro-Optical Systems Laboratory at GTRI.</p><p>At the heart of this process improvement approach is a robust combination of sensor and intelligent algorithm technologies, said Bennett, who is leading the project. The resulting optimization algorithms would utilize asset visibility of supplies, machines and vehicle-assembly status to optimize the manufacturing process, based on current requirements that could include energy savings, throughput or cost.</p><p>The goal is a broad, centralized view of all aspects of the manufacturing process, available in real time. This big-picture capability could lead to greater efficiency and productivity due to improved routing, inventory control and visibility into the health of the manufacturing equipment.</p><p>“Among other things, these techniques could support a capability for just-in-time car building,” Bennett said. “A consumer could go into a dealership, choose the car they wanted – and as soon as the car is specified, its assembly would begin remotely.”</p><p><em>Advancing the Adaptive Process</em> --&nbsp;A multidisciplinary team of Georgia Tech researchers is taking part in the Adaptive Vehicle Make (AVM) program. The four-year DARPA program, announced in the first half of 2011, fosters novel approaches to the design, verification and manufacturing of complex defense systems and vehicles. Funding for Georgia Tech’s share of the work is expected to exceed $10 million.</p><p>The AVM effort consists of three primary programs: META, Instant Foundry Adaptive through Bits (iFAB) and Fast Adaptable Next-Generation Ground Vehicle (FANG). FANG includes the vehicleforge.mil project and the Manufacturing Experimentation and Outreach (MENTOR) effort.</p><p>Georgia Tech is collaborating with Vanderbilt University on the META program and the related Component, Context, and Manufacturing Model Library (C2M2L) program. 
Led by professor Dimitri Mavris, director of the Aerospace Systems Design Lab, and research engineer Johanna Ceisel, Georgia Tech’s META effort focuses on dramatically improving the existing systems engineering, integration and testing processes for defense systems.</p><p>Rather than utilizing one particular alternative technique, metric or tool, META aims to develop model-based design methods for cyber-physical systems that are far more complex and heterogeneous than those in use today.</p><p>Shreyes Melkote, a professor in the School of Mechanical Engineering, leads an iFAB team that is developing manufacturing-process capabilities and model libraries to enable automated planning for the design and manufacture of military ground vehicles.</p><p>A GTRI team led by Vince Camp is also supporting iFAB, providing process guidance for development of the libraries. In addition, researchers from four Georgia Tech units, along with companies InterCAX LLC and Third Wave Systems Inc., are supporting this iFAB effort.</p><p>The vehicleforge.mil project, led by GTRI researchers Jack Zentner and Nick Bollweg, is creating a secure central website and other web-based tools capable of supporting collaborative vehicle development. The core website – vehicleforge.mil – would allow individuals and teams to share data, models, tools and ideas to speed and improve the design process.</p><p>“The aim here is to fundamentally change the way in which complex systems are taken from concept to reality,” said Zentner, a senior research engineer. “By enabling many designers in varied locations to work together in a distributed manner, we’re confident that vehicles – and eventually other systems – can be developed with greater speed and better results.”</p><p>The C2M2L model library is part of the overall effort. C2M2L seeks to develop domain-specific models to enable the design, verification and fabrication of the FANG infantry fighting vehicle using the META, iFAB and vehicleforge.mil infrastructure.</p><p>The MENTOR effort will engage high school-age students in a series of collaborative design and distributed manufacturing prize-challenge experiments, with the goal of inspiring America’s manufacturing and technology workforce of tomorrow.</p><p>DARPA envisions that the prize challenges will include up to 1,000 high schools in teams distributed across the nation and around the world, using computer-numerically-controlled (CNC) additive manufacturing machines – also known as 3D printers. The goal is to help students collaboratively design and build systems of moderate complexity, such as mobile ground and aerial robots and energy systems.</p><p>MENTOR is led by professor Daniel Schrage of the School of Aerospace Engineering and director of the Integrated Product Lifecycle Engineering Laboratory, and by professor David Rosen of the School of Mechanical Engineering, who is also director of the Rapid Prototyping &amp; Manufacturing Institute in MaRC.</p><p><em>Strengthening Supply Chains</em> --&nbsp;Vinod Singhal, who is the Brady Family Professor of Operations Management in the College of Management, investigates supply chain disruptions and their relation to corporate performance. In one project, he is evaluating recent disruptions at manufacturing companies and other businesses, documenting the magnitude of stock-price drops, revenue losses and cost increases that these disruptions cause.</p><p>“Traditional approaches to supply chain management have focused only on efficiency,” Singhal said.
“Newer approaches involve avoiding value destruction by instituting a reliable, responsive and robust supply chain.”</p><p>Singhal has developed a detailed framework that helps enterprises manage their supply chain risks. His research instructs companies on how to prioritize risks, making supply chain vulnerabilities more visible and ensuring that top management learns to recognize the issue as critical to corporate success.</p><p><em>Modeling Flexibility</em> --&nbsp;In the College of Management, Regents’ professor Cheryl Gaimon studies technology management in manufacturing and service enterprises. In one study, Gaimon and former Ph.D. student Alysse Morton analyzed the value of flexibility in high-volume manufacturing of products with short life cycles, such as computer components.</p><p>The researchers developed a model showing how companies could link internal manufacturing capabilities with swiftly changing external market forces. They demonstrated how these businesses could exploit manufacturing efficiencies, early market entry and quick shifts between product generations, combined with optimal pricing policies.</p><p>“Our results demonstrated that firms need to work closely with their equipment suppliers to achieve more flexible technology, and that even a less-efficient facility can realize a long-term competitive advantage through an earlier market-entry strategy,” Gaimon said.</p><p><em>Lowering Quality-Failure Impact</em> --&nbsp;Assistant Professor Manpreet Hora of the College of Management conducts research in several areas of business and manufacturing, including the recall of products such as automobiles. In a recent study, he looked at the risks that can sometimes be created by today’s lean manufacturing methods.</p><p>In studying automotive recalls, Hora discovered that because companies often share components across multiple vehicle lines to maintain lean practices, a potential defect in such components can greatly increase the cost and the magnitude of a recall. He concluded that increased quality checks of shared and critical parts are essential in lowering the impact of quality failures from recalls.</p><p><strong>Helping Manufacturers Improve Products</strong></p><p><em>Reducing Engine Noise</em> --&nbsp;In a project sponsored by EADS North America, a large aerospace and defense company, GTRI researcher Jason Nadler tackled the problem of reducing the noise produced by commercial and military jet aircraft.</p><p>Nadler and his team used innovative materials that make possible a new approach to the physics of noise reduction. They found that honeycomb-like structures composed of many tiny tubes or channels can reduce sound more effectively than conventional methods.</p><p>“This approach dissipates acoustic waves by essentially wearing them out,” Nadler said. “It’s a phenomenological shift, fundamentally different from traditional techniques that absorb sound using a more frequency-dependent resonance.”</p><p>Nadler’s research involves broadband acoustic absorption, a method of reducing sound that doesn’t depend on frequencies or resonance. Instead of resonating, sound waves plunge into the channels and dissipate through a process called viscous shear.</p><p>He has developed what could be the world’s first micro honeycomb made from a nickel-based superalloy.
He estimates that this new approach could provide better sound attenuation than any acoustic liner currently available.</p><p><em>Improving Poultry Production</em> --&nbsp;The Food Processing Technology Division of GTRI performs a broad spectrum of research for the food industry, including numerous projects that support the state’s nearly $20 billion poultry industry. Research areas include advanced imaging and sensor technologies; robotics and automation systems; environmental and biological systems; food and product safety research; and worker safety research.</p><p>In one project, GTRI researchers are employing image processing, statistical modeling, modeling of biomaterials and high-speed force control to bring automated chicken deboning to poultry processors. The Intelligent Deboning System aims to match or exceed the efficiency of the manual process.</p><p>Initial tests of the deboning prototype system, including cutting experiments, have shown the system’s ability to recognize bone during a cut and thus avoid bone chips. The work has demonstrated the validity of GTRI’s approach.</p><p>“There are some very major factors in play in this project,” said Gary McMurray, chief of the Food Processing Technology Division and project director. “These include food safety – because bone chips are a major hazard for boneless breast fillets – and yield, because every 1 percent loss of breast meat represents about $2.5 million to each of Georgia’s 20 processing plants.”</p><p><em>Controlling Baking Systems</em> --&nbsp;GTRI has developed a production line system that automatically inspects the quality of sandwich buns exiting the oven and adjusts oven temperatures if it detects unacceptable products.</p><p>Working with baking company Flowers Foods and AMF/BakeTech, a baking equipment manufacturer, GTRI researchers Douglas Britton and Colin Usher have tested their industrial-quality prototype system. Made of stainless steel, the system is dust-and-water-resistant, and mounts on existing conveyor belts as wide as 50 inches.</p><p>The researchers tested the system in a Flowers Foods bakery.</p><p>“We have closed the loop between the quality inspection of buns and the oven controls to meet the specifications required by food service and fast-food customers,” said Britton. “By creating a more accurate, uniform and faster assessment process, we are able to minimize waste and lost product.”</p><p><em>Testing Manufacturing Materials</em> --&nbsp;The GTRI Materials Analysis Center (MAC), led by Lisa Detter-Hoskin, supports manufacturers and other groups with advanced analytical tools and methodologies that address materials characterization, failure analysis and corrosion issues. MAC annually manages research projects and evaluates samples for hundreds of corporations and agencies.</p><p>For example, the center supports CE-Tech LLC of Alpharetta, Ga., in numerous areas, including analyses of competitive products and resins. The objective is to lower raw-material costs for CE-Tech clients through the substitution of lower-cost resins.</p><p>In another instance, GTRI works with Fairfield, Conn.-based Acme United Corp., a maker of cutting, measuring and safety products, to evaluate the chemistry and structure of new surface coatings.
In one project, GTRI personnel tested a proprietary Acme United physical vapor deposition technology used to impart a hard outer shell onto steel blades.</p><p>“We frequently need to test,” said Larry Buchtmann, vice president for technology for Acme United. “GTRI has the specialized equipment and trained engineering staff to meet our ongoing needs for these services.”</p><p><em>Assessing Advanced Electronics</em> --&nbsp;GTRI’s Electromagnetic Test and Evaluation Facilities (EMTEF) and Electromagnetic Phenomenology Laboratory provide ongoing testing, research and support for manufacturers. Both commercial customers and the U.S. government use these assets to aid the design and manufacture of antennas and antenna-related sensors for wireless systems, cell and base station antennas, aircraft antennas and related applications.</p><p>“These multi-purpose ranges allow antenna manufacturers or design engineers to confirm modeling designs, diagnose performance problems, and to confirm performance against advertised specifications,” said GTRI researcher Barry Mitchell.</p><p>In one past instance, Mitchell recalls, a maker of aircraft weather radar was encountering problems with false alarms coming from wind-shear detection systems in flight. A GTRI team tested a waveguide antenna array on a planar near-field range belonging to the research institute, and the resulting aperture holograms revealed leakage points from brazed joints on the array. Eventually the problem was traced to a defect in the dip-brazing process during manufacturing, enabling corrective measures.</p><p><strong>Making Manufacturing More Sustainable</strong></p><p><em>Supporting Sustainable Manufacturing</em> --&nbsp;School of Mechanical Engineering professor Bert Bras, who leads the Sustainable Design and Manufacturing (SDM) Program in the MaRC, focuses on reducing the environmental impact of materials, products and manufacturing processes, while increasing their competitiveness.</p><p>The SDM group gets a large share of its research funding from industry. Together with MaRC research engineer Tina Guldberg, Bras and his group are currently working with Ford, GM and Boeing on projects related to sustainable manufacturing. Much of their work centers on a better understanding of the overall effect of manufacturing operations, as well as potential unintended consequences of product, process and business decisions over their life cycle.</p><p>One technique developed by Bras and his students involves the inclusion of environmental impact measures such as energy and water consumption in activity-based cost models. In this way, a single assessment model can quantify the financial and environmental consequences of manufacturing process choices (a toy sketch of this idea appears below).</p><p>With Marc Weissburg, a professor in the School of Biology and co-director of the Center for Bio-Inspired Design, Bras and his team are working on an NSF-funded project focused on the role of biologically inspired design in industrial manufacturing networks.</p><p>Bras is also collaborating with professor Nancey Green Leigh of the School of City and Regional Planning and professor Steven French of the College of Architecture on an NSF-funded project that studies methods of boosting product and material recovery in urban areas for use in local manufacturing. 
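</p><p>Here is the toy activity-based costing sketch promised above: each activity carries a dollar rate plus energy and water intensities, so one roll-up compares process choices on all three measures at once. The activities and rates are invented for illustration, not data from the SDM group.</p><pre><code># Toy activity-based cost model that carries environmental measures
# alongside dollars. Activities and rates are invented for illustration.

ACTIVITY_RATES = {
    # activity: (dollars, kWh, liters of water) per unit of driver
    "machining": (12.0, 3.0, 1.5),
    "painting":  (8.0, 1.2, 20.0),
    "assembly":  (5.0, 0.4, 0.2),
}

def assess(process_plan):
    """Roll up cost, energy and water for a {activity: units} plan."""
    totals = {"dollars": 0.0, "kwh": 0.0, "liters": 0.0}
    for activity, units in process_plan.items():
        dollars, kwh, liters = ACTIVITY_RATES[activity]
        totals["dollars"] += dollars * units
        totals["kwh"] += kwh * units
        totals["liters"] += liters * units
    return totals

# One assessment model, two candidate process choices.
plan_a = {"machining": 10, "painting": 2, "assembly": 5}
plan_b = {"machining": 6, "painting": 4, "assembly": 8}
for name, plan in (("plan A", plan_a), ("plan B", plan_b)):
    print(name, assess(plan))
</code></pre><p>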
As part of the same NSF project, Leigh and French are quantifying the amount of carpet and electronic waste generated in a metropolitan area and the economic benefits of diverting it from landfills, thereby creating business and job opportunities.</p><p><em>Recovering and Reusing Waste</em> --&nbsp;Jane Ammons, who is the H. Milton and Carolyn J. Stewart School Chair in the School of Industrial and Systems Engineering, collaborates on reverse production systems with Matthew Realff, a professor in the School of Chemical &amp; Biomolecular Engineering. For more than 10 years, the team has focused on two important areas: the recovery and reuse of carpet wastes and ways to reduce electronic waste.</p><p>Ammons, Realff and their teams have developed a mathematical framework to support the growth of used-carpet collection networks. Such networks could help to recycle much of the 3.4 billion pounds of carpet waste currently produced in the United States annually. Research indicates that successful reuse of that carpet has a potential value of at least $850 million, versus a disposal cost of at least $60 million for simply sending it to landfills.</p><p>In other work, the team is studying the problem of e-waste – unwanted electronic components such as televisions, monitors and computer boards and chips. The e-waste stream includes hazardous materials such as lead and other toxins, yet effective management and reuse of e-components can be profitable. Ammons and Realff have devised mathematical models that address the complexities of e-waste processing, with the goal of helping recycling companies stay economically viable.</p><p><em>Promoting Manufacturing Sustainability</em> --&nbsp;In a recent project, associate professor Chen Zhou in the School of Industrial and Systems Engineering, working with professor Leon McGinnis, tackled sustainability issues for a major U.S. manufacturer. The issue involved shipping gearbox components from China to the United States in ways that would minimize not only cost but also greenhouse gas emissions and waste.</p><p>It turned out that packaging was at the heart of the issue. The researchers had to configure component packaging so that the maximum number of components could be placed in a cargo container, yet also allow for optimal recycling of the packing materials to avoid waste and unnecessary cost (a toy version of this tradeoff is sketched at the end of this article).</p><p>“This was definitely a complex problem,” Zhou said. “You must track every piece of packaging from its source to its final resting place, when it either goes into another product or into a landfill.”</p><p>The team created a model – a globally sourced auto parts packaging system – that optimized cargo container space. The model also enabled the use of packing materials that were fully reusable; some materials went back to China for use in future shipments, while the rest was recycled into plastics for new vehicles.</p><p>Clearly, Georgia Tech’s broad-based involvement in advanced manufacturing research reflects both the talents of its faculty and the determination of U.S. industry to reinvent itself with the help of university-based research.</p><p>The United States generates more inventions than the rest of the world combined, and Georgia Tech will continue to work with business and government to help turn the nation’s vast innovative capabilities into an American industrial renaissance.</p>
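<p>As promised above, here is a toy version of the packaging tradeoff Zhou describes, comparing a bulky single-use configuration with a denser, mostly reusable one. The container volume, pack sizes and reuse fractions are invented for illustration; this is not the team’s actual model.</p><pre><code># Toy comparison of two packaging configurations for container shipping.
# Volumes, counts and reuse fractions are invented for illustration.

CONTAINER_M3 = 67.0        # assumed usable volume of one cargo container

def evaluate(pack_m3, parts_per_pack, reusable_fraction):
    packs = int(CONTAINER_M3 // pack_m3)     # packs that fit per container
    parts = packs * parts_per_pack           # parts shipped per container
    waste = packs * (1.0 - reusable_fraction)
    return parts, waste

# Config A: bulky, single-use packaging. Config B: denser, mostly reusable.
for name, cfg in (("A", (0.50, 8, 0.10)), ("B", (0.40, 8, 0.90))):
    parts, waste = evaluate(*cfg)
    print(f"config {name}: {parts} parts/container, "
          f"{waste:.0f} single-use pack-equivalents of waste")
</code></pre>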
<p><em>This article originally appeared in the Winter 2012 issue of Research Horizons magazine. Abby Robinson also contributed to this article.</em></p><p><em>Research projects mentioned in this article are supported by sponsors that include the National Science Foundation (NSF) and the Defense Advanced Research Projects Agency (DARPA). Any opinions, findings, conclusions or recommendations expressed in this publication are those of the principal investigators and do not necessarily reflect the views of the NSF or DARPA.&nbsp;</em></p><p><strong>Research News &amp; Publications Office</strong></p><p><strong>Georgia Institute of Technology</strong></p><p><strong>75 Fifth Street, N.W., Suite 314</strong></p><p><strong>Atlanta, Georgia &nbsp;30308 &nbsp;USA</strong></p><p><strong>Media Relations Contacts</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>) or Abby Robinson (404-385-3364)(<a href="mailto:abby@innovate.gatech.edu">abby@innovate.gatech.edu</a>).</p><p><strong>Writer</strong>: Rick Robinson</p><p>&nbsp;</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1332940889</created>  <gmt_created>2012-03-28 13:21:29</gmt_created>  <changed>1475896316</changed>  <gmt_changed>2016-10-08 03:11:56</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Innovations being developed at Georgia Tech are improving U.S. manufacturing capabilities.]]></teaser>  <type>news</type>  <sentence><![CDATA[Innovations being developed at Georgia Tech are improving U.S. manufacturing capabilities.]]></sentence>  <summary><![CDATA[<p>Advanced manufacturing is a major area of research at Georgia Tech, involving faculty members from academic colleges, as well as the Georgia Tech Research Institute (GTRI) and the Enterprise Innovation Institute (EI2). Activities focus on a broad range of areas, including new manufacturing technologies, factory-floor issues, manufacturing systems, product improvements and sustainability.</p>]]></summary>  <dateline>2012-03-28T00:00:00-04:00</dateline>  <iso_dateline>2012-03-28T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-03-28 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Advanced manufacturing is a top priority for research programs campuswide]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News &amp; Publications Office</p><p>(404) 894-6986</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>120101</item>          <item>120111</item>          <item>120121</item>          <item>120131</item>          <item>120141</item>      </media>  <hg_media>          <item>          <nid>120101</nid>          <type>image</type>          <title><![CDATA[Custom Wall Structures]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[al-haddad141.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/al-haddad141_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/al-haddad141_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/al-haddad141_1.jpg?itok=fPaNpRJZ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Custom Wall Structures]]></image_alt>                    <created>1449178268</created>          
<gmt_created>2015-12-03 21:31:08</gmt_created>          <changed>1475894741</changed>          <gmt_changed>2016-10-08 02:45:41</gmt_changed>      </item>          <item>          <nid>120111</nid>          <type>image</type>          <title><![CDATA[Testing Polymer Materials]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[detter-hoskin50.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/detter-hoskin50_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/detter-hoskin50_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/detter-hoskin50_0.jpg?itok=q4IPd508]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Testing Polymer Materials]]></image_alt>                    <created>1449178268</created>          <gmt_created>2015-12-03 21:31:08</gmt_created>          <changed>1475894741</changed>          <gmt_changed>2016-10-08 02:45:41</gmt_changed>      </item>          <item>          <nid>120121</nid>          <type>image</type>          <title><![CDATA[Maskless Photopolymerization]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[suman-das152.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/suman-das152_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/suman-das152_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/suman-das152_0.jpg?itok=9LrAueXn]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Maskless Photopolymerization]]></image_alt>                    <created>1449178268</created>          <gmt_created>2015-12-03 21:31:08</gmt_created>          <changed>1475894741</changed>          <gmt_changed>2016-10-08 02:45:41</gmt_changed>      </item>          <item>          <nid>120131</nid>          <type>image</type>          <title><![CDATA[Movable Platform]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[christensen-robotics147.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/christensen-robotics147_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/christensen-robotics147_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/christensen-robotics147_0.jpg?itok=k-b7EDyM]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Movable Platform]]></image_alt>                    <created>1449178268</created>          <gmt_created>2015-12-03 21:31:08</gmt_created>          <changed>1475894741</changed>          <gmt_changed>2016-10-08 02:45:41</gmt_changed>      </item>          <item>          <nid>120141</nid>          <type>image</type>          <title><![CDATA[Model-based Systems Engineering]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[mcginnis2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/mcginnis2_0.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/mcginnis2_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/mcginnis2_0.jpg?itok=0LBDDApS]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Model-based Systems Engineering]]></image_alt>                    <created>1449178268</created>          <gmt_created>2015-12-03 21:31:08</gmt_created>          <changed>1475894741</changed>          <gmt_changed>2016-10-08 02:45:41</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="215"><![CDATA[manufacturing]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>          <term tid="39471"><![CDATA[Materials]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>          <term tid="39541"><![CDATA[Systems]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="70403">  <title><![CDATA[Georgia Tech Researchers Receive Three NSF Emerging Frontiers Awards]]></title>  <uid>27206</uid>  <body><![CDATA[<p>The National Science Foundation (NSF) has awarded $6 million to fund three projects involving researchers from the Georgia Institute of Technology. Each four-year, $2 million grant was awarded through the NSF's Division of Emerging Frontiers in Research and Innovation (EFRI).</p><p>"The EFRI research teams will probe some profound aspects of the interface of biology and engineering," said Sohi Rastegar, director of EFRI. "If they are successful, the principles and theories uncovered in their investigations could unlock many technological opportunities."</p><p>This year, 14 transformative, fundamental research projects were awarded EFRI grants in two emerging areas: technologies that build on understanding of biological signaling, and machines that can interact and cooperate with humans.</p><p>The three Georgia Tech projects include:</p><ul><li>Developing a "therapeutic robot" to help rehabilitate and improve motor skills in people with mobility problems;</li><li>Creating wearable sensors that allow blind people to "see" with their hands, bodies or faces;</li><li>Generating and rigorously testing quantitative models that describe spatial and temporal regulation of cell differentiation in tissues.</li></ul><p>The therapeutic robot could enhance, assist and improve motor skills in humans with varying motor capabilities and deficits. 
The goal of the project is to program a humanoid rehabilitation robot to perform a "partnered box step," which is a defined pattern of weight shifts and directional changes, solely based on interpreting movement cues from subtle changes in forces between the hands and arms of the robot and the person.</p><p>To do this, researchers at Georgia Tech and Emory University will study how humans use their muscles to walk, balance and generate force signals with the hands for guidance when moving in cooperation with another person. They will also study "rehabilitative partnered dance," which has been specifically adapted to help improve gait and balance in individuals with motor impairments.</p><p>"Our vision is to develop robots that will interact with humans as both assistants and movement therapists," explained principal investigator Lena Ting, an associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University. "We expect our project to have a long-term impact on quality of life of individuals with movement difficulties, such as those caused by Parkinson's disease, stroke and injury by improving fitness, motor skills and social engagement."</p><p>Working with Ting on the project are Emory University School of Medicine (geriatrics) assistant professor Madeleine Hackney, Coulter Department of Biomedical Engineering assistant professor Charlie Kemp and Georgia Tech School of Interactive Computing assistant professor Karen Liu.</p><p>For the second project, researchers at Georgia Tech and The City College of New York will investigate devices for "alternative perception" and the principles underlying the human-machine interaction. Alternative perception combines electronics and the other senses to emulate vision. In addition to aiding the visually impaired, the findings are expected to have other applications, such as the development of intelligent robots.</p><p>The researchers plan to untangle how humans learn to coordinate input from their senses -- e.g. vision, touch -- with movements, like reaching for a glass or moving through a crowded room. They will then map out how machines, such as robots and computers, learn similar tasks, to model devices that can assist humans.</p><p>The team envisions a multifunctional array of sensors on the body and has already developed prototypes for some of the devices. The full complement of wearable sensors would help a sightless person navigate by conveying information about his or her surroundings.</p><p>The researchers hope their findings on perception, and the prototypes they develop, will spawn a raft of wearable electronic devices to help blind people "see" their environment at a distance through touch, hearing and other senses. The technology would also benefit sighted individuals who must navigate in poor visibility, such as firefighters and pilots.</p><p>Principal investigator Zhigang Zhu, professor of computer science and computer engineering in City College's Grove School of Engineering, will collaborate with City College professor of psychology and director of the Program in Cognitive Neuroscience Tony Ro, City College professor of electrical engineering Ying Li Tian, Georgia Tech Woodruff School of Mechanical Engineering professor Kok-Meng Lee, and Georgia Tech School of Applied Physiology associate professor Boris Prilutsky.</p><p>The third project will address a fundamental question of developmental biology: what controls the spatial and temporal patterns of cell differentiation? 
Answering this question will lead to a better understanding of the basic principles of embryogenesis, explain the origins of developmental disorders, and provide guidelines for tissue engineering and regenerative medicine.</p><p>The research will be conducted by principal investigator and Princeton University Department of Chemical and Biological Engineering associate professor Stanislav Shvartsman, Georgia Tech School of Chemical and Biomolecular Engineering associate professor Hang Lu, New York University Department of Biology professor Christine Rushlow, and University of Illinois at Urbana-Champaign Department of Computer Science associate professor Saurabh Sinha.</p><p>Scientists know that among an embryo's first major developments is the establishment of its dorsoventral axis, which runs from its back to its belly. The researchers plan to study how this axis development unfolds -- specifically the presence and location of proteins during the process, which give rise to muscle, nerve and skin tissues.</p><p>To enable large-scale quantitative analyses of protein positional information along the dorsoventral axis, Lu and Shvartsman will further develop a microfluidic device they previously designed to reliably and robustly orient several hundred embryos in just a few minutes.</p><p>"By understanding this system at a deeper, quantitative level, we will elucidate general principles underlying the operation of genetic and multicellular networks that drive development," said Lu.</p><p><strong>Research News &amp; Publications Office<br /> Georgia Institute of Technology<br /> 75 Fifth Street, N.W., Suite 314<br /> Atlanta, Georgia 30308 USA</strong></p><p><strong>Media Relations Contacts:</strong> Abby Robinson (abby@innovate.gatech.edu; 404-385-3364) or John Toon (jtoon@gatech.edu; 404-894-6986)</p><p><strong>Writers:</strong> Abby Robinson, Holly Korschun and Jessa Forte Netting</p><p>&nbsp;</p>]]></body>  <author>Abby Vogel Robinson</author>  <status>1</status>  <created>1317254400</created>  <gmt_created>2011-09-29 00:00:00</gmt_created>  <changed>1475896214</changed>  <gmt_changed>2016-10-08 03:10:14</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Three $2 million awards from NSF involve Georgia Tech researchers.]]></teaser>  <type>news</type>  <sentence><![CDATA[Three $2 million awards from NSF involve Georgia Tech researchers.]]></sentence>  <summary><![CDATA[<p>The National Science Foundation has awarded $6 million through its Division of Emerging Frontiers in Research and Innovation to fund three projects involving researchers from the Georgia Institute of Technology.</p>]]></summary>  <dateline>2011-09-29T00:00:00-04:00</dateline>  <iso_dateline>2011-09-29T00:00:00-04:00</iso_dateline>  <gmt_dateline>2011-09-29 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[abby@innovate.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Abby Robinson</strong><br />Research News and Publications<br /><a href="http://www.gatech.edu/contact/index.html?id=avogel6">Contact Abby Robinson</a><br /><strong>404-385-3364</strong></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>70404</item>          <item>70405</item>      </media>  <hg_media>          <item>          <nid>70404</nid>          <type>image</type>          <title><![CDATA[Ting-Kemp-Hackney-Liu]]></title>          <body><![CDATA[]]></body>                      
<image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177314</created>          <gmt_created>2015-12-03 21:15:14</gmt_created>          <changed>1475894618</changed>          <gmt_changed>2016-10-08 02:43:38</gmt_changed>      </item>          <item>          <nid>70405</nid>          <type>image</type>          <title><![CDATA[microfluidic device]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177314</created>          <gmt_created>2015-12-03 21:15:14</gmt_created>          <changed>1475894618</changed>          <gmt_changed>2016-10-08 02:43:38</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.bme.gatech.edu/facultystaff/faculty_record.php?id=37]]></url>        <title><![CDATA[Lena Ting]]></title>      </link>          <link>        <url><![CDATA[http://www.chbe.gatech.edu/faculty/lu.php]]></url>        <title><![CDATA[Hang Lu]]></title>      </link>          <link>        <url><![CDATA[http://www.me.gatech.edu/faculty/lee.shtml]]></url>        <title><![CDATA[Kok-Meng Lee]]></title>      </link>          <link>        <url><![CDATA[http://www.ap.gatech.edu/Prilutsky/]]></url>        <title><![CDATA[Boris Prilutsky]]></title>      </link>          <link>        <url><![CDATA[http://www.bme.gatech.edu/facultystaff/faculty_record.php?id=104]]></url>        <title><![CDATA[Charlie Kemp]]></title>      </link>          <link>        <url><![CDATA[http://www.ic.gatech.edu/people/karen-liu]]></url>        <title><![CDATA[Karen Liu]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="141"><![CDATA[Chemistry and Chemical Engineering]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="141"><![CDATA[Chemistry and Chemical Engineering]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="1102"><![CDATA[blind]]></keyword>          <keyword tid="14478"><![CDATA[Boris Prilutsky]]></keyword>          <keyword tid="14480"><![CDATA[cell differentiation]]></keyword>          <keyword tid="2157"><![CDATA[Charlie Kemp]]></keyword>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword 
tid="594"><![CDATA[college of engineering]]></keyword>          <keyword tid="11533"><![CDATA[Department of Biomedical Engineering]]></keyword>          <keyword tid="898"><![CDATA[Hang Lu]]></keyword>          <keyword tid="2296"><![CDATA[Karen Liu]]></keyword>          <keyword tid="14477"><![CDATA[Kok-Meng Lee]]></keyword>          <keyword tid="2266"><![CDATA[Lena Ting]]></keyword>          <keyword tid="7341"><![CDATA[microfluidic]]></keyword>          <keyword tid="1482"><![CDATA[mobility]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="167863"><![CDATA[School of Applied Physiology]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="69751">  <title><![CDATA[Mini Maker Faire Celebrates DIY on Campus]]></title>  <uid>27469</uid>  <body><![CDATA[<p>Spinning off an idea from <a href="http://makezine.com/">MAKE Magazine</a> and <a href="http://oreilly.com/">O’Reilly Media</a>,a mechanical engineering student will bring the first Atlanta Mini Maker Faireto Georgia Tech’s campus.</p><p>The event — whichcalls itself “a celebration of all things DIY” — will feature the skills and creationsof a variety of makers from the region, including blacksmithing, kineticsculptures, robots and 3D printers. About 50 makers will be in attendance withtheir wares, including many from the Tech community. This smaller version oflarger Maker Faires that have been held in Detroit, New York and California givesthe event its “mini” moniker.</p><p>“I thought Atlanta would be a great place for a Mini MakerFaire because there haven’t really been any in the South before, and I know theSouth is filled with just as many makers and crafters as the rest of thecountry,” said Eric Weinhoffer, the ME student organizing the event. “GeorgiaTech is an extremely good location to host an event like this, thanks to thetechnological advancements that come out of the Institute every year. Theschool itself is an inspiration to makers.”</p><p>The event is free to attend and will welcome students,faculty, staff and guests in the Manufacturing Related Disciplines Complex (MRDC)parking lot, <a href="http://gatech.edu/calendar/event.html?nid=69229">Saturday, Sept. 10</a>, from 10 a.m. to 5 p.m. Most makers will beexhibiting their work, but some will have creations for sale as well. 
To learn more about the makers who will be in attendance, visit the <a href="http://www.makerfaireatl.com/Atlanta_Mini_Maker_Faire/Home.html">Atlanta Mini Maker Faire website</a>.</p>]]></body>  <author>Kristen Bailey</author>  <status>1</status>  <created>1314867321</created>  <gmt_created>2011-09-01 08:55:21</gmt_created>  <changed>1475896205</changed>  <gmt_changed>2016-10-08 03:10:05</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The first Atlanta Mini Maker Faire will take place on Georgia Tech’s campus.]]></teaser>  <type>news</type>  <sentence><![CDATA[The first Atlanta Mini Maker Faire will take place on Georgia Tech’s campus.]]></sentence>  <summary><![CDATA[<p>The first Atlanta Mini Maker Faire will take place on Georgia Tech’s campus.</p>]]></summary>  <dateline>2011-09-01T00:00:00-04:00</dateline>  <iso_dateline>2011-09-01T00:00:00-04:00</iso_dateline>  <gmt_dateline>2011-09-01 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:eweinhoffer@gmail.com">Eric Weinhoffer<br /></a>Atlanta Mini Maker Faire</p><p><a href="mailto:kristen.shaw@comm.gatech.edu">Kristen Shaw<br /></a>Communications and Marketing&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>69230</item>      </media>  <hg_media>          <item>          <nid>69230</nid>          <type>image</type>          <title><![CDATA[Atlanta Mini Maker Faire Logo]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[atlanta_minimf.jpeg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/atlanta_minimf_0.jpeg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/atlanta_minimf_0.jpeg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/atlanta_minimf_0.jpeg?itok=335kMeyR]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Atlanta Mini Maker Faire Logo]]></image_alt>                    <created>1449177239</created>          <gmt_created>2015-12-03 21:13:59</gmt_created>          <changed>1475894606</changed>          <gmt_changed>2016-10-08 02:43:26</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.makerfaireatl.com/Atlanta_Mini_Maker_Faire/Home.html]]></url>        <title><![CDATA[Atlanta Mini Maker Faire]]></title>      </link>          <link>        <url><![CDATA[internal:/!/AtlMakerFaire]]></url>        <title><![CDATA[Atlanta Mini Maker Faire on Twitter]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="130"><![CDATA[Alumni]]></category>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>          <category tid="143"><![CDATA[Digital Media and Entertainment]]></category>          <category tid="148"><![CDATA[Music and Music Technology]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="130"><![CDATA[Alumni]]></term>          <term 
tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="134"><![CDATA[Student and Faculty]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>          <term tid="143"><![CDATA[Digital Media and Entertainment]]></term>          <term tid="148"><![CDATA[Music and Music Technology]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="14181"><![CDATA[ammf]]></keyword>          <keyword tid="13945"><![CDATA[atlanta mini maker faire]]></keyword>          <keyword tid="541"><![CDATA[Mechanical Engineering]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="66196">  <title><![CDATA[Team Robot: Autonomous Vehicles Collaborate to Explore, Map Buildings]]></title>  <uid>27303</uid>  <body><![CDATA[<p>There isn't a radio-control handset in sight as several small robots roll briskly up the hallways of an office building.  Working by themselves and communicating only with one another, the vehicles divide up a variety of exploration tasks -- and within minutes have transmitted a detailed floor map to humans nearby. </p><p>This isn't a future-tech scenario.  This advanced autonomous capability has been developed by a team from the Georgia Institute of Technology, the University of Pennsylvania and the California Institute of Technology/Jet Propulsion Laboratory (JPL).  A paper describing this capability and its present level of performance was presented in April at the SPIE Defense, Security and Sensing Conference in Orlando, Fla. </p><p>"When first responders -- whether it's a firefighter in downtown Atlanta or a soldier overseas -- confront an unfamiliar structure, it's very stressful and potentially dangerous because they have limited knowledge of what they're dealing with," said Henrik Christensen, a team member who is a professor in the Georgia Tech College of Computing and director of the Robotics and Intelligent Machines Center there.  "If those first responders could send in robots that would quickly search the structure and send back a map, they'd have a much better sense of what to expect and they'd feel more confident."</p><p>The ability to map and explore simultaneously represents a milestone in the Micro Autonomous Systems and Technology (MAST) Collaborative Technology Alliance Program, a major research initiative sponsored by the U.S. Army Research Laboratory. The five-year program is led by BAE Systems and includes numerous principal and general members comprised largely of universities.</p><p>MAST's ultimate objective is to develop technologies that will enable palm-sized autonomous robots to help humans deal with civilian and military challenges in confined spaces.  The program vision is for collaborative teams of tiny devices that could roll, hop, crawl or fly just about anywhere, carrying sensors that detect and send back information critical to human operators.</p><p>The wheeled platforms used in this experiment measure about one foot square. But MAST researchers are working toward platforms small enough to be held in the palm of one hand. Fully autonomous and collaborative, these tiny robots could swarm by the scores into hazardous situations.</p><p>The MAST program involves four principal research teams: integration, microelectronics, microsystems mechanics, and processing for autonomous operation. 
Georgia Tech researchers are participating in every area except microelectronics. In addition to the College of Computing, researchers from the Georgia Tech Research Institute (GTRI), the School of Aerospace Engineering and the School of Physics are involved in MAST work. </p><p>The experiment -- developed by the Georgia Tech MAST processing team -- combines navigation technology developed by Georgia Tech with vision-based techniques from JPL and network technology from the University of Pennsylvania.  </p><p>In addition to Christensen, members of the Georgia Tech processing team involved in the demonstration include Professor Frank Dellaert of the College of Computing and graduate students Alex Cunningham, Manohar Paluri and John G. Rogers III.   Regents professor Ronald C. Arkin of the College of Computing and Tom Collins of GTRI are also members of the Georgia Tech processing team.</p><p>In the experiment, the robots perform their mapping work using two types of sensors – a video camera and a laser scanner.  Supported by onboard computing capability, the camera locates doorways and windows, while the scanner measures walls.  In addition, an inertial measurement unit helps stabilize the robot and provides information about its movement.</p><p>Data from the sensors are integrated into a local area map that is developed by each robot using a graph-based technique called simultaneous localization and mapping (SLAM). The SLAM approach allows an autonomous vehicle to develop a map of either known or unknown environments, while also monitoring and reporting on its own current location.</p><p>SLAM's flexibility is especially valuable in areas where global positioning system (GPS) service is blocked, such as inside buildings and in some combat zones, Christensen said.  When GPS is active, human handlers can use it to see where their robots are. But in the absence of global location information, SLAM enables the robots to keep track of their own locations as they move.</p><p>"There is no lead robot, yet each unit is capable of recruiting other units to make sure the entire area is explored," Christensen explained. "When the first robot comes to an intersection, it says to a second robot, 'I'm going to go to the left if you go to the right.'" </p><p>Christensen expects the robots' abilities to expand beyond mapping soon. One capability under development by a MAST team involves tiny radar units that could see through walls and detect objects -- or humans -- behind them.  Infrared sensors could also support the search mission by locating anything giving off heat.  In addition, a MAST team is developing a highly flexible "whisker" to sense the proximity of walls, even in the dark. </p><p>The processing team is designing a more complex experiment for the coming year to include small autonomous aerial platforms for locating a particular building, finding likely entry points and then calling in robotic mapping teams. 
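</p><p>A minimal sketch of the graph-based SLAM idea described above, reduced to a toy one-dimensional robot: noisy odometry steps and a single loop-closure constraint are fused by least squares, which spreads the accumulated drift across the path. This illustrates the principle only and is not the MAST team’s implementation.</p><pre><code># Minimal sketch of graph-based SLAM: fuse noisy odometry with a
# loop-closure constraint via least squares. Toy 1-D example only.
import numpy as np

# The robot takes four roughly 1 m steps out and back; a loop closure
# (re-recognizing the start) asserts the net displacement is zero.
odometry = [1.02, 0.97, -1.05, -0.88]   # noisy step measurements (m)

A_rows, b = [], []
# Odometry constraints: x[i+1] - x[i] = z_i, with x[0] fixed at 0
# and x[1..4] as the unknowns of the linear system.
for i, z in enumerate(odometry):
    row = [0.0] * 4
    row[i] = 1.0
    if i > 0:
        row[i - 1] = -1.0
    A_rows.append(row)
    b.append(z)
# Loop-closure constraint: x[4] = 0 (we are back where we started).
A_rows.append([0.0, 0.0, 0.0, 1.0])
b.append(0.0)

x, *_ = np.linalg.lstsq(np.array(A_rows), np.array(b), rcond=None)
print("optimized poses:", np.round(x, 3))  # drift spread across the path
</code></pre><p>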
Demonstrating such a capability next year would mark the culmination of MAST's first five years of progress in small-scale autonomy, Christensen said.</p><p>In addition to the three universities, other MAST team participants are North Carolina A&amp;T State University, the University of California, Berkeley, the University of Maryland, the University of Michigan, the University of New Mexico, Harvard University, the Massachusetts Institute of Technology, and two companies: BAE Systems and Daedalus Flight Systems.</p><p><strong><em>This research was sponsored by the Army Research Laboratory under Cooperative Agreement Number W911NF-08-2-0004. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Army Research Laboratory or the U.S. Government.</em></strong></p><p><strong>Research News &amp; Publications Office<br />Georgia Institute of Technology<br />75 Fifth Street, N.W., Suite 314<br />Atlanta, Georgia  30308  USA</strong></p><p><strong>Media Relations Contacts</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>) or Abby Robinson (404-385-3364)(<a href="mailto:abby@innovate.gatech.edu">abby@innovate.gatech.edu</a>).</p><p><strong>Writer</strong>: Rick Robinson</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1305417600</created>  <gmt_created>2011-05-15 00:00:00</gmt_created>  <changed>1475896125</changed>  <gmt_changed>2016-10-08 03:08:45</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Autonomous robots are collaborating to explore and map buildings.]]></teaser>  <type>news</type>  <sentence><![CDATA[Autonomous robots are collaborating to explore and map buildings.]]></sentence>  <summary><![CDATA[<p>In a project sponsored by the Army Research Laboratory, researchers are giving autonomous robots the ability to work together to explore and map the interiors of buildings. 
Beyond soldiers, the capability could also help civilian first responders.</p>]]></summary>  <dateline>2011-05-15T00:00:00-04:00</dateline>  <iso_dateline>2011-05-15T00:00:00-04:00</iso_dateline>  <gmt_dateline>2011-05-15 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>John Toon</strong><br />Research News &amp; Publications Office<br /><a href="http://www.gatech.edu/contact/index.html?id=jt7">Contact John Toon</a><br /><strong>404-894-6986</strong></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>66198</item>          <item>66199</item>      </media>  <hg_media>          <item>          <nid>66198</nid>          <type>image</type>          <title><![CDATA[Henrik Christensen with robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449176931</created>          <gmt_created>2015-12-03 21:08:51</gmt_created>          <changed>1475894587</changed>          <gmt_changed>2016-10-08 02:43:07</gmt_changed>      </item>          <item>          <nid>66199</nid>          <type>image</type>          <title><![CDATA[Henrik Christensen with robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449176931</created>          <gmt_created>2015-12-03 21:08:51</gmt_created>          <changed>1475894587</changed>          <gmt_changed>2016-10-08 02:43:07</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.ic.gatech.edu/people/henrik-christensen]]></url>        <title><![CDATA[Henrik Christensen]]></title>      </link>          <link>        <url><![CDATA[http://www.rim.gatech.edu/]]></url>        <title><![CDATA[Robotics and Intelligent Machine Center]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/]]></url>        <title><![CDATA[College of Computing]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword 
tid="7264"><![CDATA[autonomous]]></keyword>          <keyword tid="3156"><![CDATA[Buildings]]></keyword>          <keyword tid="10939"><![CDATA[collaborate]]></keyword>          <keyword tid="7059"><![CDATA[explore]]></keyword>          <keyword tid="11890"><![CDATA[henrik christensen]]></keyword>          <keyword tid="7076"><![CDATA[map]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="60881">  <title><![CDATA[Researchers Give Robots the Capability for Deceptive Behavior]]></title>  <uid>27206</uid>  <body><![CDATA[<p>A robot deceives an enemy soldier by creating a false trail and hiding so that it will not be caught. While this sounds like a scene from one of the Terminator movies, it's actually the scenario of an experiment conducted by researchers at the Georgia Institute of Technology as part of what is believed to be the first detailed examination of robot deception.</p><p>"We have developed algorithms that allow a robot to determine whether it should deceive a human or other intelligent machine and we have designed techniques that help the robot select the best deceptive strategy to reduce its chance of being discovered," said Ronald Arkin, a Regents professor in the Georgia Tech School of Interactive Computing. </p><p>The results of robot experiments and theoretical and cognitive deception modeling were published online on Sept. 3 in the <em>International Journal of Social Robotics</em>. Because the researchers explored the phenomena of robot deception from a general perspective, the study's results apply to robot-robot and human-robot interactions. This research was funded by the Office of Naval Research.</p><p>In the future, robots capable of deception may be valuable for several different areas, including military and search and rescue operations. A search and rescue robot may need to deceive in order to calm or receive cooperation from a panicking victim. Robots on the battlefield with the power of deception will be able to successfully hide and mislead the enemy to keep themselves and valuable information safe. </p><p>"Most social robots will probably rarely use deception, but it's still an important tool in the robot's interactive arsenal because robots that recognize the need for deception have advantages in terms of outcome compared to robots that do not recognize the need for deception," said the study's co-author, Alan Wagner, a research engineer at the Georgia Tech Research Institute.</p><p>For this study, the researchers focused on the actions, beliefs and communications of a robot attempting to hide from another robot to develop programs that successfully produced deceptive behavior. Their first step was to teach the deceiving robot how to recognize a situation that warranted the use of deception. Wagner and Arkin used interdependence theory and game theory to develop algorithms that tested the value of deception in a specific situation. A situation had to satisfy two key conditions to warrant deception -- there must be conflict between the deceiving robot and the seeker, and the deceiver must benefit from the deception. </p><p>Once a situation was deemed to warrant deception, the robot carried out a deceptive act by providing a false communication to benefit itself. 
The technique developed by the Georgia Tech researchers based a robot's deceptive action selection on its understanding of the individual robot it was attempting to deceive.</p><p>To test their algorithms, the researchers ran 20 hide-and-seek experiments with two autonomous robots. Colored markers were lined up along three potential pathways to locations where the robot could hide. The hider robot randomly selected a hiding location from the three location choices and moved toward that location, knocking down colored markers along the way. Once it reached a point past the markers, the robot changed course and hid in one of the other two locations. The presence or absence of standing markers indicated the hider's location to the seeker robot.</p><p>"The hider's set of false communications was defined by selecting a pattern of knocked over markers that indicated a false hiding position in an attempt to say, for example, that it was going to the right and then actually go to the left," explained Wagner.</p><p>The hider robots were able to deceive the seeker robots in 75 percent of the trials, with the failed experiments resulting from the hiding robot’s inability to knock over the correct markers to produce the desired deceptive communication.</p><p>"The experimental results weren't perfect, but they demonstrated the learning and use of deception signals by real robots in a noisy environment," said Wagner. "The results were also a preliminary indication that the techniques and algorithms described in the paper could be used to successfully produce deceptive behavior in a robot."</p><p>While there may be advantages to creating robots with the capacity for deception, there are also ethical implications that need to be considered to ensure that these creations are consistent with the overall expectations and well-being of society, according to the researchers.</p><p>"We have been concerned from the very beginning with the ethical implications related to the creation of robots capable of deception and we understand that there are beneficial and deleterious aspects," explained Arkin. "We strongly encourage discussion about the appropriateness of deceptive robots to determine what, if any, regulations or guidelines should constrain the development of these systems."</p><p><em>This work was funded by Grant No. N00014-08-1-0696 from the Office of Naval Research (ONR). The content is solely the responsibility of the principal investigator and does not necessarily represent the official view of ONR.</em></p><p><strong>Research News &amp; Publications Office<br />Georgia Institute of Technology<br />75 Fifth Street, N.W., Suite 314<br />Atlanta, Georgia  30308  USA</strong></p><p><strong>Media Relations Contacts:</strong> Abby Vogel Robinson (abby@innovate.gatech.edu; 404-385-3364) or John Toon (jtoon@gatech.edu; 404-894-6986)</p><p><strong>Writer:</strong> Abby Vogel Robinson</p>]]></body>  <author>Abby Vogel Robinson</author>  <status>1</status>  <created>1283990400</created>  <gmt_created>2010-09-09 00:00:00</gmt_created>  <changed>1475896039</changed>  <gmt_changed>2016-10-08 03:07:19</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers publish first detailed examination of robot deception]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers publish first detailed examination of robot deception]]></sentence>  <summary><![CDATA[Georgia Tech researchers have published the first detailed examination of robot deception. 
They developed algorithms that allow a robot to determine whether it should deceive, and help the robot select the best deceptive strategy to avoid getting caught.]]></summary>  <dateline>2010-09-09T00:00:00-04:00</dateline>  <iso_dateline>2010-09-09T00:00:00-04:00</iso_dateline>  <gmt_dateline>2010-09-09 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[abby@innovate.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<strong>Abby Vogel Robinson</strong><br />Research News and Publications<br /><a href="http://www.gatech.edu/contact/index.html?id=avogel6">Contact Abby Vogel Robinson</a><br /><strong>404-385-3364</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>60882</item>          <item>60883</item>          <item>60884</item>      </media>  <hg_media>          <item>          <nid>60882</nid>          <type>image</type>          <title><![CDATA[Deceptive robots]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tjs39795.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tjs39795_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tjs39795_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tjs39795_0.jpg?itok=Onz5cxid]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Deceptive robots]]></image_alt>                    <created>1449176296</created>          <gmt_created>2015-12-03 20:58:16</gmt_created>          <changed>1475894528</changed>          <gmt_changed>2016-10-08 02:42:08</gmt_changed>      </item>          <item>          <nid>60883</nid>          <type>image</type>          <title><![CDATA[Ronald Arkin and Alan Wagner]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ttm39795.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ttm39795_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ttm39795_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ttm39795_0.jpg?itok=IIDfwqbg]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ronald Arkin and Alan Wagner]]></image_alt>                    <created>1449176296</created>          <gmt_created>2015-12-03 20:58:16</gmt_created>          <changed>1475894531</changed>          <gmt_changed>2016-10-08 02:42:11</gmt_changed>      </item>          <item>          <nid>60884</nid>          <type>image</type>          <title><![CDATA[Research on deceptive robots]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tqs39795.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tqs39795_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tqs39795_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tqs39795_0.jpg?itok=HfdYjiNQ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Research on deceptive 
robots]]></image_alt>                    <created>1449176296</created>          <gmt_created>2015-12-03 20:58:16</gmt_created>          <changed>1475894531</changed>          <gmt_changed>2016-10-08 02:42:11</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://dx.doi.org/10.1007/s12369-010-0073-8]]></url>        <title><![CDATA[International Journal of Social Robotics paper]]></title>      </link>          <link>        <url><![CDATA[http://www.ic.gatech.edu/people/ronald-arkin]]></url>        <title><![CDATA[Ronald Arkin]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/~alanwags/]]></url>        <title><![CDATA[Alan Wagner]]></title>      </link>          <link>        <url><![CDATA[http://www.gtri.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech Research Institute]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/]]></url>        <title><![CDATA[College of Computing]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="5660"><![CDATA[algorithms]]></keyword>          <keyword tid="10604"><![CDATA[Deception]]></keyword>          <keyword tid="10610"><![CDATA[deceptive communication]]></keyword>          <keyword tid="10609"><![CDATA[false communication]]></keyword>          <keyword tid="10605"><![CDATA[Hiding]]></keyword>          <keyword tid="525"><![CDATA[military]]></keyword>          <keyword tid="10606"><![CDATA[Military Operations]]></keyword>          <keyword tid="10607"><![CDATA[Reconnaissance]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="10608"><![CDATA[robot communication]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="168894"><![CDATA[search and rescue]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="71617">  <title><![CDATA[Urban Challenge Run Ends at Qualifying Event]]></title>  <uid>27303</uid>  <body><![CDATA[<p>The blue Porsche Cayenne pulls up to a four-way intersection and stops. After it continues through the junction, it approaches a vehicle stopped in its lane. The Cayenne checks to make sure there are no cars approaching in the opposing lane, passes the stopped car and returns to its original lane. </p><p>This scene may sound normal, but this is no ordinary Porsche Cayenne -- it thinks for itself and requires no driver. 
This autonomous vehicle was designed by the Georgia Institute of Technology in collaboration with Science Applications International Corporation (SAIC) for the Defense Advanced Research Projects Agency's (DARPA) Urban Challenge.</p><p>Georgia Tech's vehicle, named Sting 1, did not qualify for the final challenge during the National Qualifying Event (NQE) held from October 26-31 at the urban military training facility located on the former George Air Force Base in Victorville, California. Sting 1 finished as one of 35 teams that made it to the NQE.</p><p>"As a first-time entrant, the team has done an outstanding job making it to the semifinal round of the world's most challenging robotics competition," said Tucker Balch, team lead and associate professor in Georgia Tech's School of Interactive Computing in the College of Computing.</p><p>With six cameras, eight computers, Doppler radar and infrared laser radar on board, Sting 1 was designed to operate without any human intervention and obey California traffic laws while performing maneuvers such as merging into moving traffic, navigating traffic circles and avoiding moving obstacles.</p><p>The road to California began in the summer of 2006, when Georgia Tech and 88 other teams signed up to participate in this year's Urban Challenge.</p><p>"Georgia Tech didn't compete in the two previous Grand Challenges, but SAIC did," added Balch. "Their experience helped us develop software that could have enabled a robot to place well in the previous challenges and then we took it further with additional capabilities necessary for the Urban Challenge."</p><p>The Georgia Tech team, consisting of researchers in Georgia Tech's College of Computing and College of Engineering and the Georgia Tech Research Institute (GTRI), chose the Porsche Cayenne as their vehicle and in August 2006 began to install computers that would drive the car automatically. </p><p>Eight computers networked together through two high speed networks were programmed to know the rules of the road. This included knowing how to stay in a lane, how to overtake another car, how to make turns in city traffic, how to maneuver the waiting patterns at an intersection, how to merge into traffic and how to behave in a parking lot. </p><p>According to the racing team, the car really had to think for itself. </p><p>"When moving forward, the car usually ignored obstacles that were in its planned path," said Tom Collins, electronics lead and GTRI principal research engineer. "But when obstacles were detected, the car would plan and execute a different route."</p><p>SAIC engineers developed methods for visual lane detection and tracking. On unpaved dirt roads, the colors of the road and non-road areas were modeled to identify a path, adapting over time as lighting or surface colors changed. On marked paved roads, a camera kept the car in its lane by detecting the typical white and yellow lines that mark a driving lane. If the vision system was unable to find a lane, the car used lasers to follow the curb. Ten laser range finders sent out infrared laser beams that constantly scanned to provide Sting 1 with an accurate measurement of the distance to any objects, such as curbs and other cars.</p><p>At intersections, the team used laser and radar sensors to see other waiting or approaching vehicles. Six off-the-shelf Doppler radar systems used to detect moving objects allowed the car to see as far as two football fields away in all directions. 
Cameras helped guide the car through the intersections and onto new roadways.</p><p>"We had to guarantee that there was at least a 10 second window that would allow us to pull out onto a road, accelerate and get up to a reasonable speed without cutting someone off," noted Henrik Christensen, principal investigator for the team and director of Georgia Tech's Robotics and Intelligent Machines Center.</p><p>The researchers tested their car for months in the parking lot behind the Centergy One building in Technology Square on the Georgia Tech campus. They also utilized the Georgia Public Safety Training Center in Forsyth, Ga. on weekends to test the ability of the car to maneuver in an urban environment. </p><p>The Urban Challenge is the third in a series of DARPA-sponsored competitions to foster the development of robotic ground vehicle technology without a human operator, designed for use on the battlefield. Safe operation in traffic is essential to U.S. military plans to use autonomous ground vehicles to conduct important missions and keep American personnel out of harm's way.</p><p>Georgia Tech researchers are already thinking about life after the Urban Challenge.</p><p>"We've already talked about expanding this work to other areas," said Vince Camp, hardware lead and GTRI senior research engineer. "We're looking forward to using the technologies in applications such as autonomous lane striping for the Department of Transportation."</p><p>Challenges like this also aim to improve safety in vehicles consumers purchase. Some high-end vehicles sold today have backup sensors that alert the driver to obstacles and can parallel park without driver assistance. There are also systems that will alert a driver that is approaching a car in the same lane too quickly or if a driver is leaving the appropriate lane.</p><p>"These types of systems will help us become better drivers, but it's probably going to be a decade or so before we see fully autonomous vehicles," said Christensen. "At some point, though, drivers will realize that their cars are probably much more aware of what's going on around the car and are better equipped to deal with a situation than human drivers."</p><p>DARPA awarded a first-place prize of $2 million to Carnegie Mellon's Tartan Racing Team.  
Second and third places went to teams from Stanford Univesity and Virginia Tech.</p><p><strong>Research News &amp; Publications Office<br />Georgia Institute of Technology<br />75 Fifth Street, N.W., Suite 100<br />Atlanta, Georgia  30308  USA</strong></p><p><strong>Media Relations Contacts</strong>: Stefany Wilson, College of Computing (404-894-7253); E-mail: (<a href="mailto:stefany@cc.gatech.edu">stefany@cc.gatech.edu</a>) or Abby Vogel, Research News &amp; Publications Office (404-385-3364); E-mail: (<a href="mailto:avogel@gatech.edu">avogel@gatech.edu</a>) or Kirk Englehardt, Georgia Tech Research Institute (404-407-7280); E-mail: (<a href="mailto:kirk.englehardt@gtri.gatech.edu">kirk.englehardt@gtri.gatech.edu</a>).</p><p><strong>Writer</strong>: Abby Vogel</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1194310800</created>  <gmt_created>2007-11-06 01:00:00</gmt_created>  <changed>1475895804</changed>  <gmt_changed>2016-10-08 03:03:24</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Sting Racing Team reaches competition semifinals]]></teaser>  <type>news</type>  <sentence><![CDATA[Sting Racing Team reaches competition semifinals]]></sentence>  <summary><![CDATA[The Sting Racing Team sponsored by Georgia Tech and SAIC reached the semifinals of the Defense Advanced Research Projects Agency's Urban Challenge, but did not quality for the final challenge.]]></summary>  <dateline>2007-11-06T00:00:00-05:00</dateline>  <iso_dateline>2007-11-06T00:00:00-05:00</iso_dateline>  <gmt_dateline>2007-11-06 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Georgia Tech/SAIC Sting 1 vehicle reaches semifinals]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[stefany@cc.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<strong>Stefany Wilson</strong><br />College of Computing<br /><a href="http://www.gatech.edu/contact/index.html?id=sw187">Contact Stefany Wilson</a><br /><strong>404-894-7253</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>71618</item>          <item>71619</item>          <item>71620</item>      </media>  <hg_media>          <item>          <nid>71618</nid>          <type>image</type>          <title><![CDATA[Sting1 vehicle]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177396</created>          <gmt_created>2015-12-03 21:16:36</gmt_created>          <changed>1475894639</changed>          <gmt_changed>2016-10-08 02:43:59</gmt_changed>      </item>          <item>          <nid>71619</nid>          <type>image</type>          <title><![CDATA[Sting Racing Team]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177396</created>          <gmt_created>2015-12-03 21:16:36</gmt_created>          <changed>1475894639</changed>          <gmt_changed>2016-10-08 02:43:59</gmt_changed>      </item>        
  <item>          <nid>71620</nid>          <type>image</type>          <title><![CDATA[Sting 1 Vehicle]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177396</created>          <gmt_created>2015-12-03 21:16:36</gmt_created>          <changed>1475894639</changed>          <gmt_changed>2016-10-08 02:43:59</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.gtri.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech Research Institute]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/]]></url>        <title><![CDATA[College of Computing]]></title>      </link>          <link>        <url><![CDATA[http://www.sting-racing.org/]]></url>        <title><![CDATA[Sting Racing Web site]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="690"><![CDATA[darpa]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="170760"><![CDATA[Sting]]></keyword>          <keyword tid="1249"><![CDATA[vehicle]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="46384">  <title><![CDATA[Researchers Learn Why Robots Get Stuck in the Sand]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Today</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1234141200</created>  <gmt_created>2009-02-09 01:00:00</gmt_created>  <changed>1475895799</changed>  <gmt_changed>2016-10-08 03:03:19</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A new study provides details of robot travel on granular surface]]></teaser>  <type>news</type>  <sentence><![CDATA[A new study provides details of robot travel on granular surface]]></sentence>  <summary><![CDATA[A new study takes what may be the first detailed look at the problem of robot locomotion on granular surfaces. 
Among the study]]></summary>  <dateline>2009-02-09T00:00:00-05:00</dateline>  <iso_dateline>2009-02-09T00:00:00-05:00</iso_dateline>  <gmt_dateline>2009-02-09 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[New Study Could Help Future Space Robots]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[avogel@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<strong>Abby Vogel</strong><br />Research News and Publications<br /><a href="http://www.gatech.edu/contact/index.html?id=avogel6">Contact Abby Vogel</a><br /><strong>404-385-3364</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>46385</item>          <item>46386</item>      </media>  <hg_media>          <item>          <nid>46385</nid>          <type>image</type>          <title><![CDATA[SandBot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[txc17406.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/txc17406_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/txc17406_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/txc17406_0.jpg?itok=kYVX-noL]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[SandBot]]></image_alt>                    <created>1449174428</created>          <gmt_created>2015-12-03 20:27:08</gmt_created>          <changed>1475894419</changed>          <gmt_changed>2016-10-08 02:40:19</gmt_changed>      </item>          <item>          <nid>46386</nid>          <type>image</type>          <title><![CDATA[SandBot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tih17406.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tih17406_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tih17406_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tih17406_0.jpg?itok=oCAF_Gdq]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[SandBot]]></image_alt>                    <created>1449174428</created>          <gmt_created>2015-12-03 20:27:08</gmt_created>          <changed>1475894419</changed>          <gmt_changed>2016-10-08 02:40:19</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.gtresearchnews.gatech.edu/movies/SandBot.wmv]]></url>        <title><![CDATA[Video of SandBot (wmv format)]]></title>      </link>          <link>        <url><![CDATA[http://www.physics.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech School of Physics]]></title>      </link>          <link>        <url><![CDATA[http://www.physics.gatech.edu/people/faculty/dgoldman.html]]></url>        <title><![CDATA[Daniel Goldman]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  
<news_terms>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="1357"><![CDATA[granular]]></keyword>          <keyword tid="377"><![CDATA[locomotion]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="169242"><![CDATA[sand]]></keyword>          <keyword tid="1359"><![CDATA[terrain]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="71098">  <title><![CDATA[GTRI Wins Contract to Support Test & Evaluation of Unmanned Systems]]></title>  <uid>27303</uid>  <body><![CDATA[<p>The Georgia Tech Research Institute (GTRI) has won a contract to support development of a roadmap designed to improve the testing and evaluation of unmanned and autonomous systems for the U.S. Office of the Secretary of Defense (OSD).</p><p>"The field of unmanned and autonomous systems is evolving rapidly, and new techniques are needed to effectively test and evaluate the capabilities that are being inserted into these systems. This is especially challenging for systems that are increasing in levels of autonomy," said Lora Weiss, a GTRI principal research engineer.  "Our task is to develop a roadmap that identifies new approaches to testing autonomous systems and details what needs to be tested, how the autonomous technologies can be tested, and when the testing needs to occur."</p><p>Known as the Roadmap Development and Technology Insertion Plan (RD-TIP), the one-year $430,000 award is funded through the U.S. Army at White Sands Missile Range.  The initiative is headed by Derrick Hinton, T&amp;E/S&amp;T program manager with the Test Resources Management Center in the U.S. Department of Defense.  </p><p>"Many new technologies are being developed for unmanned and autonomous systems that must be tested and evaluated before they can be deployed.  New approaches are needed for testing and measuring the robustness of these systems, especially in non-deterministic and evolving environments," Weiss noted.  "The only way to know how to test them is to understand both the details of the technology and the system that it is going into. GTRI has extensive experience in both areas and can uniquely couple fundamental research with warfighter systems."</p><p>The effort will address all five major unmanned and autonomous systems domains, including systems that operate in the air, on the ground, underwater, on the sea surface and in space.  The roadmap will address both vehicles and the socio-technical environments in which they operate. </p><p>"There is a strong desire from the warfighter to get these systems into the field," Weiss added.  "This, coupled with the rapid pace at which unmanned and autonomous systems are developing, creates a need to consider new options for more flexible testing of unmanned systems.  Through this roadmap, the government has asked us to help define these options."</p><p>Test and evaluation has traditionally been a focus area for GTRI, noted Rusty Roberts, a principal research engineer who oversees all of GTRI's test and evaluation programs. 
"The current roadmap award builds on GTRI's long-term experience with test and evaluation for government customers and couples it with GTRI's strong knowledge of unmanned systems," he said.</p><p>The unmanned systems test and evaluation project is a new area within the Test and Evaluation Science and Technology Program, which is sponsored by the Test Resource Management Center (TRMC) within the Office of the Secretary of Defense. </p><p>GTRI has ongoing projects in four areas of the T&amp;E Science and Technology Program: unmanned and autonomous systems, directed energy, net-centric systems and non-intrusive instrumentation.</p><p>The applied research arm of the Georgia Institute of Technology, GTRI is also involved in other test and evaluation projects for the government, Roberts said.  Its test and evaluation capabilities cover a broad range of engineering and scientific disciplines, including tracking new technologies and their effect on test and evaluation, planning and executing programs for the government's operational test agencies and providing and/or sponsoring test and evaluation professional education courses and workshops, as well as meetings such the annual ITEA Technology Conference.  </p><p>Unmanned and autonomous systems are recognized as critical components to all aspects of modern warfare across the joint forces, and they are growing in mission effectiveness. They have proved effective in Afghanistan and Iraq by providing commanders at both the operational and tactical levels with improved intelligence, surveillance, reconnaissance, and precision strike capabilities. </p><p>"They are being chosen over manned systems when the situation involves the dull (long mission times), the dirty (sampling for hazardous materials) and the dangerous (lethal exposure to hostile action) -- and when the unmanned systems can provide capabilities that are not achievable by manned systems," Weiss noted. </p><p><strong>Research News &amp; Publications Office<br />Georgia Institute of Technology<br />75 Fifth Street, N.W., Suite 100<br />Atlanta, Georgia  30308  USA</strong></p><p><strong>Media Relations Contacts</strong>: John Toon (404-894-6986); E-mail: (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>) or Kirk Englehardt (404-407-7280); E-mail: (<a href="mailto:kirk.englehardt@gtri.gatech.edu">kirk.englehardt@gtri.gatech.edu</a>).</p><p><strong>Writer</strong>: Rick Robinson</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1217462400</created>  <gmt_created>2008-07-31 00:00:00</gmt_created>  <changed>1475895799</changed>  <gmt_changed>2016-10-08 03:03:19</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Research will provide a technology 'roadmap' for testing]]></teaser>  <type>news</type>  <sentence><![CDATA[Research will provide a technology 'roadmap' for testing]]></sentence>  <summary><![CDATA[The Georgia Tech Research Institute (GTRI) has won a contract to support development of a roadmap designed to improve the testing and evaluation of unmanned and autonomous systems for the U.S. 
Office of the Secretary of Defense (OSD).]]></summary>  <dateline>2008-07-31T00:00:00-04:00</dateline>  <iso_dateline>2008-07-31T00:00:00-04:00</iso_dateline>  <gmt_dateline>2008-07-31 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<strong>John Toon</strong><br />Research News &amp; Publications Office<br /><a href="http://www.gatech.edu/contact/index.html?id=jt7">Contact John Toon</a><br /><strong>404-894-6986</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>71099</item>          <item>71100</item>      </media>  <hg_media>          <item>          <nid>71099</nid>          <type>image</type>          <title><![CDATA[UAV testing]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177348</created>          <gmt_created>2015-12-03 21:15:48</gmt_created>          <changed>1475894628</changed>          <gmt_changed>2016-10-08 02:43:48</gmt_changed>      </item>          <item>          <nid>71100</nid>          <type>image</type>          <title><![CDATA[UAV testing]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177348</created>          <gmt_created>2015-12-03 21:15:48</gmt_created>          <changed>1475894628</changed>          <gmt_changed>2016-10-08 02:43:48</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.gtri.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech Research Institute]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="7264"><![CDATA[autonomous]]></keyword>          <keyword tid="1331"><![CDATA[evaluation]]></keyword>          <keyword tid="383"><![CDATA[test]]></keyword>          <keyword tid="1500"><![CDATA[UAV]]></keyword>          <keyword 
tid="7263"><![CDATA[unmanned]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="71164">  <title><![CDATA[Tongue-controlled System Assists Individuals with Disabilities]]></title>  <uid>27206</uid>  <body><![CDATA[<p>A new assistive technology developed by engineers at the Georgia Institute of Technology could help individuals with severe disabilities lead more independent lives.</p><p>The novel system allows individuals with disabilities to operate a computer, control a powered wheelchair and interact with their environments simply by moving their tongues.</p><p>"This device could revolutionize the field of assistive technologies by helping individuals with severe disabilities, such as those with high-level spinal cord injuries, return to rich, active, independent and productive lives," said Maysam Ghovanloo, an assistant professor in the Georgia Tech School of Electrical and Computer Engineering. Ghovanloo developed the system with graduate student Xueliang Huo.</p><p>The tongue-operated assistive technology, called the Tongue Drive system, was described on June 29 at the 2008 Rehabilitation Engineering and Assistive Technology Society of North America (RESNA) Annual Conference in Washington, D.C. An article about this system is also scheduled to appear in an upcoming issue of the <em>Journal of Rehabilitation Research and Development</em>. This research was funded by the National Science Foundation and the Christopher and Dana Reeve Foundation.</p><p><em><strong>- Watch a video of Ghovanloo describing the Tongue Drive system and its applications <a href='http://gtresearchnews.gatech.edu/movies/tongue-drive.mov'>here.</a><br />- Watch a video of Huo operating a powered wheelchair with the Tongue Drive system <a href='http://www.gtresearchnews.gatech.edu/movies/wheelchair.mov'>here.</a> </strong></em></p><p>To operate the Tongue Drive system, potential users only need to be able to move their tongues. Attaching a small magnet, the size of a grain of rice, to an individual's tongue by implantation, piercing or tissue adhesive allows tongue motion to direct the movement of a cursor across a computer screen or a powered wheelchair around a room.</p><p>"We chose the tongue to operate the system because unlike hands and feet, which are controlled by the brain through the spinal cord, the tongue is directly connected to the brain by a cranial nerve that generally escapes damage in severe spinal cord injuries or neuromuscular diseases," said Ghovanloo, who started working on this project about three years ago at North Carolina State University. "Tongue movements are also fast, accurate and do not require much thinking, concentration or effort."</p><p>Movement of the magnetic tracer attached to the tongue is detected by an array of magnetic field sensors mounted on a headset outside the mouth or on an orthodontic brace inside the mouth. The sensor output signals are wirelessly transmitted to a portable computer, which can be carried on the user's clothing or wheelchair.</p><p>The sensor output signals are processed to determine the relative motion of the magnet with respect to the array of sensors in real-time. 
This information is then used to control the movements of a cursor on the computer screen or to substitute for the joystick function in a powered wheelchair.</p><p>The system can potentially capture a large number of tongue movements, each of which can represent a different user command. A unique set of specific tongue movements can be tailored for each individual based on the user's abilities, oral anatomy, personal preferences and lifestyle.</p><p>"An individual could potentially train our system to recognize touching each tooth as a different command," explained Ghovanloo. "The ability to train our system with as many commands as an individual can comfortably remember is a significant advantage over the common sip-n-puff device that acts as a simple switch controlled by sucking or blowing through a straw."</p><p>The Tongue Drive system is also non-invasive and does not require brain surgery like some of the brain-computer interface technologies.</p><p>Ghovanloo's group recently completed trials in which six able-bodied individuals tested the Tongue Drive system. Each participant defined six tongue commands that would substitute for computer mouse tasks - left, right, up and down pointer movements and single- and double-click. For each trial, the individual began by training the system. During the five-minute training session, the individual repeated each of the six designated tongue movements 10 times.</p><p>During the testing session, the user moved his or her tongue to one of the predefined command positions and the mouse pointer started moving in the selected direction. To move the cursor faster, users could hold their tongue in the position of the issued command to gradually accelerate the pointer until it reached a maximum velocity.</p><p>Results of the computer access test by novice users with the current Tongue Drive prototype showed a response time of less than one second with almost 100 percent accuracy for the six individual commands. This is equivalent to an information transfer rate of approximately 150 bits per minute, which is much faster than the bandwidth of most brain-computer interfaces, according to Ghovanloo.</p><p>The researchers have also tested the ability of twelve able-bodied individuals to operate an electric-powered wheelchair with the Tongue Drive system. The next step is to test and assess the usability and acceptability of the system by people with severe disabilities, said Ghovanloo. He is teaming with the Shepherd Center, an Atlanta-based catastrophic care hospital, and the Georgia Tech Center for Assistive Technology and Environmental Access, to conduct those trials.</p><p>The research team has also begun to develop software to connect the Tongue Drive system to a wide variety of readily available communication tools such as text generators, speech synthesizers and readers. In addition, the researchers plan to add control commands, such as switching the system into standby mode to permit the user to eat, sleep or engage in a conversation while extending battery life.</p><p>"We hope this technology will reduce the need of individuals with severe disabilities to receive continuous assistance from family members or caregivers, thus significantly reducing healthcare and assistance costs," noted Ghovanloo. 
"This system may also make it easier for them to work and communicate with others, such as friends and family."</p><p><strong>Research News &amp; Publications Office<br />Georgia Institute of Technology<br />75 Fifth Street, N.W., Suite 100<br />Atlanta, Georgia  30308  USA</strong></p><p>Media Relations Contacts: Abby Vogel (404-385-3364); E-mail: (<a href="mailto:avogel@gatech.edu">avogel@gatech.edu</a>) or John Toon (404-894-6986); E-mail: (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).</p><p><strong>Technical Contact:</strong> Maysam Ghovanloo (404-385-7048); E-mail: (<a href="mailto:mgh@gatech.edu">mgh@gatech.edu</a>)</p><p><strong>Writer:</strong> Abby Vogel</p>]]></body>  <author>Abby Vogel Robinson</author>  <status>1</status>  <created>1214697600</created>  <gmt_created>2008-06-29 00:00:00</gmt_created>  <changed>1475895799</changed>  <gmt_changed>2016-10-08 03:03:19</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Tongue drive system assists persons with disabilities.]]></teaser>  <type>news</type>  <sentence><![CDATA[Tongue drive system assists persons with disabilities.]]></sentence>  <summary><![CDATA[A new assistive technology allows individuals with disabilities to operate a computer, control a powered wheelchair and interact with their environments simply by moving their tongues. The Tongue Drive system, developed by engineers at the Georgia Institute of Technology, could help individuals with severe disabilities lead more independent lives.]]></summary>  <dateline>2008-06-30T00:00:00-04:00</dateline>  <iso_dateline>2008-06-30T00:00:00-04:00</iso_dateline>  <gmt_dateline>2008-06-30 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[System allows them to operate powered wheelchairs and computers]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[abby@innovate.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<strong>Abby Robinson</strong><br />Research News and Publications<br /><a href="http://www.gatech.edu/contact/index.html?id=avogel6">Contact Abby Robinson</a><br /><strong>404-385-3364</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>71165</item>          <item>71166</item>          <item>71167</item>      </media>  <hg_media>          <item>          <nid>71165</nid>          <type>image</type>          <title><![CDATA[Tongue Drive computer monitor]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177348</created>          <gmt_created>2015-12-03 21:15:48</gmt_created>          <changed>1475894630</changed>          <gmt_changed>2016-10-08 02:43:50</gmt_changed>      </item>          <item>          <nid>71166</nid>          <type>image</type>          <title><![CDATA[Tongue Drive wheelchair]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177348</created>          <gmt_created>2015-12-03 
21:15:48</gmt_created>          <changed>1475894630</changed>          <gmt_changed>2016-10-08 02:43:50</gmt_changed>      </item>          <item>          <nid>71167</nid>          <type>image</type>          <title><![CDATA[Tongue Drive]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177348</created>          <gmt_created>2015-12-03 21:15:48</gmt_created>          <changed>1475894630</changed>          <gmt_changed>2016-10-08 02:43:50</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.ece.gatech.edu/index.html]]></url>        <title><![CDATA[School of Electrical and Computer Engineering]]></title>      </link>          <link>        <url><![CDATA[http://www.ece.gatech.edu/faculty-staff/fac_profiles/bio.php?id=147]]></url>        <title><![CDATA[Maysam Ghovanloo]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="2652"><![CDATA[assistive]]></keyword>          <keyword tid="1912"><![CDATA[brain]]></keyword>          <keyword tid="3748"><![CDATA[communication]]></keyword>          <keyword tid="439"><![CDATA[computer]]></keyword>          <keyword tid="7134"><![CDATA[cord]]></keyword>          <keyword tid="242"><![CDATA[disabilities]]></keyword>          <keyword tid="359"><![CDATA[disability]]></keyword>          <keyword tid="5378"><![CDATA[Electric]]></keyword>          <keyword tid="521"><![CDATA[injury]]></keyword>          <keyword tid="2815"><![CDATA[interface]]></keyword>          <keyword tid="7132"><![CDATA[magnet]]></keyword>          <keyword tid="7324"><![CDATA[mouse]]></keyword>          <keyword tid="7325"><![CDATA[neuromuscular]]></keyword>          <keyword tid="3517"><![CDATA[power]]></keyword>          <keyword tid="554"><![CDATA[rehabilitation]]></keyword>          <keyword tid="167318"><![CDATA[sensor]]></keyword>          <keyword tid="170869"><![CDATA[sip-n-puff]]></keyword>          <keyword tid="170848"><![CDATA[spinal]]></keyword>          <keyword tid="623"><![CDATA[Technology]]></keyword>          <keyword tid="7130"><![CDATA[tongue]]></keyword>          <keyword tid="1652"><![CDATA[wheelchair]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="46266">  <title><![CDATA[Study Reveals Sandfish Tucks Legs to Slither Like Snake Through Sand]]></title>  <uid>27206</uid>  <body><![CDATA[<p>A study published in the July 17 issue of the journal <em>Science</em> details how sandfish -- small lizards with smooth scales -- move rapidly underground through desert sand. 
In this first thorough examination of subsurface sandfish locomotion, researchers from the Georgia Institute of Technology found that the animals place their limbs against their sides and create a wave motion with their bodies to propel themselves through granular media.</p><p>"When started above the surface, the animals dive into the sand within a half second. Once below the surface, they no longer use their limbs for propulsion -- instead, they move forward by propagating a traveling wave down their bodies like a snake," said study leader Daniel Goldman, an assistant professor in Georgia Tech's School of Physics.</p><p>With funding from the National Science Foundation and the Burroughs Wellcome Fund, the research team used high-speed X-ray imaging to visualize sandfish -- formally called <em>Scincus scincus </em>-- burrowing into and through sand. The team used that information to develop a physics model of the lizard's locomotion.</p><p>The sandfish used in this study inhabits the Sahara desert in Africa and is approximately four inches long. It uses its long, wedge-shaped snout and countersunk lower jaw to rapidly bury into and swim within sand. The sandfish's body has flattened sides and is covered with smooth shiny scales, its legs are short and sturdy with long and flattened fringed toes and its tail tapers to a fine point.</p><ul><strong><em><li>Watch a video of a sandfish using its limbs to run on the surface and rapidly bury into the interior of granular media <a href="http://www.gtresearchnews.gatech.edu/movies/1172490s1.mov"> here</a>. </li><li>Watch a video of a sandfish slither like a snake through granular media <a href="http://www.gtresearchnews.gatech.edu/movies/1172490s2.mov"> here</a>.</li><li>Watch a video of a sandfish swim through granular media with opaque markers on its body that clearly show that its limbs are held close to its body during swimming <a href="http://www.gtresearchnews.gatech.edu/movies/1172490s3.mov"> here</a>.</li><p> </p></em></strong></ul><p>To conduct controlled experiments with the sandfish, Goldman and graduate students Ryan Maladen, Yang Ding and Chen Li built a seven-inch by eight-inch by four-inch-deep glass bead-filled container with tiny holes in the bottom through which air could be blown. The air pulses elevated the beads and caused them to settle into a loosely packed solid state. Repeated pulses of air compacted the material, allowing the researchers to closely control the density of the material. </p><p>Since a sandfish might encounter and need to move through different densities of sand in the desert, the researchers tested whether sandfish locomotion changed when burrowing through media with volume fractions of 58 and 62 percent -- typical values for desert sand. </p><p>"Since loosely packed media is easier to push through and closely packed is harder to push through, we thought there should be some difference in the sandfish's locomotion," said Goldman. "But the results surprised us because the density of the granular media did not affect how the sandfish traveled through the sand; it was always the same undulatory wavelike pattern." </p><p>For a given wave frequency, the swimming speed depended only on the frequency of the wave and not on the density. Unexpectedly though, the animals could swim a bit faster in closely packed material by using a higher frequency range. The team also varied the diameter of the glass beads, but still observed similar wavelike motion. 
</p><p>By tracking the sandfish in the X-ray images as it swam through the glass beads, Goldman was able to characterize the sandfish's motion -- called its kinematics -- as the form of a single-period sinusoidal wave that traveled from the head to the tail. </p><p>"The large amplitude waves over the entire body are unlike the kinematics of other undulatory swimming organisms that are the same size as the sandfish, like eels, which propagate waves that start with a small amplitude that gets larger toward the tail," explained Goldman. </p><p>After collecting the experimental data, Goldman's team developed a physics model to predict the speed at which sandfish swim through sand. The model was inspired by the resistive force theory, which allowed the researchers to partition the body of the sandfish into segments, each of which generated thrust and experienced drag when moving through the granular environment. </p><p>"When you balance the thrust and drag, you get motion at some velocity, but we needed to determine the forces on the animal segments because we don't have the appropriate equations for drag force during movement through granular media," explained Goldman.</p><p>To establish these equations, the researchers measured the granular thrust and drag forces on a small stainless steel cylindrical rod, thus allowing them to predict the wave efficiency and optimal kinematics. They found that the faster the sandfish propagate the wave, the faster they move forward through granular media -- up to speeds of six inches per second. This speed allows the animal to escape predators, the heat of the desert surface and quickly swim to ambush surface prey they detect from vibrations. </p><p>"The results demonstrate that burrowing and swimming in complex media like sand can have intricacy similar to that of movement in air or water, and that organisms can exploit the solid and fluid-like properties of these media to move effectively within them," noted Goldman.</p><p>In addition to having a biological impact, this study's results also have ecological significance, according to Goldman. Understanding the mechanics of subsurface movement could reveal how the actions of small burrowing organisms like worms, scorpions, snakes and lizards can transform landscapes by their burrowing actions. This research may also help engineers build sandfish-like robots that can travel through complex environments.</p><p>"If something nasty was buried in unconsolidated material, such as rubble, debris or sand, and you wanted to find it, you would need a device that could scamper on the surface, but also swim underneath the surface," Goldman said. "Since our work aims to fundamentally understand how the best animals in nature move in these complex unstructured environments, it could be very valuable information for this type of research."</p><p><em>This material is based upon work supported by the National Science Foundation (NSF) under Award No. PHY-0749991 and the Burroughs Wellcome Fund. 
Any opinions, findings, conclusions or recommendations expressed in this publication are those of the researcher and do not necessarily reflect the views of the NSF.</em></p><p><strong>Research News &amp; Publications Office<br />Georgia Institute of Technology<br />75 Fifth Street, N.W., Suite 100<br />Atlanta, Georgia  30308  USA</strong></p><p>Media Relations Contacts: Abby Vogel (404-385-3364); E-mail: (<a href="mailto:avogel@gatech.edu">avogel@gatech.edu</a>) or John Toon (404-894-6986); E-mail: (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>)</p><p><strong>Technical Contact:</strong> Daniel Goldman (404-894-0993); E-mail: (<a href="mailto:daniel.goldman@physics.gatech.edu">daniel.goldman@physics.gatech.edu</a>) </p><p><strong>Writer:</strong> Abby Vogel</p>]]></body>  <author>Abby Vogel Robinson</author>  <status>1</status>  <created>1247702400</created>  <gmt_created>2009-07-16 00:00:00</gmt_created>  <changed>1475895794</changed>  <gmt_changed>2016-10-08 03:03:14</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Study shows how small lizards move rapidly underground through s]]></teaser>  <type>news</type>  <sentence><![CDATA[Study shows how small lizards move rapidly underground through s]]></sentence>  <summary><![CDATA[In the first thorough examination of subsurface sandfish locomotion, researchers found that the small lizards place their limbs against their sides and create a wave motion like snakes to propel themselves through granular media.]]></summary>  <dateline>2009-07-16T00:00:00-04:00</dateline>  <iso_dateline>2009-07-16T00:00:00-04:00</iso_dateline>  <gmt_dateline>2009-07-16 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[avogel@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<strong>Abby Vogel</strong><br />Research News and Publications<br /><a href="http://www.gatech.edu/contact/index.html?id=avogel6">Contact Abby Vogel</a><br /><strong>404-385-3364</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>46267</item>          <item>46268</item>          <item>46269</item>      </media>  <hg_media>          <item>          <nid>46267</nid>          <type>image</type>          <title><![CDATA[Sandfish lizard]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tjw66159.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tjw66159_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tjw66159_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tjw66159_0.jpg?itok=hd-AyIMq]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sandfish lizard]]></image_alt>                    <created>1449174375</created>          <gmt_created>2015-12-03 20:26:15</gmt_created>          <changed>1475894414</changed>          <gmt_changed>2016-10-08 02:40:14</gmt_changed>      </item>          <item>          <nid>46268</nid>          <type>image</type>          <title><![CDATA[Dan Goldman scincus scincus]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tpd66160.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tpd66160_0.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tpd66160_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tpd66160_0.jpg?itok=bZ9Gi4je]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Dan Goldman scincus scincus]]></image_alt>                    <created>1449174375</created>          <gmt_created>2015-12-03 20:26:15</gmt_created>          <changed>1475894414</changed>          <gmt_changed>2016-10-08 02:40:14</gmt_changed>      </item>          <item>          <nid>46269</nid>          <type>image</type>          <title><![CDATA[Dan Goldman sandfish]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tbc66160.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tbc66160_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tbc66160_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tbc66160_0.jpg?itok=bBddDMJ7]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Dan Goldman sandfish]]></image_alt>                    <created>1449174375</created>          <gmt_created>2015-12-03 20:26:15</gmt_created>          <changed>1475894414</changed>          <gmt_changed>2016-10-08 02:40:14</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.physics.gatech.edu/research/goldman/]]></url>        <title><![CDATA[Daniel Goldman]]></title>      </link>          <link>        <url><![CDATA[http://www.physics.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech School of Physics]]></title>      </link>          <link>        <url><![CDATA[http://dx.doi.org/10.1126/science.1172490]]></url>        <title><![CDATA[Science article]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="7118"><![CDATA[desert]]></keyword>          <keyword tid="7123"><![CDATA[drag]]></keyword>          <keyword tid="987"><![CDATA[imaging]]></keyword>          <keyword tid="7121"><![CDATA[kinematics]]></keyword>          <keyword tid="7116"><![CDATA[lizard]]></keyword>          <keyword tid="377"><![CDATA[locomotion]]></keyword>          <keyword tid="1383"><![CDATA[model]]></keyword>          <keyword tid="960"><![CDATA[physics]]></keyword>          <keyword tid="169242"><![CDATA[sand]]></keyword>          <keyword tid="169581"><![CDATA[sandfish]]></keyword>          <keyword tid="170845"><![CDATA[scincus]]></keyword>          <keyword tid="170846"><![CDATA[skink]]></keyword>          <keyword tid="170847"><![CDATA[slither]]></keyword>   
       <keyword tid="169001"><![CDATA[Snake]]></keyword>          <keyword tid="7122"><![CDATA[thrust]]></keyword>          <keyword tid="7119"><![CDATA[undulation]]></keyword>          <keyword tid="7120"><![CDATA[wave]]></keyword>          <keyword tid="1448"><![CDATA[x-ray]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="46280">  <title><![CDATA[Clinical Trial Shows That Quadriplegics Can Use Tongue Drive System]]></title>  <uid>27206</uid>  <body><![CDATA[<p>An assistive technology that enables individuals to maneuver a powered wheelchair or control a mouse cursor using simple tongue movements can be operated by individuals with high-level spinal cord injuries, according to the results of a recently completed clinical trial.</p><p>"This clinical trial has validated that the Tongue Drive system is intuitive and quite simple for individuals with high-level spinal cord injuries to use," said Maysam Ghovanloo, an assistant professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology. "Trial participants were able to easily remember and correctly issue tongue commands to play computer games and drive a powered wheelchair around an obstacle course with very little prior training."</p><p>At the annual conference of the Rehabilitation Engineering and Assistive Technology Society of North America (RESNA) on June 26, the researchers reported the results of the first five clinical trial subjects to use the Tongue Drive system. The trial was conducted at the Shepherd Center, an Atlanta-based catastrophic care hospital, and funded by the National Science Foundation and the Christopher and Dana Reeve Foundation.</p><p>The clinical trial tested the ability of these individuals with tetraplegia, as a result of high-level spinal cord injuries (cervical vertebrae C3-C5), to perform tasks related to computer access and wheelchair navigation -- using only their tongue movements. </p><p>At the beginning of each trial, Ghovanloo and graduate students Xueliang Huo and Chih-wen Cheng attached a small magnet -- the size of a grain of rice -- to the participant's tongue with tissue adhesive. Movement of this magnetic tracer was detected by an array of magnetic field sensors mounted on wireless headphones worn by the subject. The sensor output signals were wirelessly transmitted to a portable computer, which was carried on the wheelchair.</p><p>The signals were processed to determine the relative motion of the magnet with respect to the array of sensors in real-time. This information was then used to control the movements of the cursor on a computer screen or to substitute for the joystick function in a powered wheelchair. Details on use of the Tongue Drive for wheeled mobility were published in the June 2009 issue of the journal <em>IEEE Transactions on Biomedical Engineering</em>.</p><p>Ghovanloo chose the tongue to operate the system because unlike hands and feet, which are controlled by the brain through the spinal cord, the tongue is directly connected to the brain by a cranial nerve that generally escapes damage in severe spinal cord injuries or neuromuscular diseases.</p><p>Before using the Tongue Drive system, the subjects trained the computer to understand how they would like to move their tongues to indicate different commands. 
A unique set of specific tongue movements was tailored for each individual based on the user's abilities, oral anatomy and personal preferences. For the first computer test, the user issued commands to move the computer mouse left and right. Using these commands, each subject played a computer game that required moving a paddle horizontally to prevent a ball from hitting the bottom of the screen. </p><p>After adding two more commands to their repertoire -- up and down -- the subjects were asked to move the mouse cursor through an on-screen maze as quickly and accurately as possible.</p><p>Then the researchers added two more commands -- single and double mouse clicks -- to provide the subject with complete mouse functionality. When a randomly selected symbol representing one of the six commands appeared on the computer screen, the subject was instructed to issue that command within a specified time period. Each subject completed 40 trials for each time period.</p><p>After the computer sessions, the subjects were ready for the wheelchair driving exercise. Using forward, backward, right, left and stop/neutral tongue commands, the subjects maneuvered a powered wheelchair through an obstacle course. </p><p>The obstacle course contained 10 turns and was longer than a professional basketball court. Throughout the course, the users had to perform navigation tasks such as making a U-turn, backing up and fine-tuning the direction of the wheelchair in a limited space. Subjects were asked to navigate through the course as fast as they could, while avoiding collisions. </p><p>Each subject operated the powered wheelchair using two different control strategies: discrete mode, which was designed for novice users, and continuous mode for more experienced users. In discrete mode, if the user issued the command to move forward and then wanted to turn right, the user would have to stop the wheelchair before issuing the command to turn right. The stop command was selected automatically when the tongue returned to its resting position, bringing the wheelchair to a standstill.</p><p>"Discrete mode is a safety feature particularly for novice users, but it reduces the agility of the wheelchair movement," explained Ghovanloo. "In continuous mode, however, the user is allowed to steer the powered wheelchair to the left or right as it is moving forward and backward, thus making it possible to follow a curve."</p><p>Each subject completed the course at least twice using each strategy while the researchers recorded the navigation time and number of collisions. Using discrete control, the average speed for the five subjects was 5.2 meters per minute and the average number of collisions was 1.8. Using continuous control, the average speed was 7.7 meters per minute and the average number of collisions was 2.5.</p><p>While this initial performance trial only required six tongue commands, the Tongue Drive system can potentially capture a large number of tongue movements, each of which can represent a different user command. The ability to train the system with as many commands as an individual can comfortably remember and having all of the commands available to the user at the same time are significant advantages over the common sip-n-puff device that acts as a simple switch controlled by sucking or blowing through a straw. </p><p>Some sip-n-puff users also consider the straw to be a symbol of their disability. 
Since Tongue Drive users simply wear headphones that are commonly worn to listen to music, the system is more acceptable to potential users.</p><p>John Anschutz, manager of the assistive technology program at the Shepherd Center, identified advantages the Tongue Drive system has over the tongue-touch keypad.</p><p>"The Tongue Drive system seems to be much more supportable if there were a failure of some component within the system. With the old tongue-touch keypad, if the system went down then the user lost all of the functions of the wheelchair, phone, computer and environmental control," explained Anschutz. "Ghovanloo's approach should be much more repairable should a fault arise, which is critical for systems for which so much function is depended upon."  </p><p>A future system upgrade will be to move the sensors inside the user's mouth, according to Ghovanloo. This will be an important step for users who are very impaired and cannot reposition the system for best results, according to Anschutz. </p><p>"All of the subjects successfully completed the computer and powered wheelchair navigation tasks with their tongues without difficulty, which demonstrates that the Tongue Drive system can potentially provide individuals unable to move their arms and hands with effective control over a wide variety of devices they use in their daily lives," said Ghovanloo.</p><p><strong>Research News &amp; Publications Office<br />Georgia Institute of Technology<br />75 Fifth Street, N.W., Suite 100<br />Atlanta, Georgia  30308  USA</strong></p><p>Media Relations Contacts: Abby Vogel (404-385-3364); E-mail: (<a href="mailto:avogel@gatech.edu">avogel@gatech.edu</a>) or John Toon (404-894-6986); E-mail: (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).</p><p><strong>Technical Contact:</strong> Maysam Ghovanloo (404-385-7048); E-mail: (<a href="mailto:mgh@gatech.edu">mgh@gatech.edu</a>)</p><p><strong>Writer:</strong> Abby Vogel</p>]]></body>  <author>Abby Vogel Robinson</author>  <status>1</status>  <created>1246838400</created>  <gmt_created>2009-07-06 00:00:00</gmt_created>  <changed>1475895794</changed>  <gmt_changed>2016-10-08 03:03:14</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Clinical trial shows tongue drive system assists disabled.]]></teaser>  <type>news</type>  <sentence><![CDATA[Clinical trial shows tongue drive system assists disabled.]]></sentence>  <summary><![CDATA[An assistive technology that enables individuals to maneuver a powered wheelchair or control a mouse cursor using simple tongue movements can be operated by individuals with high-level spinal cord injuries, according to the results of a recently completed clinical trial.]]></summary>  <dateline>2009-07-06T00:00:00-04:00</dateline>  <iso_dateline>2009-07-06T00:00:00-04:00</iso_dateline>  <gmt_dateline>2009-07-06 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Study Participants Used System to Operate Powered Wheelchair and Computer]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[avogel@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<strong>Abby Vogel</strong><br />Research News and Publications<br /><a href="http://www.gatech.edu/contact/index.html?id=avogel6">Contact Abby Vogel</a><br /><strong>404-385-3364</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>46281</item>          <item>46282</item>          <item>46283</item>      </media>  <hg_media>          <item>          
<nid>46281</nid>          <type>image</type>          <title><![CDATA[Cruise Bogle - wheelchair obstacle course]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ttd83741.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ttd83741_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ttd83741_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ttd83741_0.jpg?itok=LayQ06fU]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Cruise Bogle - wheelchair obstacle course]]></image_alt>                    <created>1449174375</created>          <gmt_created>2015-12-03 20:26:15</gmt_created>          <changed>1475894414</changed>          <gmt_changed>2016-10-08 02:40:14</gmt_changed>      </item>          <item>          <nid>46282</nid>          <type>image</type>          <title><![CDATA[Cruise Bogle training session]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tze83742.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tze83742_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tze83742_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tze83742_0.jpg?itok=96vegcag]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Cruise Bogle training session]]></image_alt>                    <created>1449174375</created>          <gmt_created>2015-12-03 20:26:15</gmt_created>          <changed>1475894414</changed>          <gmt_changed>2016-10-08 02:40:14</gmt_changed>      </item>          <item>          <nid>46283</nid>          <type>image</type>          <title><![CDATA[Cruise Bogle and GT researchers]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[txn83742.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/txn83742_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/txn83742_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/txn83742_0.jpg?itok=u_Pt7laj]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Cruise Bogle and GT researchers]]></image_alt>                    <created>1449174375</created>          <gmt_created>2015-12-03 20:26:15</gmt_created>          <changed>1475894414</changed>          <gmt_changed>2016-10-08 02:40:14</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.ece.gatech.edu/faculty-staff/fac_profiles/bio.php?id=147]]></url>        <title><![CDATA[Maysam Ghovanloo]]></title>      </link>          <link>        <url><![CDATA[http://www.ece.gatech.edu/]]></url>        <title><![CDATA[School of Electrical and Computer Engineering]]></title>      </link>          <link>        <url><![CDATA[http://dx.doi.org/10.1109/TBME.2009.2018632]]></url>        <title><![CDATA[IEEE Transactions on Biomedical Engineering paper]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group> 
     </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="2652"><![CDATA[assistive]]></keyword>          <keyword tid="439"><![CDATA[computer]]></keyword>          <keyword tid="7134"><![CDATA[cord]]></keyword>          <keyword tid="2646"><![CDATA[disabled]]></keyword>          <keyword tid="521"><![CDATA[injury]]></keyword>          <keyword tid="7132"><![CDATA[magnet]]></keyword>          <keyword tid="7131"><![CDATA[quadriplegic]]></keyword>          <keyword tid="167318"><![CDATA[sensor]]></keyword>          <keyword tid="170848"><![CDATA[spinal]]></keyword>          <keyword tid="623"><![CDATA[Technology]]></keyword>          <keyword tid="7135"><![CDATA[tetraplegia]]></keyword>          <keyword tid="7130"><![CDATA[tongue]]></keyword>          <keyword tid="1652"><![CDATA[wheelchair]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="46327">  <title><![CDATA[McMurray Tapped to Lead GTRI?s Food Processing Technology Division]]></title>  <uid>27206</uid>  <body><![CDATA[<p>Gary McMurray, a long-time research engineer with the Georgia Tech Research Institute (GTRI), has been appointed chief of GTRI's Food Processing Technology Division, succeeding Craig Wyvill, who retired in April.</p><p>McMurray brings to his new position two decades of experience designing and building advanced robotic systems for the food, transportation and biomedical industries.</p><p>"Gary has the vision to diversify our revenues and expand our critical Agricultural Technology Research Program (ATRP), which is one of the major activities within the Food Processing Technology Division," said Rusty Roberts, director of the Aerospace, Transportation and Advanced Systems (ATAS) Laboratory, which oversees the division.</p><p>Ranked as one of the top programs of its kind in the country, ATRP works closely with Georgia agribusiness, especially the poultry industry, to develop new technologies and adapt existing ones for specialized industrial needs. Researchers focus efforts on both immediate and long-term industrial needs, ranging from advanced robotic systems to improved wastewater treatment technologies to machine-vision grading and rapid microbial detection. </p><p>McMurray currently leads a project to develop a "smart" deboning system. The system uses computer vision and other sensing technologies to recognize and react to size and shape differences of a carcass to perform precision cuts that optimize yield (the amount of meat removed from the bone) while reducing the risk of bone fragments in finished product.</p><p>The Food Processing Technology Division also conducts significant industrial research under Georgia's Traditional Industries Program for Food Processing, which is managed through the Food Processing Advisory Council (FoodPAC). 
FoodPAC enhances the competitiveness of Georgia's food industry, and through the Traditional Industries Program, has helped GTRI to commercialize some of its developments while also adapting them to the needs of such industries as bakeries and fruit processors.</p><p>While food processing technologies remain the division's research priority, funding from the Georgia Department of Transportation has allowed researchers to develop technologies for the transportation industry. For one project, GTRI researchers developed a system capable of automatically placing reflective pavement markers along highway lane stripes from a moving truck.</p><p>Since division researchers have core expertise in automation, information technology, food safety, worker safety and environmental technology, McMurray plans to further expand the division's research focuses into areas including biomedical devices, unmanned and autonomous systems, and biofuels.</p><p>"We are mechanical engineers, electrical engineers, software engineers, image processing experts and many of our core competencies transfer very nicely into areas outside of food processing," said McMurray.</p><p>McMurray has personally initiated collaborations with physicians at Emory University to develop new technology to support doctors performing minimally invasive procedures and add new functionality to these procedures. </p><p>He is currently developing a new breed of endoscope -- the medical devices used to inspect spaces inside the body -- that will allow doctors to focus their attention on inspecting the space rather than manipulating the medical device. For colonoscopies, doctors must currently guide a specialized endoscope through the patient's colon by pushing the endoscope and controlling the orientation of the instrument's tip while simultaneously watching a video monitor that displays images captured by the endoscope's camera. </p><p>Division researchers are also collaborating with other ATAS researchers to develop and test unmanned and autonomous systems. These systems are recognized as critical components to all aspects of modern warfare across the joint forces, and they are growing in mission effectiveness. </p><p>In addition to leading the division's research efforts, McMurray will also lead a $3 million fundraising campaign to expand the 36,000-square-foot Food Processing Technology Building by an extra 10,000 square feet. Bettcher Industries, Inc., a world leader in designing and manufacturing food processing equipment and cutting tools, was the first company to support the construction with a donation of $125,000.</p><p>"While the building holds facilities to conduct research in automation technology, information technology and environmental systems, it's not large enough for our food safety, human factors and bioprocessing research," explained McMurray.</p><p>McMurray earned his bachelor's and master's degrees in mechanical engineering from Georgia Tech in 1985 and 1987, respectively. 
He lives in Smyrna with his wife Stephanie -- also a Georgia Tech graduate -- and sons Ben, 7, and Alex, 5.</p><p><strong>Research News &amp; Publications Office<br />Georgia Institute of Technology<br />75 Fifth Street, N.W., Suite 100<br />Atlanta, Georgia  30308  USA</strong></p><p>Media Relations Contacts: Abby Vogel (404-385-3364); E-mail: (<a href="mailto:avogel@gatech.edu">avogel@gatech.edu</a>); Kirk Englehardt (404-407-7280); E-mail: (<a href="mailto:kirkeng@gatech.edu">kirkeng@gatech.edu</a>); or John Toon (404-894-6986); E-mail: (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).</p><p><strong>Writer:</strong> Abby Vogel</p>]]></body>  <author>Abby Vogel Robinson</author>  <status>1</status>  <created>1242086400</created>  <gmt_created>2009-05-12 00:00:00</gmt_created>  <changed>1475895794</changed>  <gmt_changed>2016-10-08 03:03:14</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[McMurray lead GTRI's Food Processing Technology Division]]></teaser>  <type>news</type>  <sentence><![CDATA[McMurray lead GTRI's Food Processing Technology Division]]></sentence>  <summary><![CDATA[Gary McMurray, a long-time research engineer with the Georgia Tech Research Institute (GTRI), has been appointed chief of GTRI's Food Processing Technology Division, succeeding Craig Wyvill, who retired in April.]]></summary>  <dateline>2009-05-14T00:00:00-04:00</dateline>  <iso_dateline>2009-05-14T00:00:00-04:00</iso_dateline>  <gmt_dateline>2009-05-14 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[McMurray Spent Two Decades Designing and Building Advanced Robotic Systems for the Food, Transportation and Biomedical Industries]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[avogel@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<strong>Abby Vogel</strong><br />Research News and Publications<br /><a href="http://www.gatech.edu/contact/index.html?id=avogel6">Contact Abby Vogel</a><br /><strong>404-385-3364</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>46328</item>          <item>46329</item>          <item>46330</item>      </media>  <hg_media>          <item>          <nid>46328</nid>          <type>image</type>          <title><![CDATA[Gary McMurray]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tbl35227.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tbl35227_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tbl35227_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tbl35227_0.jpg?itok=lVS2KDM5]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Gary McMurray]]></image_alt>                    <created>1449174401</created>          <gmt_created>2015-12-03 20:26:41</gmt_created>          <changed>1475894416</changed>          <gmt_changed>2016-10-08 02:40:16</gmt_changed>      </item>          <item>          <nid>46329</nid>          <type>image</type>          <title><![CDATA[Gary McMurray]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tuh36582.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tuh36582_0.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tuh36582_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tuh36582_0.jpg?itok=JB53OHD1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Gary McMurray]]></image_alt>                    <created>1449174401</created>          <gmt_created>2015-12-03 20:26:41</gmt_created>          <changed>1475894416</changed>          <gmt_changed>2016-10-08 02:40:16</gmt_changed>      </item>          <item>          <nid>46330</nid>          <type>image</type>          <title><![CDATA[Gary McMurray endoscope]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tza36670.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tza36670_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/tza36670_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tza36670_0.jpg?itok=XlAM_35y]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Gary McMurray endoscope]]></image_alt>                    <created>1449174401</created>          <gmt_created>2015-12-03 20:26:41</gmt_created>          <changed>1475894416</changed>          <gmt_changed>2016-10-08 02:40:16</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://foodtech.gatech.edu/]]></url>        <title><![CDATA[GTRI Food Processing Technology Division]]></title>      </link>          <link>        <url><![CDATA[http://www.gtri.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech Research Institute]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="130"><![CDATA[Alumni]]></category>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="132"><![CDATA[Institute Leadership]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="130"><![CDATA[Alumni]]></term>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="132"><![CDATA[Institute Leadership]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="669"><![CDATA[agriculture]]></keyword>          <keyword tid="670"><![CDATA[atrp]]></keyword>          <keyword tid="116"><![CDATA[food]]></keyword>          <keyword tid="671"><![CDATA[foodpac]]></keyword>          <keyword tid="665"><![CDATA[gary]]></keyword>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="666"><![CDATA[mcmurray]]></keyword>          <keyword tid="668"><![CDATA[poultry]]></keyword>          <keyword tid="195"><![CDATA[processing]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="168"><![CDATA[Transportation]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  
<news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="389951">  <title><![CDATA[Snake robots learn to turn by following the lead of real sidewinders]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Researchers at Carnegie Mellon University (CMU) who develop snake-like robots have picked up a few tricks from real sidewinder rattlesnakes on how to make rapid and even sharp turns with their undulating, modular device.</p><p>Working with colleagues at the Georgia Institute of Technology and Zoo Atlanta, they have analyzed the motions of sidewinders and tested their observations on CMU’s snake robots. They showed how the complex motion of a sidewinder can be described in terms of two wave motions – vertical and horizontal body waves – and how changing the phase and amplitude of the waves enables snakes to achieve exceptional maneuverability.</p><p>“We’ve been programming snake robots for years and have figured out how to get these robots to crawl amidst rubble and through or around pipes,” said Howie Choset, professor at CMU’s Robotics Institute. “By learning from real sidewinders, however, we can make these maneuvers much more efficient and simplify user control. This makes our modular robots much more valuable as tools for urban search-and-rescue tasks, power plant inspections and even archaeological exploration.”</p><p>Their findings are being published this week in the <em>Proceedings of the National Academy of Sciences</em> Early Edition.</p><p>The work is a continuation of collaboration between Choset; Daniel Goldman, a Georgia Tech associate professor of physics, and Joseph Mendelson III, director of research at Zoo Atlanta. An earlier study, published on Oct. 10, 2014, in the journal <em>Science</em>, analyzed the ability of sidewinders to quickly climb sandy slopes. It showed that despite the snake’s hundreds of body elements and thousands of muscles, the sidewinding motion could be simply modeled as a combination of a vertical and horizontal body wave.</p><p>With the model in hand and with a method to measure the movements of living snakes, the team, led by Henry Astley, a postdoctoral researcher in Goldman’s group, was able to observe that sidewinders make gradual changes in direction by altering the horizontal wave while keeping the vertical wave constant. They also discovered that making a large phase shift in the vertical wave enabled the snake to make a sharp turn in the opposite direction.</p><p>Applying these controls to the robot allowed the robot to replicate the turns of the snake, while also simplifying control.</p><p>“By looking for insights in nature, we were able to dramatically improve the control and maneuverability of the robot,” Astley said, “while at the same time using the robot as a tool to test the theorized control mechanisms of biological sidewinders.”</p><p>The modular snake robot used in this study was specifically designed to pass horizontal and vertical waves through its body to move in three-dimensional spaces. The robot is two inches in diameter and 37 inches long; its body consists of 16 joints, each joint arranged perpendicular to the previous one. That allows it to assume a number of configurations and to move using a variety of gaits – some similar to those of a biological snake.</p><p>This research was supported by the National Science Foundation, the Army Research Office, the Georgia Tech School of Biology and the Elizabeth Smithgall Watts Endowment. 
In addition to those already named, the research team included Miguel Serrano, Patricio Vela and David L. Hu of Georgia Tech, and Chaohui Gong, Jin Dai and Matthew Travers of Carnegie Mellon.</p><p><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia&nbsp; 30332-0181</strong><br /><br /><strong>Media Relations Contacts</strong>: John Toon, Georgia Tech: (404-894-6986) (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>) or Byron Spice, Carnegie Mellon (412-268-9068) (<a href="mailto:bspice@cs.cmu.edu">bspice@cs.cmu.edu</a>).</p><p><strong>Writer</strong>: Byron Spice, Carnegie Mellon</p><p>&nbsp;</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1427188908</created>  <gmt_created>2015-03-24 09:21:48</gmt_created>  <changed>1475895780</changed>  <gmt_changed>2016-10-08 03:03:00</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers who develop snake-like robots have picked up a few tricks from real sidewinder rattlesnakes on how to make rapid and even sharp turns with their undulating, modular device.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers who develop snake-like robots have picked up a few tricks from real sidewinder rattlesnakes on how to make rapid and even sharp turns with their undulating, modular device.]]></sentence>  <summary><![CDATA[<p>Researchers at Carnegie Mellon University who develop snake-like robots have picked up a few tricks from real sidewinder rattlesnakes on how to make rapid and even sharp turns with their undulating, modular device.</p>]]></summary>  <dateline>2015-03-24T00:00:00-04:00</dateline>  <iso_dateline>2015-03-24T00:00:00-04:00</iso_dateline>  <gmt_dateline>2015-03-24 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>389911</item>          <item>389921</item>          <item>389931</item>      </media>  <hg_media>          <item>          <nid>389911</nid>          <type>image</type>          <title><![CDATA[Sidewinder study]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[sidewinder023_0.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/sidewinder023_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/sidewinder023_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/sidewinder023_0.jpg?itok=1ST8KXgn]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sidewinder study]]></image_alt>                    <created>1449246312</created>          <gmt_created>2015-12-04 16:25:12</gmt_created>          <changed>1475894378</changed>          <gmt_changed>2016-10-08 02:39:38</gmt_changed>      </item>          <item>          <nid>389921</nid>          <type>image</type>          <title><![CDATA[Sidewinder study2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[sidewinder020.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/sidewinder020.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/sidewinder020.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/sidewinder020.jpg?itok=Ef7Nj-li]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sidewinder study2]]></image_alt>                    <created>1449246312</created>          <gmt_created>2015-12-04 16:25:12</gmt_created>          <changed>1475894349</changed>          <gmt_changed>2016-10-08 02:39:09</gmt_changed>      </item>          <item>          <nid>389931</nid>          <type>image</type>          <title><![CDATA[Snake robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[snake_robot.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/snake_robot.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/snake_robot.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/snake_robot.jpg?itok=GSjKa8Rl]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Snake robot]]></image_alt>                    <created>1449246312</created>          <gmt_created>2015-12-04 16:25:12</gmt_created>          <changed>1475894349</changed>          <gmt_changed>2016-10-08 02:39:09</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="47881"><![CDATA[Dan Goldman]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="170833"><![CDATA[sidwinder]]></keyword>          <keyword tid="169244"><![CDATA[snake robot]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="71363">  <title><![CDATA[Robot Fetches Objects With Just a Point and a Click]]></title>  <uid>27281</uid>  <body><![CDATA[<p>Robots are fluent in their native language of 1 and 0 absolutes but struggle to grasp the nuances and imprecise nature of human language. 
While scientists are making slow, incremental progress in their quest to create a robot that responds to speech, gestures and body language, a more straightforward method of communication may help robots find their way into homes sooner.</p><p>A team of researchers led by Charlie Kemp, director of the Center for Healthcare Robotics in the Health Systems Institute at the Georgia Institute of Technology and Emory University, have found a way to instruct a robot to find and deliver an item it may have never seen before using a more direct manner of communication - a laser pointer.</p><p>El-E (pronounced like the name Ellie), a robot designed to help users with limited mobility with everyday tasks, autonomously moves to an item selected with a green laser pointer, picks up the item and then delivers it to the user, another person or a selected location such as a table. El-E, named for her ability to elevate her arm and for the arm's resemblance to an elephant trunk, can grasp and deliver several types of household items including towels, pill bottles and telephones from floors or tables.</p><p>To ensure that El-E will someday be ready to roll out of the lab and into the homes of patients who need assistance, the Georgia Tech and Emory research team includes Prof. Julie Jacko, an expert on human-computer interaction and assistive technologies, and Dr. Jonathan Glass, director of the Emory ALS Center at the Emory University School of Medicine. El-E's creators are gathering input from ALS (also known as Lou Gehrig's disease) patients and doctors to prepare El-E to assist patients with severe mobility challenges.</p><p>The research was presented at the ACM/IEEE International Conference on Human-Robot Interaction in Amsterdam on March 14 and an associated workshop on 'Robotic Helpers' on March 12.</p><p>The verbal instructions a person gives to help someone find a desired object are very difficult for a robot to use (the cup over near the couch or the brush next to the red toothbrush). These types of commands require the robot to understand everyday human language and the objects it describes at a level well beyond the state of the art in language recognition and object perception.</p><p>"We humans naturally point at things but we aren't very accurate, so we use the context of the situation or verbal cues to clarify which object is important," said Kemp, an assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory. "Robots have some ability to retrieve specific, predefined objects, such as a soda can, but retrieving generic everyday objects has been a challenge for robots."</p><p>The laser pointer interface and methods developed by Kemp's team overcome this challenge by providing a direct way for people to communicate the location of interest to El-E and complimentary methods that enable El-E to pick up an object found at this location. Through these innovations, El-E can retrieve objects without understanding what the object is or what it's called.</p><p>In addition to the laser pointer interface, El-E uses another approach to simplify its task. Indoors, objects are usually found on smooth, flat surfaces with uniform appearance, such as floors, tables, and shelves. Kemp's team designed El-E to take advantage of this common structure.</p><p>Regardless of the height, El-E uses the same strategies to localize and pick up the object by elevating its arm and sensors to match the height of the object's location. 
The robot's ability to reach objects both from the floor and shelves is particularly important for patients with mobility impairments since these locations can be difficult to reach, Kemp said.</p><p>El-E uses a custom-built camera that is omni-directional to see most of the room. After the robot detects that a selection has been made with the laser pointer, the robot moves two cameras to look at the laser spot and triangulate its position in three-dimensional space.</p><p>Next, the robot estimates where the item is in relation to its body and travels to the location. If the location is above the floor, the robot finds the edge of the surface on which the object is sitting, such as the edge of a table.</p><p>Picking up the unknown object is a significant challenge El-E faces in completing its task. It uses a laser range finder that scans across the surface to initially locate the object. Then, after moving its hand above the object, it uses a camera in its hand to visually distinguish the object from the texture of the floor or table. After refining the hand's position and orientation, it descends upon the object while using sensors in its hand to decide when to stop moving down and start closing its gripper. Finally, it closes its gripper upon the object until it has a secure grip.</p><p>Once the robot has picked up the item, the laser pointer can be used to guide the robot to another location to deposit the item or direct the robot to take the item to a person. El-E distinguishes between these two situations by looking for a face near the selected location.</p><p>If the robot detects a face, it carefully moves toward the person and presents the item to the user so it can be taken. It uses the location of the face and legs to determine where it will present the object.</p><p>If no face is detected near the location illuminated by the laser pointer, the robot decides whether the location is on a table or the floor. If it is on a table, El-E places the object on the table. If the location is on the floor El-E moves to the selected location on the floor.</p><p>After delivering the item, the robot returns to the user's side, ready to handle the next request.</p><p>El-E's power and computation is all on board (no tethers or hidden computers in the next room) and runs Ubuntu Linux on a Mac mini.</p><p>El-E's laser pointer interface and methods for autonomous mobile manipulation represent an important step toward robotic assistants in the home.</p><p>"If you want a robot to cook a meal or brush your hair, you will probably want the robot to first fetch the items it will need, and for tasks such as cleaning up around the home, it is essential that the robot be able to pick up objects and move them to new locations. 
We see object fetching as a core capability for future robots in healthcare settings, such as the home," Kemp said.</p><p>The Georgia Tech and Emory research team is now working to help El-E expand its capabilities to include switching lights on and off when the user selects a light switch and opening and closing doors when the user selects a door knob.</p>]]></body>  <author>Lisa Grovenstein</author>  <status>1</status>  <created>1205884800</created>  <gmt_created>2008-03-19 00:00:00</gmt_created>  <changed>1475895738</changed>  <gmt_changed>2016-10-08 03:02:18</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Robot designed to aid patients with limited movement]]></teaser>  <type>news</type>  <sentence><![CDATA[Robot designed to aid patients with limited movement]]></sentence>  <summary><![CDATA[<p>Researchers at Georgia Tech and Emory University have created a robot, designed to help users with limited mobility with everyday tasks, that moves autonomously to an item selected with a green laser pointer, picks up the item and then delivers it to the user, another person or a selected location such as a table. The new robotic communication method may help robots find their way into the home sooner.</p>]]></summary>  <dateline>2008-03-19T00:00:00-04:00</dateline>  <iso_dateline>2008-03-19T00:00:00-04:00</iso_dateline>  <gmt_dateline>2008-03-19 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[lisa.grovenstein@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Lisa Grovenstein</strong><br />Communications &amp; Marketing<br /><a href="http://www.gatech.edu/contact/index.html?id=lgrovenste3">Contact Lisa Grovenstein</a><br /><strong>404-894-8835</strong></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>71364</item>          <item>71365</item>      </media>  <hg_media>          <item>          <nid>71364</nid>          <type>image</type>          <title><![CDATA[El-E and Dr. Kemp]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177367</created>          <gmt_created>2015-12-03 21:16:07</gmt_created>          <changed>1475894634</changed>          <gmt_changed>2016-10-08 02:43:54</gmt_changed>      </item>          <item>          <nid>71365</nid>          <type>image</type>          <title><![CDATA[El-E]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177367</created>          <gmt_created>2015-12-03 21:16:07</gmt_created>          <changed>1475894634</changed>          <gmt_changed>2016-10-08 02:43:54</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.hsi.gatech.edu/cckemp/]]></url>        <title><![CDATA[Dr. 
Charlie Kemp]]></title>      </link>          <link>        <url><![CDATA[http://www.neurology.emory.edu/als]]></url>        <title><![CDATA[Emory ALS Center]]></title>      </link>          <link>        <url><![CDATA[http://bme.gatech.edu/]]></url>        <title><![CDATA[Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University]]></title>      </link>          <link>        <url><![CDATA[http://www.hsi.gatech.edu/hrl/]]></url>        <title><![CDATA[Healthcare Robotics Lab]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="2156"><![CDATA[ALS]]></keyword>          <keyword tid="2158"><![CDATA[Center for Healthcare Robotics]]></keyword>          <keyword tid="2157"><![CDATA[Charlie Kemp]]></keyword>          <keyword tid="2154"><![CDATA[El-E]]></keyword>          <keyword tid="2155"><![CDATA[healthcare robotics]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="72283">  <title><![CDATA[Georgia Tech to Host Music Technology Symposium]]></title>  <uid>27304</uid>  <body><![CDATA[<p>Georgia Tech's College of Architecture Dean's Symposium on the Changing Nature of Practice will focus on the emerging developments in music technology that promise to revolutionize musical performance, composition, analysis, and education.  </p><p>"This symposium focuses on music technology, and the College of Architecture Music Department just introduced Tech's first degree program in music," said Dean Thomas Galloway, College of Architecture.  "The symposium helps us roll out our master's degree in music technology and demonstrates to the arts community throughout Georgia and beyond that music is alive and well at Georgia Tech."</p><p>The symposium, which will be held March 3, 2007, will highlight three areas of interest in each session. The morning will begin with a session entitled 'Technology Meets Tradition: The Impact of Technology on Music Education'.  The second session discusses 'Cognition and Analysis: The "Why" of Music'. The third session,'Making Music and Performance,' will follow lunch. The final session of the day will focus on the relationship between 'Music and Architecture'.</p><p>"The Dean's Symposium is a wonderful event with a number of substantive outcomes," said Frank Clark, director of the Music Department. "Each year the event brings hundreds of visitors to the Tech campus, produces meaningful scholarship, generates debate, adds to our visibility and credibility, and celebrates the diversity and richness of the Georgia Tech College of Architecture."</p><p>Organizers are expecting a wide array of attendees, including several from other Georgia universities. 
Presenters include Georgia Tech professors Parag Chordia, Athanassios Economou, Jason Freeman, Ronald Lewcock, Jerry Ulrich, Bruce Walker, Gil Weinberg, and Music Department Director Frank Clark. Other presenters include David Huron, The Ohio State University; George Lewis, Columbia University; Henry Panion III, University of Alabama-Birmingham; Thomas Rudolph, director of music at School District of Haverford Township (PA); Pierre Ruhe, music critic for the Atlanta Journal-Constitution; and Jessica Peek Sherwood, Sonic Generator (flutist).</p><p>These scholars and practitioners will discuss ideas and demonstrate developments in areas ranging from new interfaces for musical expression and algorithmic composition to music information retrieval, music networks, and machine musicianship.</p><p>The College of Architecture has a unique relationship with its Music Department, and together they are forging a new future for Georgia Tech and music. So what's the future of Tech's music program?</p><p>"That's a question we ask every day and there are so many answers: new degree programs, new classes, new ensembles, groundbreaking research, innovative instruments, new modes of expression, and new partnerships combining music, architecture, computing, engineering, science and math," said Clark. "The future of music at Tech is ours to write, and I sincerely hope that it will be an Institute-wide composition."</p><p>The symposium is jointly sponsored by the Georgia Tech College of Architecture and the College of Architecture Alumni Committee and is organized by the College and its Music Department.</p>]]></body>  <author>Matthew Nagel</author>  <status>1</status>  <created>1172797200</created>  <gmt_created>2007-03-02 01:00:00</gmt_created>  <changed>1475895697</changed>  <gmt_changed>2016-10-08 03:01:37</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Dean's Symposium to discuss music technology]]></teaser>  <type>news</type>  <sentence><![CDATA[Dean's Symposium to discuss music technology]]></sentence>  <summary><![CDATA[Georgia Tech's College of Architecture Dean's Symposium on the Changing Nature of Practice will focus on the emerging developments in music technology that promise to revolutionize musical performance, composition, analysis, and education.]]></summary>  <dateline>2007-03-02T00:00:00-05:00</dateline>  <iso_dateline>2007-03-02T00:00:00-05:00</iso_dateline>  <gmt_dateline>2007-03-02 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Annual College of Architecture symposium to discuss music technology]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[matthew.nagel@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Georgia Tech Media Relations</strong><br />Laura Diamond<br /><a href="mailto:laura.diamond@comm.gatech.edu">laura.diamond@comm.gatech.edu</a><br />404-894-6016<br />Jason Maderer<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-660-2926</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>72284</item>      </media>  <hg_media>          <item>          <nid>72284</nid>          <type>image</type>          <title><![CDATA[Gil Weinberg, Director of Music Technology at Geor]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            
<image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177454</created>          <gmt_created>2015-12-03 21:17:34</gmt_created>          <changed>1475894653</changed>          <gmt_changed>2016-10-08 02:44:13</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.coa.gatech.edu/symposium/]]></url>        <title><![CDATA[College of Architecture Dean\'s Symposium]]></title>      </link>          <link>        <url><![CDATA[http://www.coa.gatech.edu/music/]]></url>        <title><![CDATA[Georgia Tech Music Department]]></title>      </link>          <link>        <url><![CDATA[http://www.coa.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech College of Architecture]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="133"><![CDATA[Special Events and Guest Speakers]]></category>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="148"><![CDATA[Music and Music Technology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="133"><![CDATA[Special Events and Guest Speakers]]></term>          <term tid="134"><![CDATA[Student and Faculty]]></term>          <term tid="148"><![CDATA[Music and Music Technology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="926"><![CDATA[College of Architecture]]></keyword>          <keyword tid="2078"><![CDATA[dean]]></keyword>          <keyword tid="1934"><![CDATA[Frank Clark]]></keyword>          <keyword tid="109"><![CDATA[Georgia Tech]]></keyword>          <keyword tid="1939"><![CDATA[Gil Weinberg]]></keyword>          <keyword tid="1346"><![CDATA[Jason Freeman]]></keyword>          <keyword tid="1180"><![CDATA[Music]]></keyword>          <keyword tid="167061"><![CDATA[symposium]]></keyword>          <keyword tid="623"><![CDATA[Technology]]></keyword>          <keyword tid="2468"><![CDATA[Tom Galloway]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="72504">  <title><![CDATA[Georgia Tech Scores RoboCup 2007 for Atlanta]]></title>  <uid>27301</uid>  <body><![CDATA[<p>The Georgia Institute of Technology has been selected to host RoboCup 2007, the world's most renowned research competition among custom-built robots and their designers. <em>RoboCup 2007 Atlanta</em>, scheduled for July 1-10, 2007, marks the first time that the event, featuring simulated soccer and search-and-rescue competitions, will be hosted entirely on a college campus and only the second time in the United States. 
Past host cities for the international tournament include Paris (1998), Seattle (2001), Lisbon (2004), Osaka, Japan (2005) and Bremen, Germany (2006).</p><p>"As host of RoboCup 2007, Georgia Tech welcomes the international robotics community to Atlanta," said Georgia Tech College of Computing Associate Professor and <em>RoboCup 2007 Atlanta </em>General Chair Tucker Balch. "Over the past few years, Georgia Tech has emerged as a global leader in robotics research and innovation, based upon its partnerships with industry leaders and our strengths in interactive and intelligent computing. By hosting the 11th annual RoboCup competition, Georgia Tech will have a great opportunity to showcase the technology leadership of the Institute and the City of Atlanta to researchers and scientists worldwide."</p><p><em>RoboCup 2007 Atlanta</em> will include approximately 218 senior robotic teams, and 140 junior teams from over 20 countries. These international teams will participate in soccer games and search-and-rescue missions, testing the limits in artificial intelligence and robotics research. The annual event, with sponsors including Microsoft, Lockheed Martin and CITIZEN, involves about 1500 students and faculty from leading universities around the world, as well as 500 middle school and high school students.</p><p>This year's RoboCup event will also feature the debut of the Nanogram League, a competition between microscopic robots. The MEMs (MicroElectroMechanical Systems) in competition can only be viewed via microscope, but attendees will be able to watch the contest via a magnified broadcast shown on large screens throughout the event. </p><p>The overall mission of the RoboCup research and education initiative is to foster artificial intelligence and robotics research by providing a standard problem where a wide range of technologies can be examined and integrated. The international project has a founding goal of developing a team of fully autonomous humanoid robots that can win against the human World Cup champion team by the year 2050. </p><p><em>RoboCup 2007 Atlanta</em> invites the public to Georgia Tech to watch as teams put their robots to work competing in realistic search-and-rescue demonstrations, as well as four-legged and humanoid soccer games. </p><p><strong><em>RoboCup 2007 Atlanta </em>Schedule:</strong><br />July 1: RoboCup Opening Ceremony<br />July 2-6: RoboCup Qualifying Competitions<br />July 7-8: RoboCup Finals<br />July 9-10: RoboCup Symposium</p><p>Georgia Tech's Campus Recreation Center (CRC) will serve as the main venue for most RoboCup events. In addition, Technology Square Research Building (TSRB) will be the site for simulation events and Georgia Tech's Student Center will be the main venue for the RoboCup Junior event. </p><p>In addition to hosting <em>RoboCup 2007 Atlanta </em>this summer, Georgia Tech will also play host to several other robotics industry events, including the Robotics: Science and Systems (RSS) Conference, a Robot Camp for Elementary and High School students and an International Aerial Robotics Competition. </p><p>For more information about <em>RoboCup 2007 Atlanta</em>, please visit <a href='http://www.robocup-us.org'>http://www.robocup-us.org</a>. </p><p><strong>About the RoboCup</strong><br />RoboCup is an international research and education initiative. Its goal is to foster artificial intelligence and robotics research by providing a standard problem where a wide range of technologies can be examined and integrated. 
The concept of soccer-playing robots was first introduced in 1993. Following a two-year feasibility study, in August 1995, an announcement was made on the introduction of the first international conferences and soccer games. In July 1997, the first official conference and games were held in Nagoya, Japan. Followed by Paris, Stockholm, Melbourne, Seattle, Fukuoka/Busan, Padua, Lisbon, Osaka and Bremen, the annual events attracted many participants and spectators. This year, the 11th anniversary of RoboCup, the competition and symposium is being held in Atlanta, Georgia. For more details about this year's RoboCup including participants and updated schedule, visit <a href="http://www.robocup-us.org/" title="http://www.robocup-us.org/">http://www.robocup-us.org/</a>.</p>]]></body>  <author>Elizabeth Campell</author>  <status>1</status>  <created>1165885200</created>  <gmt_created>2006-12-12 01:00:00</gmt_created>  <changed>1475895697</changed>  <gmt_changed>2016-10-08 03:01:37</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Tech to host world's largest robotics competition]]></teaser>  <type>news</type>  <sentence><![CDATA[Tech to host world's largest robotics competition]]></sentence>  <summary><![CDATA[Georgia Tech will host RoboCup 2007, the world's most renowned research competition for custom-built robots. RoboCup 2007 Atlanta, to be held July 1-10, 2007, marks the first time that the event will be hosted entirely on a college campus.]]></summary>  <dateline>2006-12-12T00:00:00-05:00</dateline>  <iso_dateline>2006-12-12T00:00:00-05:00</iso_dateline>  <gmt_dateline>2006-12-12 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[International Robot Superstars to Converge on Atlanta for Worldï¿½s Largest Robotics Research Competition]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[lisa.grovenstein@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<strong>Lisa Grovenstein</strong><br />Communications &amp; Marketing<br /><a href="http://www.gatech.edu/contact/index.html?id=lgrovenste3">Contact Lisa Grovenstein</a><br /><strong>404-894-8835</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>72505</item>      </media>  <hg_media>          <item>          <nid>72505</nid>          <type>image</type>          <title><![CDATA[Four-legged robots play soccer]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177934</created>          <gmt_created>2015-12-03 21:25:34</gmt_created>          <changed>1475894658</changed>          <gmt_changed>2016-10-08 02:44:18</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.robocup-us.org/index.html]]></url>        <title><![CDATA[RoboCup 2007 Atlanta]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="133"><![CDATA[Special Events and Guest Speakers]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and 
Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="133"><![CDATA[Special Events and Guest Speakers]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="2355"><![CDATA[balch]]></keyword>          <keyword tid="2559"><![CDATA[CITIZEN]]></keyword>          <keyword tid="2029"><![CDATA[Competition]]></keyword>          <keyword tid="2558"><![CDATA[Lockheed Martin]]></keyword>          <keyword tid="2557"><![CDATA[mems]]></keyword>          <keyword tid="335"><![CDATA[Microsoft]]></keyword>          <keyword tid="2555"><![CDATA[nanogram]]></keyword>          <keyword tid="2554"><![CDATA[rescue]]></keyword>          <keyword tid="2353"><![CDATA[robocup]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="2552"><![CDATA[robotic]]></keyword>          <keyword tid="167751"><![CDATA[search]]></keyword>          <keyword tid="168894"><![CDATA[search and rescue]]></keyword>          <keyword tid="167723"><![CDATA[soccer]]></keyword>          <keyword tid="2354"><![CDATA[tucker]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="70863">  <title><![CDATA[Robotic Technology Inspired by Service Dogs]]></title>  <uid>27281</uid>  <body><![CDATA[<p>Service dogs, invaluable companions providing assistance to physically impaired individuals, are an elite and desired breed. Their presence in a home can make everyday tasks that are difficult - if not impossible - achievable, enhancing the quality of life for the disabled.</p><p>Yet with a cost averaging $16,000 per dog - not to mention the two years of training required to hone their skills - the demand for these canines exceeds their availability.</p><p>But what if these duties could be accomplished with an electronic companion that provides the same efficiency at a fraction of the cost?</p><p>Researchers at the Georgia Institute of Technology have engineered a biologically inspired robot that mirrors the actions of sought-after service dogs. Users verbally command the robot to complete a task, and the robot responds once a basic laser pointer illuminates the location of the desired action.</p><p>For instance, if a person needs an item fetched, that individual would normally command a service dog to do so and then gesture with their hands toward the location. The service robot mimics the process, with the hand gesture replaced by aiming the laser pointer at the desired item.</p><p>Employing this technology, users can accomplish basic yet challenging missions such as opening doors and drawers and retrieving medication.</p>
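<p>To make the pointing interface concrete, the sketch below shows one simple way a laser dot might be picked out of a camera frame before being handed to a fetch behavior. It is an illustration only - the thresholds, array layout and function name are invented here, not taken from the Georgia Tech system.</p><pre>
# Illustrative sketch, not the lab's code: find a bright, red-dominant
# laser dot in an RGB camera frame and return its pixel coordinates.
import numpy as np

def find_laser_dot(frame_rgb, min_red=200, max_other=120):
    """Return (row, col) of the strongest laser-like pixel, or None."""
    r = frame_rgb[:, :, 0].astype(int)
    g = frame_rgb[:, :, 1].astype(int)
    b = frame_rgb[:, :, 2].astype(int)
    mask = (r > min_red) & (g < max_other) & (b < max_other)
    if not mask.any():
        return None  # no laser dot visible in this frame
    # Among candidate pixels, pick the one where red most dominates.
    scores = np.where(mask, r - (g + b) // 2, -1)
    return np.unravel_index(np.argmax(scores), scores.shape)
</pre><p>In a pipeline like the one described above, the spoken command would select the behavior (fetch, open, deliver), and the detected pixel would seed navigation toward the designated object.</p>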
<p>"It's a road to get robots out there helping people sooner," said Professor Charlie Kemp of the Georgia Tech Department of Biomedical Engineering. "Service dogs have a great history of helping people, but there's a multi-year waiting list. It's a very expensive thing to have. We think robots will eventually help to meet those needs."</p><p>Kemp presented his findings this week at the second IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics - BioRob 2008 - in Scottsdale, Ariz. </p><p>This technology was achieved with four-legged authenticity.</p><p>Kemp and graduate student Hai Nguyen worked closely with the team of trainers at Georgia Canines for Independence (GCI) in Acworth, Ga., to research the command categories and interactions that are core to the relationship between individuals and service dogs.</p><p>Betty, a Golden Retriever, was studied to understand her movements and relationship with commands. Key to the success is Betty's ability to work with a towel attached to a drawer or door handle, which allows her to use her mouth for such actions as opening and closing. The robot was then successfully programmed to use the towel in a similar manner.</p><p>Her handlers were thrilled at the potential benefits of the technology.</p><p>"The waiting list for dogs can be five to seven years," said Ramona Nichols, executive director of Georgia Canines for Independence. "It's neat to see science happening but with a bigger cause; applying the knowledge and experience we have and really making a difference. I'm so impressed. It's going to revolutionize our industry in helping people with disabilities."</p><p>In total, the robot was able to replicate 10 tasks and commands taught to service dogs at GCI - including opening drawers and doors - with impressive efficiency. Other successes included opening a microwave oven, delivering an object and placing an item on a table.</p><p>"As robotic researchers we shouldn't just be looking at the human as an example," Kemp said. "Dogs are very capable at what they do. They have helped thousands of people throughout the years. I believe we're going to be able to achieve the capabilities of a service dog sooner than those of a human caregiver."</p><p>While the robot may not be able to mirror the personality and furry companionship of a canine, it does have other benefits.</p><p>"The robot won't require the same care and maintenance," Kemp said. "It also won't be distracted by a steak."</p>]]></body>  <author>Lisa Grovenstein</author>  <status>1</status>  <created>1224633600</created>  <gmt_created>2008-10-22 00:00:00</gmt_created>  <changed>1475895675</changed>  <gmt_changed>2016-10-08 03:01:15</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech robot mirrors the actions of service dogs.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech robot mirrors the actions of service dogs.]]></sentence>  <summary><![CDATA[Researchers at the Georgia Institute of Technology have engineered a biologically inspired robot that mirrors the actions of sought-after service dogs. 
Users verbally command the robot to complete a task, and the robot responds once a basic laser pointer illuminates the location of the desired action.]]></summary>  <dateline>2008-10-22T00:00:00-04:00</dateline>  <iso_dateline>2008-10-22T00:00:00-04:00</iso_dateline>  <gmt_dateline>2008-10-22 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Mimicking the work of expensive canines could provide a less-expensive alternative for the impaired]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[Don.fernandez@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<strong>Don Fernandez</strong><br />Communications &amp; Marketing<br /><a href="http://www.gatech.edu/contact/index.html?id=dfernandez8">Contact Don Fernandez</a>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>70864</item>      </media>  <hg_media>          <item>          <nid>70864</nid>          <type>image</type>          <title><![CDATA[media:image:8f03927b-5fe3-4cc8-b816-a90dbc6a154c]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177328</created>          <gmt_created>2015-12-03 21:15:28</gmt_created>          <changed>1475894623</changed>          <gmt_changed>2016-10-08 02:43:43</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="249"><![CDATA[Biomedical Engineering]]></keyword>          <keyword tid="1968"><![CDATA[kemp]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="170770"><![CDATA[service dogs]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="71207">  <title><![CDATA[Robots Go Where Scientists Fear to Tread]]></title>  <uid>27281</uid>  <body><![CDATA[<p>Scientists are diligently working to understand how and why the world's ice shelves are melting. 
While most of the data they need (temperatures, wind speed, humidity, radiation) can be obtained by satellite, it isn't as accurate as good old-fashioned, on-site measurement, and static ground-based weather stations don't allow scientists to collect information from as many locations as they'd like.</p><p>Unfortunately, the locations in question are volatile ice sheets, possibly cracking, shifting and filling with water - not exactly a safe environment for scientists.</p><p>To help scientists collect the more detailed data they need without putting themselves at risk, researchers at the Georgia Institute of Technology, working with Pennsylvania State University, have created specially designed robots called SnoMotes to traverse these potentially dangerous ice environments. The SnoMotes work as a team, autonomously collaborating among themselves to cover all the necessary ground to gather assigned scientific measurements. Data gathered by the SnoMotes could give scientists a better understanding of the important dynamics that influence the stability of ice sheets.</p><p>"In order to say with certainty how climate change affects the world's ice, scientists need accurate data points to validate their climate models," said Ayanna Howard, lead on the project and an associate professor in the School of Electrical and Computer Engineering at Georgia Tech. "Our goal was to create rovers that could gather more accurate data to help scientists create better climate models. It's definitely science-driven robotics."</p><p>Howard unveiled the SnoMotes at the IEEE International Conference on Robotics and Automation (ICRA) in Pasadena on May 23. The SnoMotes will also be part of an exhibit at the Chicago Museum of Science and Industry in June. The research was funded by a grant from NASA's Advanced Information Systems Technology (AIST) Program.</p><p>Howard, who previously worked with rovers at NASA's Jet Propulsion Laboratory, is working with Magnus Egerstedt, an associate professor in the School of Electrical and Computer Engineering, and Derrick Lampkin, an assistant professor in the Department of Geography at Penn State who studies ice sheets and how changes in climate contribute to changes in these large ice masses. Lampkin currently takes ice sheet measurements with satellite data and ground-based weather stations, but would prefer the more accurate, simultaneous ground measurements that efficient rovers can provide.</p><p>"The changing mass of Greenland and Antarctica represents the largest unknown in predictions of global sea-level rise over the coming decades. Given the substantial impact these structures can have on future sea levels, improved monitoring of the ice sheet mass balance is of vital concern," Lampkin said. "We're developing a scale-adaptable, autonomous, mobile climate monitoring network capable of capturing a range of vital meteorological measurements that will be employed to augment the existing network and capture multi-scale processes under-sampled by current, stationary systems."</p><p>The SnoMotes are autonomous robots and are not remote-controlled. They use cameras and sensors to navigate their environment. Though current prototype models don't include a full range of sensors, the robots will eventually be equipped with all the sensors and instruments needed to take measurements specified by the scientist.</p><p>While Howard's team works on versatile robots with the mobility and artificial intelligence (A.I.) 
skills to complete missions, Lampkin's team will be creating a sensor package for later versions of Howard's rovers.</p><p>Here's how the SnoMotes will work when they're ready for their glacial missions: The scientist will select a location for investigation and decide on a safe 'base camp' from which to release the SnoMotes. The SnoMotes will then be programmed with their assigned coverage area and requested measurements. The researcher will monitor the SnoMotes' progress and even reassign locations and data collection remotely from the camp as necessary.</p><p>When Howard's research team first set out to build a rover designed to capture environmental data from the field, it took a few tries to come up with an effectively hardy design. The group's first rover was delicate and ineffective. But after an initial failure, they decided to move on to something designed for consistent abuse - a toy. Instead of building yet another expensive prototype, Howard opted to start with a sturdy kit snowmobile, already primed for snow conditions and designed for heavy use by a child.</p><p>Howard's group then installed a camera and all necessary computing and sensor equipment inside the 2-foot-long, 1-foot-wide snowmobile. The result was a sturdy but inexpensive rover.</p><p>By using existing kits and adding a few extras like sensors, circuits, A.I. and a camera, the team was able to create an expendable rover that wouldn't break a research team's bank if it were lost during an experiment, Howard said. Similar rovers under development at other universities are much more expensive, and the cost of sending several units to canvass an area would likely be cost-prohibitive for most researchers, she added.</p><p>The first phase of the project is focused primarily on testing the mobility and communications capabilities of the SnoMote rovers. Later versions of the rovers will include a more developed sensor package and will be larger.</p><p>The team has created three working SnoMote models so far, but as many SnoMotes as necessary can work together on a mission, Howard said.</p><p>The SnoMote represents two key innovations in rovers: a new method for communicating location and work assignments between robots, and a way of maneuvering in ice conditions.</p><p>Once placed on site, the robots place themselves at strategic locations to make sure all the assigned ground is covered. Howard and her team are testing two different methods that allow the robots to decide among themselves which positions they will take to get all the necessary measurements.</p><p>The first is an 'auction' system that lets the robots 'bid' on a desired location, based on their proximity to the location (as they move) and how well their instruments are working or whether they have the necessary instrument (one may have a damaged wind sensor or another may have low battery power).</p><p>The second method is more mathematical, fixing the robots to certain positions in a net of sorts that is then stretched to fit the targeted location. Magnus Egerstedt is working with Howard on this work allocation method.</p>
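<p>As a rough illustration of the auction idea described above, consider the sketch below, in which each robot bids its travel distance inflated by a battery penalty and each site goes to the lowest bidder. The cost model and names are invented for this example; the article does not specify the team's actual bidding rules.</p><pre>
# Illustrative auction-style task allocation, not the team's algorithm:
# each measurement site is awarded to the robot submitting the lowest bid.
import math

def bid(robot, site):
    if not robot["sensor_ok"]:
        return float("inf")            # a broken instrument means no bid
    dist = math.dist(robot["pos"], site)
    return dist / robot["battery"]     # a weak battery inflates the bid

def run_auction(robots, sites):
    return {site: min(robots, key=lambda r: bid(r, site))["name"]
            for site in sites}

robots = [
    {"name": "snomote-1", "pos": (0, 0), "battery": 0.9, "sensor_ok": True},
    {"name": "snomote-2", "pos": (40, 10), "battery": 0.4, "sensor_ok": True},
]
print(run_auction(robots, [(5, 5), (35, 12)]))
# {(5, 5): 'snomote-1', (35, 12): 'snomote-2'}
</pre><p>The second, net-stretching method would instead treat the positions as a coupled optimization - roughly, a coverage-control problem - placing all the robots jointly rather than awarding one site at a time.</p>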
<p>In addition to location assignments, another key innovation of the SnoMote is its ability to find its way in snow conditions. While most rovers can use rocks or other landmarks to guide their movement, snow conditions present an added challenge by depriving the guidance systems of topographic and color cues (everything is white).</p><p>For snow conditions, one of Howard's students discovered that the lines formed by snow banks could serve as markers to help the SnoMote track distance traveled, speed and direction. The SnoMote could also navigate via GPS if snow bank visuals aren't available.</p><p>While the SnoMotes are expected to pass their first real field test in Alaska next month, a hardier, more cold-resistant version will be needed for the Antarctic and other well-below-zero climates, Howard said. These new rovers would include a heater to keep circuitry warm enough to function and a sturdy plastic exterior that wouldn't become brittle in extreme cold.</p>]]></body>  <author>Lisa Grovenstein</author>  <status>1</status>  <created>1211846400</created>  <gmt_created>2008-05-27 00:00:00</gmt_created>  <changed>1475895670</changed>  <gmt_changed>2016-10-08 03:01:10</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Rovers traverse dangerous ice environments]]></teaser>  <type>news</type>  <sentence><![CDATA[Rovers traverse dangerous ice environments]]></sentence>  <summary><![CDATA[Researchers at the Georgia Institute of Technology have created specially designed robots called SnoMotes to traverse potentially dangerous ice environments. The SnoMotes work as a team, autonomously collaborating among themselves to gather data that could give scientists a better understanding of the important dynamics that influence the stability of ice sheets.]]></summary>  <dateline>2008-05-27T00:00:00-04:00</dateline>  <iso_dateline>2008-05-27T00:00:00-04:00</iso_dateline>  <gmt_dateline>2008-05-27 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[lisa.grovenstein@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<strong>Lisa Grovenstein</strong><br />Communications &amp; Marketing<br /><a href="http://www.gatech.edu/contact/index.html?id=lgrovenste3">Contact Lisa Grovenstein</a><br /><strong>404-894-8835</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>71208</item>          <item>71209</item>          <item>71210</item>      </media>  <hg_media>          <item>          <nid>71208</nid>          <type>image</type>          <title><![CDATA[Ayanna Howard]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177358</created>          <gmt_created>2015-12-03 21:15:58</gmt_created>          <changed>1475894630</changed>          <gmt_changed>2016-10-08 02:43:50</gmt_changed>      </item>          <item>          <nid>71209</nid>          <type>image</type>          <title><![CDATA[Ayanna and the SnoMote]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177358</created>          <gmt_created>2015-12-03 21:15:58</gmt_created>          <changed>1475894632</changed>          <gmt_changed>2016-10-08 02:43:52</gmt_changed>  
    </item>          <item>          <nid>71210</nid>          <type>image</type>          <title><![CDATA[SnoMote]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177358</created>          <gmt_created>2015-12-03 21:15:58</gmt_created>          <changed>1475894632</changed>          <gmt_changed>2016-10-08 02:43:52</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.ece.gatech.edu/faculty-staff/fac_profiles/bio.php?id=135]]></url>        <title><![CDATA[Profile]]></title>      </link>          <link>        <url><![CDATA[http://www.ece.gatech.edu/]]></url>        <title><![CDATA[School of Electrical and Computer Engineering]]></title>      </link>          <link>        <url><![CDATA[http://humanslab.ece.gatech.edu/]]></url>        <title><![CDATA[Human-Automation Systems Lab (HumAnS)]]></title>      </link>          <link>        <url><![CDATA[http://www.geog.psu.edu/people/lampkin/]]></url>        <title><![CDATA[Dr. Derrick Lampkin]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="154"><![CDATA[Environment]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="825"><![CDATA[Ayanna Howard]]></keyword>          <keyword tid="1925"><![CDATA[Electrical and Computer Engineering]]></keyword>          <keyword tid="2090"><![CDATA[Lampkin]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="170766"><![CDATA[SnoMote]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="71445">  <title><![CDATA[Tech Offers First Interdisciplinary Robotics Ph.D.]]></title>  <uid>27310</uid>  <body><![CDATA[<p>The Colleges of Computing and Engineering at Georgia Tech today announced the nation's first interdisciplinary doctoral degree in robotics. The program, which starts fall semester of 2008, was developed through Georgia Tech's Center for Robotics and Intelligent Machines (RIM@Georgia Tech), a collaborative research center that combines the educational strength and expertise of both units. 
Reaching across disciplines and drawing from curricula in computer science, electrical and computer engineering, aerospace engineering, biomedical engineering and mechanical engineering, the doctoral degree is designed to educate a new breed of multidisciplinary researchers who will enter the market best prepared to chart a new course for robotics in the United States. </p><p>"We are pleased to offer the first truly interdisciplinary robotics Ph.D. program in the country," said Dr. Henrik Christensen, KUKA Chair of Robotics for the College of Computing at Georgia Tech. "Exposing our students to course work from multiple disciplines early on prepares them to think about robotics from a holistic approach once they enter the workforce. True to our mission in robotics at Georgia Tech, our program will recruit and educate outstanding students who will provide leadership in a world that is increasingly dependent on technology."</p><p>According to robotics industry associations in North America and Japan, the global robotics market is expected to significantly expand over the next five years, including gains in both the service and personal robotics fields. With a focus on personal and everyday robotics, as well as the future of automation, faculty involved with RIM@Georgia Tech developed the doctoral degree program to best enable students to understand and drive the future role of robotics in society and industry. Approximately 15 candidates per year are expected to be admitted, gradually building the program to 60 enrolled students. </p><p>"Over the next five to ten years, robotics technologies will become more integrated throughout various industries that directly impact human activity and culture, such as healthcare, food processing, logistics and others," said Dr. Christensen. "At Georgia Tech, our doctoral students will be guided through their research by at least two faculty members from distinct participating schools, providing more insight and expertise into a specific industry sector or focus area."</p><p>Students in the Robotics Ph.D. program must first be admitted to one of the participating academic units, subsequently designated as the student's home unit. Students will then progress through the course requirements, consisting of 36 semester hours of core research and elective courses, the passing of a comprehensive qualifying exam with written and oral components, and the successful completion, documentation and defense of a piece of original research culminating in a doctoral thesis. </p><p>Over 30 faculty members from the schools of Interactive Computing, Mechanical Engineering, Aerospace Engineering, Electrical and Computer Engineering, and Biomedical Engineering are affiliated with this new Ph.D. program. Faculty involved in the development of the new doctoral program include Henrik Christensen (College of Computing), Frank Dellaert (College of Computing), Eric Johnson (School of Aerospace Engineering), Ayanna Howard (School of Electrical and Computer Engineering), Steve DeWeerth (Department of Biomedical Engineering), and Harvey Lipkin (School of Mechanical Engineering).</p><p><strong>About Robotics &amp; Intelligent Machines at Georgia Tech (RIM@GT)</strong><br />The Center for Robotics and Intelligent Machines (RIM@Georgia Tech) leverages the strengths and resources of Georgia Tech in robotics education, research, and leadership by reaching across traditional boundaries to embrace a multidisciplinary approach. 
The College of Computing, College of Engineering and the Georgia Tech Research Institute play key, complementary roles through Tech's traditional expertise in interactive and intelligent computing, control, and mechanical engineering. Emphasizing personal and everyday robotics as well as the future of automation, faculty involved with RIM@Georgia Tech help students understand and define the future role of robotics in society. <a href="http://www.robotics.gatech.edu" title="www.robotics.gatech.edu">www.robotics.gatech.edu</a></p><p><strong>About the College of Engineering at Georgia Tech</strong><br />The College of Engineering at Georgia Tech is the largest engineering program in the U.S. and ranked 4th among the country's best graduate programs by U.S. News and World Report. A respected leader in interdisciplinary research and education, the College of Engineering grants the highest number of engineering degrees in the nation across nine fields of study. For more information about the programs in the College of Engineering, please visit <a href="http://www.coe.gatech.edu" title="www.coe.gatech.edu">www.coe.gatech.edu</a>.</p><p><strong>About the College of Computing at Georgia Tech</strong><br />The College of Computing at Georgia Tech is a national leader in the creation of real-world computing breakthroughs that drive social and scientific progress. With its graduate program ranked 11th nationally by U.S. News and World Report, the College's unconventional approach to education is defining the new face of computing by expanding the horizons of traditional computer science students through interdisciplinary collaboration and a focus on human-centered solutions. For more information about the College of Computing at Georgia Tech, its academic divisions and research centers, please visit <a href="http://www.cc.gatech.edu" title="www.cc.gatech.edu">www.cc.gatech.edu</a>.</p>]]></body>  <author>David Terraso</author>  <status>1</status>  <created>1201654800</created>  <gmt_created>2008-01-30 01:00:00</gmt_created>  <changed>1475895670</changed>  <gmt_changed>2016-10-08 03:01:10</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Program to start in Fall 2008]]></teaser>  <type>news</type>  <sentence><![CDATA[Program to start in Fall 2008]]></sentence>  <summary><![CDATA[The Colleges of Computing and Engineering at Georgia Tech announced the nation's first interdisciplinary doctoral degree in robotics. 
The program starts fall semester of 2008 and was developed through Georgia Tech's Center for Robotics and Intelligent Machines (RIM@Georgia Tech).]]></summary>  <dateline>2008-02-08T00:00:00-05:00</dateline>  <iso_dateline>2008-02-08T00:00:00-05:00</iso_dateline>  <gmt_dateline>2008-02-08 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[press@robocup-us.org]]></email>  <location></location>  <contact><![CDATA[<strong>Rebecca Biggs</strong><br />GCI Group<br /><a href="mailto:press@robocup-us.org">Contact Rebecca Biggs</a><br /><strong>404-260-3510</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>71446</item>      </media>  <hg_media>          <item>          <nid>71446</nid>          <type>image</type>          <title><![CDATA[Rescue Robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177376</created>          <gmt_created>2015-12-03 21:16:16</gmt_created>          <changed>1475894637</changed>          <gmt_changed>2016-10-08 02:43:57</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://humanslab.ece.gatech.edu/]]></url>        <title><![CDATA[Human-Automation Systems Lab (HumAnS)]]></title>      </link>          <link>        <url><![CDATA[http://www.imdl.gatech.edu/]]></url>        <title><![CDATA[Intelligent Machine Dynamics]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/news/robot-ethics-proposal-funded-by-dod]]></url>        <title><![CDATA[Robot Ethics]]></title>      </link>          <link>        <url><![CDATA[http://www.roboteducation.org/]]></url>        <title><![CDATA[Institute for Personal Robots in Education]]></title>      </link>          <link>        <url><![CDATA[http://www.robotics.gatech.edu/]]></url>        <title><![CDATA[Robotics at Georgia Tech]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="208"><![CDATA[computing]]></keyword>          <keyword tid="2212"><![CDATA[Doctoral]]></keyword>          <keyword tid="1096"><![CDATA[Ph.D.]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="71872">  <title><![CDATA[Simulation Reveals How Body Repairs Balance]]></title>  <uid>27281</uid>  <body><![CDATA[<p>Your body goes to a lot of 
trouble to make sure you stay upright. But when the brain's neural pathways are impaired through injury, age or illness, muscles are deprived of the detailed sensory information they need to perform the constant yet delicate balancing act required for normal movement and standing.</p><p>With an eye towards building robots that can balance like humans, researchers at Georgia Tech and Emory University have created a computer simulation that sheds new light on how the nervous system reinvents its communication with muscles after sensory loss. The findings could someday be used to better diagnose and rehabilitate patients with balance problems (through normal aging or diseases such as multiple sclerosis or Parkinson's) by retraining their muscles and improving overall balance. The research will be published in the October issue of Nature Neuroscience. </p><p>"The ultimate goal of rehabilitation is for patients to find the best way to adapt to their particular deficit. This system may help predict what the optimum combination of muscle and nerve activity looks like for each patient, helping patients and doctors set realistic goals and speeding recovery," said Lena Ting, lead researcher on the project and an assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University. </p><p>In a body without balance impairment, the nervous system collects sensory information from all over the body (skin, ears, feet, arms, eyes, etc.) and transmits this information to the muscles that control balance. When that information changes through the introduction of something like a strong wind, a raised crack in the pavement or an accidental bump from a nearby stranger, the nervous system sends the new information to the muscles and they adjust accordingly to maintain the body's balance.</p><p>Impairments and injuries to the nervous system, or to the senses that report to it (as with a loss of vision or touch, or problems in the inner ear), lead to balance problems. Experts traditionally have had little understanding of how the nervous system's communication with the muscles associated with balance changes when one or several pieces of necessary sensory information are missing.</p><p>Georgia Tech and Emory researchers set out to create an effective way to interpret how commands from the nervous system to muscles (measured through electrical signals in the muscles) are changed by sensory impairment - similar to the numbing of feet experienced by diabetes patients - and how these changes affect balance control. The team started with data sets from animals. They were able to determine that, after a period of rehabilitation, subjects with some sensory damage were able to regain their balance despite the loss of some sensory information. So how do the nervous system and muscles fill in the information gaps?</p><p>The Georgia Tech and Emory team hypothesized that the nervous system relies on the relationship between the body's center of gravity and its environment to control balance. They reasoned that the best predictor of how muscles would be activated when the subject experienced a balance threat was not the motion of the individual body parts, but the horizontal motion of the body's center of gravity.</p><p>To test their theory, the researchers created a computer simulation that could accurately simulate standing balance and muscle reactions to balance disturbances by focusing on the relation of the subject's center of gravity to the ground. Rather than predicting neural control patterns for the multitude of sensory information processed by the body to maintain balance, the team instead tracked a small set of signals related to the body's control of its center of gravity.</p>
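<p>As a toy illustration of that idea - with invented gains and delay rather than the study's fitted values - balance-correcting muscle activity can be reconstructed as a weighted, time-delayed sum of the center of gravity's displacement, velocity and acceleration:</p><pre>
# Toy sketch of delayed center-of-mass feedback; the gains (kp, kv, ka)
# and the neural delay are illustrative, not values from the paper.
import numpy as np

def muscle_activity(pos, vel, acc, dt, kp=2.0, kv=1.0, ka=0.3, delay=0.1):
    lag = max(1, int(round(delay / dt)))   # neural transmission delay, in samples
    act = np.zeros_like(pos)
    act[lag:] = kp * pos[:-lag] + kv * vel[:-lag] + ka * acc[:-lag]
    return np.clip(act, 0.0, None)         # muscles pull, never push

# Example: a brief platform perturbation accelerates the center of mass.
t = np.arange(0.0, 2.0, 0.01)
acc = np.where((t > 0.5) & (t < 0.7), 1.0, 0.0)
vel = np.cumsum(acc) * 0.01
pos = np.cumsum(vel) * 0.01
activity = muscle_activity(pos, vel, acc, dt=0.01)
</pre><p>The appeal of such a model is economy: a few delayed center-of-gravity signals stand in for the full flood of raw sensory data the body actually processes.</p>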
<p>The Georgia Tech and Emory team determined that subjects who had impaired sensory information were slowly using new sensory pathways to track the motion of the body's center of gravity, compensating for the loss of information from the damaged sensory pathways. In effect, the subjects' muscles were using different neural information to perform the same balance tasks, resulting in muscle activity patterns that looked 'abnormal,' but that were actually similar to the predicted optimum.</p><p>The research team is now testing its center of gravity simulation with human subjects and a small robot with simulated muscles. They predict that the simulation could recognize impairment and pinpoint the optimum recovery points for each sensory-impaired subject - all based on the body's reliance on center of gravity information. When applied to a robot, these neural communication patterns allowed it to move fluidly, like an animal, despite its gears and motors. The robot demonstrates all of the different strategies that could be used by healthy subjects and patients with sensory loss.</p><p>"This finding will change the way we approach rehabilitation," Ting said. "We can't expect patients to mimic normal balance performance when they're using a different set of sensory information. Instead, our work can help identify the best performance possible given a patient's level and type of sensory impairment."</p>]]></body>  <author>Lisa Grovenstein</author>  <status>1</status>  <created>1190678400</created>  <gmt_created>2007-09-25 00:00:00</gmt_created>  <changed>1475895665</changed>  <gmt_changed>2016-10-08 03:01:05</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Could lead to better rehabilitation, robot balance]]></teaser>  <type>news</type>  <sentence><![CDATA[Could lead to better rehabilitation, robot balance]]></sentence>  <summary><![CDATA[Georgia Tech and Emory researchers have created a computer simulation that sheds new light on how the nervous system reinvents its communication with muscles after sensory loss. 
The findings could someday be used to better diagnose and rehabilitate patients with balance problems by retraining their muscles and improving overall balance.]]></summary>  <dateline>2007-09-25T00:00:00-04:00</dateline>  <iso_dateline>2007-09-25T00:00:00-04:00</iso_dateline>  <gmt_dateline>2007-09-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Researchers design simulation that could be used to better rehabilitate patients with balance problems, build robots with better balance]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[lisa.grovenstein@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<strong>Lisa Grovenstein</strong><br />Communications &amp; Marketing<br /><a href="http://www.gatech.edu/contact/index.html?id=lgrovenste3">Contact Lisa Grovenstein</a><br /><strong>404-894-8835</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>71873</item>          <item>71874</item>      </media>  <hg_media>          <item>          <nid>71873</nid>          <type>image</type>          <title><![CDATA[balance]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177414</created>          <gmt_created>2015-12-03 21:16:54</gmt_created>          <changed>1475894644</changed>          <gmt_changed>2016-10-08 02:44:04</gmt_changed>      </item>          <item>          <nid>71874</nid>          <type>image</type>          <title><![CDATA[Ting and Chvatl]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177414</created>          <gmt_created>2015-12-03 21:16:54</gmt_created>          <changed>1475894644</changed>          <gmt_changed>2016-10-08 02:44:04</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.neuro.gatech.edu/groups/ting/index.html]]></url>        <title><![CDATA[Lena Ting's Neuromechanics Lab]]></title>      </link>          <link>        <url><![CDATA[http://www.bme.gatech.edu/]]></url>        <title><![CDATA[Wallace H. 
Coulter Department of Biomedical Engineering]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="2265"><![CDATA[balance]]></keyword>          <keyword tid="2266"><![CDATA[Lena Ting]]></keyword>          <keyword tid="2267"><![CDATA[multiple sclerosis]]></keyword>          <keyword tid="2268"><![CDATA[nervous system]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="71994">  <title><![CDATA[Ga. Tech Sting Racing Team Selected as Finalist]]></title>  <uid>27281</uid>  <body><![CDATA[<p>Georgia Tech's College of Computing today announced that the Sting Racing team competing in the Defense Advanced Research Projects Agency's (DARPA) Urban Challenge has passed its site visit and is one of 36 teams judged technologically capable of competing in the final round. The team's autonomous vehicle, Sting 1, successfully completed all four tests during its capabilities evaluation on June 18, taking it into the next stage in this two-year competition among leading research and technology universities in the United States.</p><p>"As a first year competitor in the Urban Challenge, qualifying for the semi-final round is a major accomplishment and testament to the passion and dedication of our team," said Dr. Henrik Christensen, KUKA Chair of Robotics for the College of Computing at Georgia Tech and Principal Investigator for Sting Racing. "Our robotics program at Georgia Tech is relatively new, but the progress we have shown over a short period of time has positioned us among the best in the nation."</p><p>During the visit, DARPA personnel assessed the ability of the autonomous vehicle to perform tasks and operate safely. Sting was evaluated on its ability to navigate a test course that included a four-way intersection and moving traffic. This evaluation covered a subset of the challenges that the robotic vehicles will face on the final Urban Challenge course, including merging into moving traffic, navigating traffic circles, negotiating busy intersections and avoiding obstacles.</p><p>Sting Racing, a collaboration among Georgia Tech's College of Computing and College of Engineering, the Georgia Tech Research Institute and SAIC, selected a Porsche Cayenne, designated Sting 1, as the base vehicle for its entry in the Urban Challenge. For nearly a year the members of the Sting Racing team have been working to program the robot to drive autonomously by staying on course and recognizing obstacles in its way, such as other cars.</p><p>"We have put in a lot of long hours over the past year preparing Sting 1 for this site visit - the first major trial in the Urban Challenge," noted Matt Powers, a student at Georgia Tech and member of the Sting Racing team. 
"So passing all four tests during the site visit was extremely rewarding. We look forward now to making it all the way to the finals."</p><p>DARPA uses the site visit evaluation to select the competition's semi-finalists - the top 36 teams that will participate in the National Qualification Event (NQE), an exercise to demonstrate the safety of the vehicles on October 21-31. Earlier this afternoon, DARPA announced the other semi-finalists as well as the location of the NQE and Urban Challenge - the former George Air Force Base in Victorville, California. </p><p>The Urban Challenge is the third in a series of DARPA-sponsored competitions to foster the development of robotic ground vehicle technology without a human operator, designed for use on the battlefield. The Urban Challenge, set for November 3, 2007, will feature autonomous ground vehicles executing simulated military supply missions safely and effectively in a mock urban area. Safe operation in traffic is essential to U.S. military plans to use autonomous ground vehicles to conduct important missions and keep American personnel out of harm's way. DARPA will award $2 million, $1 million and $500,000 awards to the top three finishers that complete the course within the six-hour time limit.</p><p>The Sting 1 Porsche Cayenne is available for media demonstrations. For more information, visit <a href="http://www.sting-racing.org" title="www.sting-racing.org">www.sting-racing.org</a>.</p>]]></body>  <author>Lisa Grovenstein</author>  <status>1</status>  <created>1186617600</created>  <gmt_created>2007-08-09 00:00:00</gmt_created>  <changed>1475895665</changed>  <gmt_changed>2016-10-08 03:01:05</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Team Passes Site Visit and Heads to Finals in Fall]]></teaser>  <type>news</type>  <sentence><![CDATA[Team Passes Site Visit and Heads to Finals in Fall]]></sentence>  <summary><![CDATA[The Sting Racing team will be one of 36 teams competing in the Defense Advanced Research Project Agency's (DARPA) Urban Challenge this fall.]]></summary>  <dateline>2007-08-09T00:00:00-04:00</dateline>  <iso_dateline>2007-08-09T00:00:00-04:00</iso_dateline>  <gmt_dateline>2007-08-09 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[rbiggs@gcigroup.com]]></email>  <location></location>  <contact><![CDATA[<strong>Becky Biggs</strong><br />GCI Atlanta<br /><a href="http://www.gatech.edu/contact/index.html?id=0">Contact Becky Biggs</a><br /><strong>404-260-3510</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>      </media>  <hg_media>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.darpa.mil/grandchallenge/index.asp]]></url>        <title><![CDATA[DARPA]]></title>      </link>          <link>        <url><![CDATA[http://www.coc.gatech.edu/]]></url>        <title><![CDATA[College of Computing]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      
</news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="690"><![CDATA[darpa]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="170760"><![CDATA[Sting]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="72050">  <title><![CDATA[Robots from 37 Countries Clash at RoboCup 2007]]></title>  <uid>27310</uid>  <body><![CDATA[<p>Nearly 300 teams from 37 countries are gearing up to compete at RoboCup 2007 Atlanta, the world's most renowned competition for research robotics, at the Georgia Institute of Technology July 3-10.</p><p>"One of RoboCup's great strengths is its diverse international flavor," said Tucker Balch,  Georgia Tech College of Computing associate professor and RoboCup 2007 Atlanta general chair. "We are able to get people together from many countries and backgrounds to share our research and ideas for making robots more effective."</p><p>China, Japan, Iran, Israel, Germany and Brazil are just a few of the countries being represented at the robotics showcase. In all, approximately 1,700 students and faculty from leading universities, high schools, middle schools and elementary schools will compete in events ranging from four-legged and humanoid robotic soccer games to search-and-rescue competitions. This year's event features a demonstration of the Nanogram League, a competition between microscopic robots. </p><p>This year's contest also marks the first time since 2001 that RoboCup has been held on a university campus. </p><p>"RoboCup has an ambitious goal -- namely to field a robot soccer team that can defeat the human world champions by 2050.  This goal is meant to drive robotics research and education forward faster, and nearly all RoboCup participants come from research universities," said Balch. "So, it makes perfect sense that RoboCup should return to its roots on a university campus."</p><p>RoboCup 2007 Atlanta invites interested media to register online to attend and receive updates at <a href="http://www.robocup-us.org/press/" title="www.robocup-us.org/press/">www.robocup-us.org/press/</a>.</p><p>KUKA Robotics Corporation, a leading global manufacturer of industrial robots, is the event's premier sponsor. Other major sponsors include Microsoft, CITIZEN, Lockheed Martin and the National Science Foundation. </p><p>This summer is Robot Summer at Georgia Tech. 
In addition to RoboCup 2007 Atlanta, Georgia Tech will also host several other robotics-related events, including the Robotics: Science and Systems (RSS) conference and an International Aerial Robotics Competition.</p><p><strong>Countries represented at RoboCup 2007 Atlanta:</strong><br />Australia<br />Austria<br />Brazil<br />Bulgaria<br />Canada<br />Chile<br />China<br />Colombia<br />Costa Rica<br />Finland<br />Germany<br />Greece<br />Hungary<br />India<br />Iran<br />Ireland<br />Israel<br />Italy<br />Japan<br />Mexico<br />Netherlands<br />New Zealand<br />Norway<br />Portugal<br />Romania<br />Saudi Arabia<br />Singapore<br />Slovakia<br />Spain<br />Sweden<br />Switzerland<br />Taiwan<br />Thailand<br />Turkey<br />United Arab Emirates<br />United Kingdom<br />United States</p>]]></body>  <author>David Terraso</author>  <status>1</status>  <created>1181692800</created>  <gmt_created>2007-06-13 00:00:00</gmt_created>  <changed>1475895650</changed>  <gmt_changed>2016-10-08 03:00:50</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[July 3-10]]></teaser>  <type>news</type>  <sentence><![CDATA[July 3-10]]></sentence>  <summary><![CDATA[Nearly 300 teams from 37 countries are gearing up to compete at RoboCup 2007 Atlanta, the world's most renowned competition for research robotics, at the Georgia Institute of Technology July 3-10.]]></summary>  <dateline>2007-06-13T00:00:00-04:00</dateline>  <iso_dateline>2007-06-13T00:00:00-04:00</iso_dateline>  <gmt_dateline>2007-06-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Soccer-Playing, Search-and-Rescue Robots and Nanobots Featured in World's Largest Research Robotics Competition Next Month]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[press@robocup-us.org]]></email>  <location></location>  <contact><![CDATA[<strong>Rebecca Biggs</strong><br />GCI Group<br /><a href="mailto:press@robocup-us.org">Contact Rebecca Biggs</a><br /><strong>404-260-3510</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>72051</item>      </media>  <hg_media>          <item>          <nid>72051</nid>          <type>image</type>          <title><![CDATA[Robot Dog]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177434</created>          <gmt_created>2015-12-03 21:17:14</gmt_created>          <changed>1475894649</changed>          <gmt_changed>2016-10-08 02:44:09</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.robocup-us.org/press/]]></url>        <title><![CDATA[RoboCup Press Registration]]></title>      </link>          <link>        <url><![CDATA[http://www.kukarobotics.com/]]></url>        <title><![CDATA[KUKA Robotics]]></title>      </link>          <link>        <url><![CDATA[http://www.robotics.gatech.edu/]]></url>        <title><![CDATA[Robotics at Georgia Tech]]></title>      </link>          <link>        <url><![CDATA[http://www.robocup.org/]]></url>        <title><![CDATA[RoboCup]]></title>      </link>          <link>        <url><![CDATA[http://www.robocup-us.org/]]></url>        <title><![CDATA[RoboCup 2007 Atlanta]]></title>      </link>      </related> 
 <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="133"><![CDATA[Special Events and Guest Speakers]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="149"><![CDATA[Nanotechnology and Nanoscience]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="133"><![CDATA[Special Events and Guest Speakers]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="149"><![CDATA[Nanotechnology and Nanoscience]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="2355"><![CDATA[balch]]></keyword>          <keyword tid="2029"><![CDATA[Competition]]></keyword>          <keyword tid="2286"><![CDATA[nano]]></keyword>          <keyword tid="2353"><![CDATA[robocup]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>          <keyword tid="2354"><![CDATA[tucker]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="72174">  <title><![CDATA[Counting Down to RoboCup 2007 Atlanta]]></title>  <uid>27310</uid>  <body><![CDATA[<p>The countdown begins for RoboCup 2007 Atlanta. The world's most renowned competition for research robotics, RoboCup 2007 Atlanta will be held at Georgia Tech July 3-10. Approximately 2,000 students and faculty from leading universities, high schools and middle schools from more than 20 countries will descend on Tech's campus to participate in events ranging from four-legged and humanoid robotic soccer games to search-and-rescue competitions. This year features a demonstration of the Nanogram League, a competition between microscopic robots. KUKA Robotics Corporation, a leading global manufacturer of industrial robots, is the event's premier sponsor.</p><p>"As an emerging global leader in robotics research and innovation, Georgia Tech is pleased to host RoboCup 2007," said Tucker Balch, Georgia Tech College of Computing associate professor and RoboCup 2007 Atlanta general chair. "We welcome the international robotics community to our campus and look forward to the exciting competition."</p><p>RoboCup 2007 Atlanta invites interested media to register online to attend and receive updates at <a href="http://www.robocup-us.org/press/" title="www.robocup-us.org/press/">www.robocup-us.org/press/</a> .</p><p>Other major sponsors include CITIZEN, Lockheed Martin, Microsoft and the National Science Foundation. </p><p>This summer is Robot Summer at Georgia Tech. In addition to RoboCup 2007 Atlanta, Georgia Tech will also host several other robotics-related events, including the Robotics: Science and Systems (RSS) conference and an International Aerial Robotics Competition. 
</p><p><strong>RoboCup 2007 Atlanta Schedule:</strong><br />July 3: RoboCup Opening Ceremony<br />July 3-6: RoboCup Qualifying Competitions<br />July 7-8: RoboCup Finals<br />July 9-10: RoboCup Symposium</p><p><strong>About RoboCup:</strong><br />RoboCup is an international research and education initiative. Its goal is to foster artificial intelligence and robotics research by providing a standard problem where a wide range of technologies can be examined and integrated. The concept of soccer-playing robots was first introduced in 1993. In July 1997, the first official conference and games were held in Nagoya, Japan, followed by Paris, Stockholm, Melbourne, Seattle, Fukuoka/Busan, Padua, Lisbon, Osaka, and Bremen. This year, the 11th RoboCup competition and symposium are being held in Atlanta, Georgia. For more details about RoboCup 2007, including participants and an updated schedule, visit <a href="http://www.robocup-us.org/" title="http://www.robocup-us.org/">http://www.robocup-us.org/</a>.</p>]]></body>  <author>David Terraso</author>  <status>1</status>  <created>1178496000</created>  <gmt_created>2007-05-07 00:00:00</gmt_created>  <changed>1475895650</changed>  <gmt_changed>2016-10-08 03:00:50</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Tech to host world's largest robotics competition]]></teaser>  <type>news</type>  <sentence><![CDATA[Tech to host world's largest robotics competition]]></sentence>  <summary><![CDATA[The countdown begins for RoboCup 2007 Atlanta. The world's most renowned competition for research robotics, RoboCup 2007 Atlanta will be held at Georgia Tech July 3-10.]]></summary>  <dateline>2007-05-09T00:00:00-04:00</dateline>  <iso_dateline>2007-05-09T00:00:00-04:00</iso_dateline>  <gmt_dateline>2007-05-09 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Soccer-Playing and Search-and-Rescue Robots to Compete in World's Largest Robotics Competition in July]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[press@robocup-us.org]]></email>  <location></location>  <contact><![CDATA[<strong>Rebecca Biggs</strong><br />GCI Group<br /><a href="mailto:press@robocup-us.org">Contact Rebecca Biggs</a><br /><strong>404-260-3510</strong>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>72175</item>      </media>  <hg_media>          <item>          <nid>72175</nid>          <type>image</type>          <title><![CDATA[RoboCup]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177434</created>          <gmt_created>2015-12-03 21:17:14</gmt_created>          <changed>1475894651</changed>          <gmt_changed>2016-10-08 02:44:11</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.robocup-us.org/press/]]></url>        <title><![CDATA[RoboCup Press Registration]]></title>      </link>          <link>        <url><![CDATA[http://www.kukarobotics.com/]]></url>        <title><![CDATA[KUKA Robotics]]></title>      </link>          <link>        <url><![CDATA[http://www.robotics.gatech.edu/]]></url>        <title><![CDATA[Robotics at Georgia Tech]]></title>      </link>          <link>
<url><![CDATA[http://www.robocup.org/]]></url>        <title><![CDATA[RoboCup]]></title>      </link>          <link>        <url><![CDATA[http://www.robocup-us.org/]]></url>        <title><![CDATA[RoboCup 2007 Atlanta]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>          <category tid="133"><![CDATA[Special Events and Guest Speakers]]></category>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="133"><![CDATA[Special Events and Guest Speakers]]></term>          <term tid="153"><![CDATA[Computer Science/Information Technology and Security]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="439"><![CDATA[computer]]></keyword>          <keyword tid="208"><![CDATA[computing]]></keyword>          <keyword tid="2353"><![CDATA[robocup]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="167723"><![CDATA[soccer]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node></nodes>