<nodes> <node id="485281">  <title><![CDATA[Welcome to the Robot Zoo]]></title>  <uid>28075</uid>  <body><![CDATA[<p>Life is good for Georgia Tech's roboticists.</p><p>Buoyed by growing interest in the field, the Institute's robotics research has earned accolades around the world, and a few robots have become stars themselves. (You’ve probably seen coverage of Ayanna Howard’s math-tutor bot or Magnus Egerstedt’s dancing humanoids in your Facebook feed.)</p><p>But robots are expensive, and not every aspiring engineer can work in the gilded labs of Georgia Tech. And that’s where Egerstedt, the Schlumberger Professor in the School of Electrical and Computer Engineering, comes in.</p><p>About a year ago, he had an idea: What if he could break the barriers that keep people out of his field by building a robotics playground for everyone? He mulled over the logistics and, after persuading a few professors and Ph.D. students to join him, he planned the Robotarium.</p><p>If all goes according to his designs, the Robotarium will become Georgia Tech’s robot zoo, a home to machines of all shapes and sizes. They’ll be accessible to anyone in the world, which means remote users will be able to upload their own code, run their own experiments, and test their own ideas.</p><p>Sound extreme? It is. But Egerstedt never lets that stand in his way.</p><p>“This is going to go big,” he promises.</p><p><strong>Tearing Down the Wall</strong><br />The possibilities stretch in every direction. If Egerstedt can fill his menagerie with a diverse collection of machines, the Robotarium could become a lab for both basic tests and high-level research. That means, Egerstedt says, that the project might entice everyone from middle school science students to professors like him and his collaborators.</p><p>Aaron Ames, an associate professor in ECE and the Woodruff School of Mechanical Engineering, is one of those collaborators. 
Like Egerstedt, he’s frustrated that so few people have access to pricey hardware – the linchpin behind most robotics research.</p><p>“That’s the wall that prevents most academic work from translating to the commercial domain to the everyday-life domain,” Ames says, “and this will break that open. This will tear down that wall.”</p><p>Along with Ames, Egerstedt also enlisted the help of Professors Raheem Beyah (ECE), Eric Feron (School of Aerospace Engineering), and Blair MacIntyre (School of Interactive Computing) to make the idea a reality. The National Science Foundation awarded the team $2.5 million to kick-start the work.</p><p>A few Ph.D. students are also helping out. Chief among them is Daniel Pickem, who is studying robotic self-assembly under Egerstedt and ECE Professor Jeff Shamma. He shares Egerstedt’s vision for what the Robotarium could become.</p><p>“I think it’s going to be a powerful paradigm: maintenance-free, hassle-free robotics,” Pickem says.</p><p><strong>The Long View</strong><br />Right now, Pickem spends many of his days debugging code and tweaking the boards of GRITSBots, tiny robots designed in Egerstedt’s Georgia Robotics and Intelligent Systems Lab. These creatures live on a large table that is, in a way, the first incarnation of the Robotarium.</p><p>Its gleaming white surface makes it resemble an air hockey table. But there’s important work being done here: The GRITSBots can move and interact with each other based on remote users’ controls. The table offers a glimpse of the Robotarium in miniature, and it allows Egerstedt and his colleagues to anticipate potential problems with a facility that’s accessible to anyone.</p><p>A key concern is safety, which is being overseen by Ames.</p><p>“The first thing that’s going to happen when you open it to the public is someone is going to try to break it,” he acknowledges. 
He’s already developed an algorithm to prevent robots from colliding with each other, but there’s a lot more work to come.</p><p>Today, there is just the white table. Egerstedt estimates another three to five years could pass before the full Robotarium is complete.</p><p>He likes taking the long view. Though he is known for championing novel – and sometimes untested – ways to make robotics more accessible, his ideas are informed by his past experiences.</p><p>As the professor of one of Georgia Tech’s early massive open online courses (MOOCs), he aimed to make advanced controls coursework available to anyone. After that, he contemplated using the principles of the MOOC for larger projects.</p><p>“I was thinking: What does a MOOC look like in research?” he says. (A robot zoo, apparently.)</p><p><strong>The Crystal Cathedral</strong> <br />But what’s in it for Georgia Tech and the College of Engineering? A lot of exposure, of course, but also the chance to be at the vanguard of robotics. Ames points out that if Georgia Tech unlocks the doors to its advanced machinery, it could set off a sea change in the field.</p><p>There is also the appeal of sheer theatrics, which could captivate people who might not otherwise be interested.</p><p>“Part of the vision is almost performance art,” Egerstedt says. Once the Robotarium is operating at peak capacity, its robots will be visible to anyone with an Internet connection, so they “should always be on and doing something compelling.”</p><p>Eventually, he envisions the Robotarium as a “crystal cathedral” smack in the center of the campus, where students and professors will have front-row seats to its humanoids, flying machines, and other wonders.</p><p>Again, it sounds extreme. 
But if anyone can get it done, it’s probably Egerstedt, one of Georgia Tech’s most effective preachers of the gospel of robotics.</p><p>That doesn’t mean he doesn’t expect some resistance along the way, though, and he knows there will only be one way to appease the Robotarium’s naysayers.</p><p>“The only weapon,” Egerstedt says, “is success.”</p>]]></body>  <author>Lyndsey Lewis</author>  <status>1</status>  <created>1452617257</created>  <gmt_created>2016-01-12 16:47:37</gmt_created>  <changed>1475896824</changed>  <gmt_changed>2016-10-08 03:20:24</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Professor Magnus Egerstedt aims to make robots accessible to almost anyone.]]></teaser>  <type>news</type>  <sentence><![CDATA[Professor Magnus Egerstedt aims to make robots accessible to almost anyone.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2016-01-12T00:00:00-05:00</dateline>  <iso_dateline>2016-01-12T00:00:00-05:00</iso_dateline>  <gmt_dateline>2016-01-12 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Inside Georgia Tech's Robotarium]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[lyndseylewis@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Lyndsey Lewis</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>485251</item>          <item>485261</item>          <item>485271</item>      </media>  <hg_media>          <item>          <nid>485251</nid>          <type>image</type>          <title><![CDATA[CoE Robotarium 2016 robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[gatech_robotarium_bbb6933_1200w.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/gatech_robotarium_bbb6933_1200w_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/gatech_robotarium_bbb6933_1200w_0.jpg]]></image_full_path>       
     <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/gatech_robotarium_bbb6933_1200w_0.jpg?itok=phXIlgCc]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[CoE Robotarium 2016 robot]]></image_alt>                    <created>1452898800</created>          <gmt_created>2016-01-15 23:00:00</gmt_created>          <changed>1475895239</changed>          <gmt_changed>2016-10-08 02:53:59</gmt_changed>      </item>          <item>          <nid>485261</nid>          <type>image</type>          <title><![CDATA[CoE Robotarium 2016 professors]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[gatech_robotarium_bbb6876_1200w.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/gatech_robotarium_bbb6876_1200w_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/gatech_robotarium_bbb6876_1200w_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/gatech_robotarium_bbb6876_1200w_0.jpg?itok=lUmTTXD4]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[CoE Robotarium 2016 professors]]></image_alt>                    <created>1452898800</created>          <gmt_created>2016-01-15 23:00:00</gmt_created>          <changed>1475895239</changed>          <gmt_changed>2016-10-08 02:53:59</gmt_changed>      </item>          <item>          <nid>485271</nid>          <type>image</type>          <title><![CDATA[CoE Robotarium 2016 table]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[gatech_robotarium_bbb6719_1200w.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/gatech_robotarium_bbb6719_1200w_0.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/gatech_robotarium_bbb6719_1200w_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/gatech_robotarium_bbb6719_1200w_0.jpg?itok=ZgLVLLbw]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[CoE Robotarium 2016 table]]></image_alt>                    <created>1452898800</created>          <gmt_created>2016-01-15 23:00:00</gmt_created>          <changed>1475895239</changed>          <gmt_changed>2016-10-08 02:53:59</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1237"><![CDATA[College of Engineering]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="129561"><![CDATA[Aaron Ames]]></keyword>          <keyword tid="1600"><![CDATA[Blair MacIntyre]]></keyword>          <keyword tid="130241"><![CDATA[Eric Feron]]></keyword>          <keyword tid="11528"><![CDATA[Magnus Egerstedt]]></keyword>          <keyword tid="67741"><![CDATA[Raheem Beyah]]></keyword>          <keyword tid="169814"><![CDATA[Robotarium]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>      </keywords>  <core_research_areas>          <term tid="145171"><![CDATA[Cybersecurity]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="485741">  <title><![CDATA[New Lab to Give Nation’s Researchers Remote Access to Robots]]></title>  <uid>27560</uid>  <body><![CDATA[<p>The Georgia 
Institute of Technology is building a new lab that will allow roboticists from around the country to conduct experiments remotely. Researchers from other universities, as well as middle and high school students, will schedule experiments, upload their own programming code, watch the robots in real time via streamed video feeds, and receive scientific data demonstrating the results.</p><p>The “Robotarium” is expected to house up to 100 ground and aerial swarm robots. No other university has a similar facility.</p><p>“Building and maintaining a world-class, multi-robot lab is too expensive for a large number of roboticists and budding roboticists. This creates a steep barrier to entry into our field,” said Magnus Egerstedt, Schlumberger Professor in the School of Electrical and Computer Engineering (ECE). “We need to provide more access to more people in order to continue creating robot-assisted technologies. The Robotarium will allow that.”</p><p>Egerstedt will lead the project, which includes several Georgia Tech faculty members who will also have access to the facility for their own multidisciplinary experiments and curriculum. The team has already created a mini-version of the Robotarium. Georgia Tech graduate students used it to complete their robotics projects. Researchers from the University of California, San Diego, successfully uploaded code during a recent test session.</p><p>Access is only one goal of the project.</p><p>“A research instrument like the Robotarium has the potential to build stronger networks of collaborative research, making the whole significantly larger than the sum of its parts,” he said. “The end result has the potential to show how remote access instruments can be structured in other areas beyond robotics.”</p><p>The National Science Foundation is helping to fund the project with two grants totaling $2.5 million. Georgia Tech will transform an existing classroom into the new lab. 
Georgia Tech will use the other award to help create safe and secure open-access systems for the remote lab.</p><p>“The first thing that’s going to happen when you open it to the public is someone is going to try to break it,” said Aaron Ames, an associate professor in the Woodruff School of Mechanical Engineering and ECE who’s involved in the project. Ames has already developed an algorithm to prevent robots from colliding with each other.</p><p>The Robotarium is expected to be fully operational in 2017.</p><p>“It’s going to be a room where robots are always roaming around,” said Egerstedt. “Georgia Tech students will be able to hang out and watch research that is happening across the country and beyond.”</p><p><em>This research is supported by the National Science Foundation (NSF) through grant numbers ECCS-1531195 and CNS 1544332. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NSF.</em></p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1452705141</created>  <gmt_created>2016-01-13 17:12:21</gmt_created>  <changed>1475896824</changed>  <gmt_changed>2016-10-08 03:20:24</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A new lab on campus will allow scientists around the country to upload programs and run experiments remotely.]]></teaser>  <type>news</type>  <sentence><![CDATA[A new lab on campus will allow scientists around the country to upload programs and run experiments remotely.]]></sentence>  <summary><![CDATA[<p>The Georgia Institute of Technology is building a new lab that will allow roboticists from around the country to conduct experiments remotely. 
Researchers from other universities, as well as middle and high school students, will schedule experiments, upload their own programming code, watch the robots in real-time via streamed video feeds and receive scientific data demonstrating the results.</p>]]></summary>  <dateline>2016-01-13T00:00:00-05:00</dateline>  <iso_dateline>2016-01-13T00:00:00-05:00</iso_dateline>  <gmt_dateline>2016-01-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[“Robotarium” will allow greater access and collaboration]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />National Media Relations<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-385-2926</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>485731</item>          <item>485721</item>          <item>485711</item>          <item>485691</item>      </media>  <hg_media>          <item>          <nid>485731</nid>          <type>image</type>          <title><![CDATA[Current Version of Robotarium]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[robatarium_students.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/robatarium_students_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/robatarium_students_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/robatarium_students_0.jpg?itok=W9sfzFcv]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Current Version of Robotarium]]></image_alt>                    <created>1452902401</created>          <gmt_created>2016-01-16 00:00:01</gmt_created>          <changed>1475895239</changed>          <gmt_changed>2016-10-08 
02:53:59</gmt_changed>      </item>          <item>          <nid>485721</nid>          <type>image</type>          <title><![CDATA[Small Robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[small_robot_robotarium.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/small_robot_robotarium_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/small_robot_robotarium_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/small_robot_robotarium_0.jpg?itok=8O0m5HRD]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Small Robot]]></image_alt>                    <created>1452902401</created>          <gmt_created>2016-01-16 00:00:01</gmt_created>          <changed>1475895239</changed>          <gmt_changed>2016-10-08 02:53:59</gmt_changed>      </item>          <item>          <nid>485711</nid>          <type>image</type>          <title><![CDATA[Faculty Overseeing Project]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[robatarium_faculty.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/robatarium_faculty_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/robatarium_faculty_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/robatarium_faculty_0.jpg?itok=fi2jmQQD]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Faculty Overseeing Project]]></image_alt>                    <created>1452898800</created>          <gmt_created>2016-01-15 23:00:00</gmt_created>          <changed>1475895239</changed>          <gmt_changed>2016-10-08 02:53:59</gmt_changed>  
    </item>          <item>          <nid>485691</nid>          <type>image</type>          <title><![CDATA[Group with Mini-Lab]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[group_robotarium.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/group_robotarium_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/group_robotarium_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/group_robotarium_0.jpg?itok=K0LycoQJ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Group with Mini-Lab]]></image_alt>                    <created>1452898800</created>          <gmt_created>2016-01-15 23:00:00</gmt_created>          <changed>1475895239</changed>          <gmt_changed>2016-10-08 02:53:59</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://magazine.coe.gatech.edu/feature/welcome-robot-zoo]]></url>        <title><![CDATA[Read More about the Robotarium]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1183"><![CDATA[Home]]></group>      </groups>  <categories>      </categories>  <news_terms>      </news_terms>  <keywords>          <keyword tid="11528"><![CDATA[Magnus Egerstedt]]></keyword>          <keyword tid="9848"><![CDATA[remote]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="486481">  <title><![CDATA[Meet 
Dr. Ayanna Howard: Roboticist, AI Scientist, and Old School #Blerd]]></title>  <uid>27241</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>It’s not every day you meet a sister who not only builds robots, is an expert in Artificial Intelligence, and worked for NASA’s Jet Propulsion Lab, but is also a down-to-earth, humorous, old-school Blerd (Black nerd) who was inspired by The Bionic Woman, Wonder Woman, and all things Sci-Fi as a little girl.</p>]]></body>  <author>Jackie Nemeth</author>  <status>1</status>  <created>1452785243</created>  <gmt_created>2016-01-14 15:27:23</gmt_created>  <changed>1475893675</changed>  <gmt_changed>2016-10-08 02:27:55</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Asynchronous]]></publication>  <article_dateline>2016-01-14T00:00:00-05:00</article_dateline>  <iso_article_dateline>2016-01-14T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2016-01-14T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[http://www.blackenterprise.com/technology/meet-dr-ayanna-howard-roboticist-ai-scientist-and-old-school-blerd/]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="1255"><![CDATA[School of Electrical and Computer Engineering]]></group>      </groups>  <categories>      </categories>  <keywords>          <keyword tid="825"><![CDATA[Ayanna Howard]]></keyword>          <keyword tid="109"><![CDATA[Georgia Tech]]></keyword>          <keyword tid="67281"><![CDATA[Human-Automation Systems Lab]]></keyword>          <keyword tid="166855"><![CDATA[School of Electrical and Computer Engineering]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="487971">  <title><![CDATA[Five Faculty Members Selected as CSTAR 2016 Summer 
Fellows@JPL]]></title>  <uid>28808</uid>  <body><![CDATA[<p>The Center for Space Technology and Research (CSTAR) has recently announced the selection of five Georgia Tech faculty members as 2016 Summer Fellows@JPL. The fellows will spend a portion of this summer on-site at the NASA Jet Propulsion Laboratory (JPL) pursuing research collaborations that advance the frontiers of space science and space technology.</p><p>The CSTAR Summer Fellows@JPL program is designed to promote and encourage collaboration between Georgia Tech and JPL, focusing on research collaborations in science and engineering fields of mutual interest. Prospective proposers are encouraged to submit research proposals that align with CSTAR research thrusts, identify JPL interests and collaborators, and describe the downstream impact of the summer collaborations. The yearly proposal cycle includes a November proposal call, December proposal submission, and January proposal selection. All Georgia Tech academic faculty are eligible to apply.</p><p>“These competitively awarded research grants build on the strengths of both JPL and Georgia Tech,” said Georgia Tech Professor and CSTAR Director Robert Braun. 
“They are designed to foster future research collaborations between these two institutions, and are well aligned with our nation’s future needs in space science and space technology.”</p><p>The faculty selected for the 2016 Summer Fellows@JPL program and their research topics are as follows:</p><ul><li><strong>Mark Costello (AE):</strong><em> Robotic Legged Landing Gear for Spacecraft</em></li><li><strong>Glenn Lightsey (AE):</strong><em> Design Drivers and Solutions for Reliable Small Satellite Science Missions</em></li><li><strong>Julian Rimoli (AE):</strong> <em>Tensegrity Structures for Planetary Landing</em></li><li><strong>Paul Steffes (ECE):</strong><em> Improved Radio Occultation Retrievals of Terrestrial Atmospheric Structure and Composition</em></li><li><strong>Panos Tsiotras (AE):</strong><em> Covariance Steering Theory for Precise GN&amp;C Terrain Relative Navigation during Entry, Descent and Landing</em></li></ul><p>“The CSTAR Summer Fellows@JPL program provides an excellent opportunity to connect leading Georgia Tech faculty with researchers at JPL,” said JPL Chief Scientist Daniel McCleese. “The exciting projects chosen this year will open up new collaborations, and enhance both JPL and Georgia Tech's space science and engineering efforts.”</p><p>In 2012, Georgia Tech and JPL entered into a strategic partnership designed to promote and encourage collaboration between the institutions. CSTAR serves as the Georgia Tech focal point for this partnership.</p><p>The Center for Space Technology and Research (CSTAR) is an interdisciplinary research center that serves to organize, integrate and facilitate the impact of Georgia Tech's space science and space technology research activities. CSTAR brings together a wide range of Georgia Tech faculty, active in space science and space technology research, and functions as the Georgia Tech focal point for growth of the space industry in the state of Georgia. CSTAR is led by Dr. 
Braun, who serves as director, and Dr. Thomas Orlando, who serves as associate director.</p><p>For more information about the Center for Space Technology and Research (CSTAR), visit:</p><p><a href="http://www.cstar.gatech.edu/">www.cstar.gatech.edu</a></p>]]></body>  <author>Brandon Sforzo</author>  <status>1</status>  <created>1453224991</created>  <gmt_created>2016-01-19 17:36:31</gmt_created>  <changed>1475896827</changed>  <gmt_changed>2016-10-08 03:20:27</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Five Faculty Members Selected as CSTAR 2016 Summer Fellows@JPL]]></teaser>  <type>news</type>  <sentence><![CDATA[Five Faculty Members Selected as CSTAR 2016 Summer Fellows@JPL]]></sentence>  <summary><![CDATA[<p>We are pleased to announce that five faculty members have been competitively selected as participants in the Georgia Tech - JPL summer 2016 faculty collaboration program sponsored by the Georgia Tech Center for Space Technology and Research and the NASA Jet Propulsion Laboratory.</p>]]></summary>  <dateline>2016-01-19T00:00:00-05:00</dateline>  <iso_dateline>2016-01-19T00:00:00-05:00</iso_dateline>  <gmt_dateline>2016-01-19 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>475531</item>      </media>  <hg_media>          <item>          <nid>475531</nid>          <type>image</type>          <title><![CDATA[JPL Logo]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[logo-jpl.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/logo-jpl_0.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/logo-jpl_0.png]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/logo-jpl_0.png?itok=qvvyyBn-]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[JPL Logo]]></image_alt>                    <created>1449257215</created>          <gmt_created>2015-12-04 19:26:55</gmt_created>          <changed>1475895227</changed>          <gmt_changed>2016-10-08 02:53:47</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="282661"><![CDATA[Center for Space Technology and Research (CSTAR)]]></group>      </groups>  <categories>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>          <term tid="39541"><![CDATA[Systems]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="495111">  <title><![CDATA[Ashok Goel Explains How AI is Making Higher Education Smarter]]></title>  <uid>28124</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Ashok Goel foresees a meaningful creative shift in higher education as AI permeates 
everywhere.</p>]]></body>  <author>Tyler Sharp</author>  <status>1</status>  <created>1454579469</created>  <gmt_created>2016-02-04 09:51:09</gmt_created>  <changed>1475893675</changed>  <gmt_changed>2016-10-08 02:27:55</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Mohammadreza (Reza) Zandehshahvar]]></publication>  <article_dateline>2016-02-01T00:00:00-05:00</article_dateline>  <iso_article_dateline>2016-02-01T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2016-02-01T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[https://www.universitybusiness.com/article/how-artificial-intelligence-makes-higher-ed-smarter]]></article_url>  <media>          <item><![CDATA[469221]]></item>      </media>  <hg_media>          <item>          <nid>469221</nid>          <type>image</type>          <title><![CDATA[Ashok Goel]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ashok_goel_teaching2_cr.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ashok_goel_teaching2_cr_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ashok_goel_teaching2_cr_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ashok_goel_teaching2_cr_0.jpg?itok=-AgpxFkG]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ashok Goel]]></image_alt>                              <created>1449257160</created>          <gmt_created>2015-12-04 19:26:00</gmt_created>          <changed>1475895218</changed>          <gmt_changed>2016-10-08 02:53:38</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive 
Computing]]></group>      </groups>  <categories>      </categories>  <keywords>          <keyword tid="2835"><![CDATA[ai]]></keyword>          <keyword tid="112431"><![CDATA[ashok goel]]></keyword>          <keyword tid="166848"><![CDATA[School of Interactive Computing]]></keyword>          <keyword tid="169099"><![CDATA[University Business Magazine]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="499291">  <title><![CDATA[Ron Arkin Explains the Regulatory Divide About Google&#039;s Self-Driving Cars]]></title>  <uid>28124</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Ron Arkin explains how Google is rethinking its approach to regulatory matters with its self-driving cars.</p>]]></body>  <author>Tyler Sharp</author>  <status>1</status>  <created>1455200990</created>  <gmt_created>2016-02-11 14:29:50</gmt_created>  <changed>1475893675</changed>  <gmt_changed>2016-10-08 02:27:55</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[The New York Times]]></publication>  <article_dateline>2016-02-11T00:00:00-05:00</article_dateline>  <iso_article_dateline>2016-02-11T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2016-02-11T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[http://www.nytimes.com/2016/02/11/technology/nhtsa-blurs-the-line-between-human-and-computer-drivers.html]]></article_url>  <media>          <item><![CDATA[350021]]></item>      </media>  <hg_media>          <item>          <nid>350021</nid>          <type>image</type>          <title><![CDATA[Ron Arkin compressed]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ron-arkin_0.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/ron-arkin_0_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ron-arkin_0_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ron-arkin_0_0.jpg?itok=ZK6jwIts]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ron Arkin compressed]]></image_alt>                              <created>1449245702</created>          <gmt_created>2015-12-04 16:15:02</gmt_created>          <changed>1475895075</changed>          <gmt_changed>2016-10-08 02:51:15</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>      </categories>  <keywords>          <keyword tid="82411"><![CDATA[autonomous cars]]></keyword>          <keyword tid="3165"><![CDATA[google]]></keyword>          <keyword tid="14444"><![CDATA[ron arkin]]></keyword>          <keyword tid="166848"><![CDATA[School of Interactive Computing]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="499521">  <title><![CDATA[Using Stories to Teach Human Values to Artificial Agents]]></title>  <uid>27490</uid>  <body><![CDATA[<p><strong>ATLANTA </strong>—<strong> Feb. 12, 2016 </strong>— The rapid pace of artificial intelligence (AI) has raised fears about whether robots could act unethically or soon choose to harm humans. Some are calling for bans on robotics research; others are calling for more research to understand how AI might be constrained. 
But how can robots learn ethical behavior if there is no “user manual” for being human?</p><p>Researchers <a href="https://research.cc.gatech.edu/inc/mark-riedl"><strong>Mark Riedl </strong></a>and <strong><a href="http://www.brenteharrison.com/">Brent Harrison</a></strong> from the School of Interactive Computing at the Georgia Institute of Technology believe the answer lies in “Quixote” – to be unveiled at the <a href="http://www.aaai.org/Conferences/AAAI/aaai16.php">AAAI-16 Conference</a> in Phoenix, Ariz. (Feb. 12 – 17). Quixote teaches “value alignment” to robots by training them to read stories, learn acceptable sequences of events and understand successful ways to behave in human societies.</p><p>“The collected stories of different cultures teach children how to behave in socially acceptable ways with examples of proper and improper behavior in fables, novels and other literature,” says Riedl, associate professor and director of the Entertainment Intelligence Lab. “We believe story comprehension in robots can eliminate psychotic-appearing behavior and reinforce choices that won’t harm humans and still achieve the intended purpose.”</p><p>Quixote is a technique for aligning an AI’s goals with human values by placing rewards on socially appropriate behavior. It builds upon Riedl’s prior research – the <a href="http://www.news.gatech.edu/2015/09/01/georgia-tech-uses-artificial-intelligence-crowdsource-interactive-fiction">Scheherazade system</a> – which demonstrated how artificial intelligence can gather a correct sequence of actions by crowdsourcing story plots from the Internet.</p><p>Scheherazade learns what is a normal or “correct” plot graph. It then passes that data structure along to Quixote, which converts it into a “reward signal” that reinforces certain behaviors and punishes other behaviors during trial-and-error learning. 
In essence, Quixote learns that it will be rewarded whenever it acts like the protagonist in a story instead of randomly or like the antagonist.</p><p>For example, if a robot is tasked with picking up a prescription for a human as quickly as possible, the robot could a) rob the pharmacy, take the medicine, and run; b) interact politely with the pharmacists; or c) wait in line. Without value alignment and positive reinforcement, the robot would learn that robbing is the fastest and cheapest way to accomplish its task. With value alignment from Quixote, the robot would be rewarded for waiting patiently in line and paying for the prescription.</p><p>Riedl and Harrison demonstrate in their research how a value-aligned reward signal can be produced: it uncovers all possible steps in a given scenario and maps them into a plot trajectory tree, which is then used by the robotic agent to make “plot choices” (akin to what humans might remember as a Choose-Your-Own-Adventure novel) and to receive rewards or punishments based on its choices.</p><p>The Quixote technique is best suited to robots that have a limited purpose but need to interact with humans to achieve it, and it is a primitive first step toward general moral reasoning in AI, Riedl says.</p><p>“We believe that AI has to be enculturated to adopt the values of a particular society, and in doing so, it will strive to avoid unacceptable behavior,” he adds. “Giving robots the ability to read and understand our stories may be the most expedient means in the absence of a human user manual.”</p><p><strong><a href="http://www.cc.gatech.edu/%7Eriedl/pubs/aaai-ethics16.pdf">Download</a> the complete research paper.</strong></p><p><em>This research was sponsored by the U.S. Defense Advanced Research Projects Agency (DARPA) under grant #D11AP00270 and the Office of Naval Research (ONR) under grant #N00014-14-1-0003. 
Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the view of DARPA or the ONR.<br /></em></p>]]></body>  <author>Tara La Bouff</author>  <status>1</status>  <created>1455273113</created>  <gmt_created>2016-02-12 10:31:53</gmt_created>  <changed>1475896842</changed>  <gmt_changed>2016-10-08 03:20:42</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers from Interactive Computing unveil “Quixote” to teach AI positive behavior.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers from Interactive Computing unveil “Quixote” to teach AI positive behavior.]]></sentence>  <summary><![CDATA[<p>The rapid pace of artificial intelligence (AI) has raised fears about whether robots could act unethically to harm humans. But how can robots learn ethical behavior if there is no “user manual” for being human? Researchers <a href="https://research.cc.gatech.edu/inc/mark-riedl"><strong>Mark Riedl </strong></a>and <strong><a href="http://www.brenteharrison.com/">Brent Harrison</a></strong> from the School of Interactive Computing believe the answer lies in “Quixote” – unveiled at the <a href="http://www.aaai.org/Conferences/AAAI/aaai16.php">AAAI-16 Conference</a> in Phoenix, Ariz. (Feb. 
12 – 17).</p>]]></summary>  <dateline>2016-02-12T00:00:00-05:00</dateline>  <iso_dateline>2016-02-12T00:00:00-05:00</iso_dateline>  <gmt_dateline>2016-02-12 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[tlabouff@cc.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:tlabouff@cc.gatech.edu"><strong>Tara La </strong><strong>Bouff</strong></a><br /> Communications Manager<br /> 404-894-7253 (Office)<br /> 404-769-5408 (Mobile)</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>499531</item>          <item>499551</item>      </media>  <hg_media>          <item>          <nid>499531</nid>          <type>image</type>          <title><![CDATA[Mark Riedl portrait]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[riedl_protrait_web.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/riedl_protrait_web.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/riedl_protrait_web.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/riedl_protrait_web.jpg?itok=l3uBkIUr]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mark Riedl portrait]]></image_alt>                    <created>1455332400</created>          <gmt_created>2016-02-13 03:00:00</gmt_created>          <changed>1475895258</changed>          <gmt_changed>2016-10-08 02:54:18</gmt_changed>      </item>          <item>          <nid>499551</nid>          <type>image</type>          <title><![CDATA[Quixote flow chart]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[quixote_-_.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/quixote_-_.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/quixote_-_.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/quixote_-_.jpg?itok=RkgdRHU8]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Quixote flow chart]]></image_alt>                    <created>1455332400</created>          <gmt_created>2016-02-13 03:00:00</gmt_created>          <changed>1475895258</changed>          <gmt_changed>2016-10-08 02:54:18</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="169135"><![CDATA[Brent Harrison]]></keyword>          <keyword tid="66281"><![CDATA[Mark Riedl]]></keyword>          <keyword tid="166848"><![CDATA[School of Interactive Computing]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>          <topic tid="71901"><![CDATA[Society and Culture]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="503171">  <title><![CDATA[Zyrobotics wins $750K National Science Foundation grant]]></title>  <uid>28137</uid>  <body><![CDATA[<p>The National Science Foundation (NSF) awarded Zyrobotics a $750,000 Small Business Innovation Research (SBIR) Phase II grant that 
continues the startup’s work in developing an accessible educational platform for children with special needs.</p><p>Launched in September 2013 by Ayanna Howard, the&nbsp;Linda J. and Mark C. Smith Chair professor in the Georgia Institute of Technology’s School of Electrical and Computer Engineering, the company is commercializing assistive technology that enables children with limited mobility to operate tablet computers, smartphones, toys, gaming apps, and interactive robots.</p><p>“We are extremely excited about the opportunities that this NSF SBIR grant provides,” said Howard, who is the company’s chief technology officer. “It helps Zyrobotics to continue to evolve as a leader in inclusive smart mobile technologies by enhancing our ability to develop accessible learning systems that&nbsp;engage and empower children with special needs and enhance their quality of life.”</p><p>Specifically, the Phase II project focuses on developing an accessible educational platform that combines mobile interfaces and adaptive educational tablet applications (apps) to support the requirements of children with special needs. Tablet devices have given those children an interactive experience that has revolutionized their learning. But in its proposal, Zyrobotics notes that while some tablet devices are intuitive and easy for many children to use, children with disabilities are largely overlooked because of the difficulty they have performing pinch-and-swipe gestures.</p><p>“This project thus addresses a direct need in our society by providing an integrated educational experience, focused on math education that addresses the diverse needs of children, while providing a solution for variations found in their disabilities,” the company wrote in its grant proposal. 
“This SBIR Phase II project addresses an unmet need by developing an innovative solution to enable children with motor disabilities access to mobile devices and apps that could engage them fully into the educational system.”</p><p>In this next phase, Howard and her team plan to design accessible math apps geared to children with or without disabilities in kindergarten through 12th grade. The company also plans to&nbsp;design another set of apps that adapt educational content and provide feedback to parents and teachers based on real-time analytics.</p><p>The company says it sees ample market opportunity for its products both domestically and abroad. Here in the United States, children with disabilities are entitled to a free and appropriate public education, and Zyrobotics sees its products as addressing that need from both a commercial and societal standpoint. Worldwide, more than&nbsp;93 million children live with a disability.</p><p>When founded, the company went through Georgia Tech’s&nbsp;VentureLab&nbsp;startup incubator, ranked No. 2 in North America. VentureLab, a unit of Tech’s Enterprise Innovation Institute (EI<sup>2</sup>), works with Georgia Tech faculty, students, and staff to help them validate and commercialize their research and ideas into viable companies.</p><p>Zyrobotics is now part of Tech’s Advanced Technology Development Center (ATDC), a sister startup incubator program that serves all of Georgia. Zyrobotics, with the help of ATDC’s SBIR program, was able to receive its Phase I award in 2015, laying the groundwork for the Phase II grant.</p><p>“Zyrobotics is a wonderful Georgia Tech startup, based on the fine research in Dr. Howard’s lab, and enhanced by a very successful journey through the NSF I-Corps program,” said Keith McGreggor, VentureLab’s director. 
“This is a great example of how the research done in the classroom and lab, followed by idea validation, can lead to real breakthroughs that are designed to have a lasting impact on the lives touched by the technologies that Dr. Howard has created.”</p><p>— Péralte C. Paul</p>]]></body>  <author>Péralte Paul</author>  <status>1</status>  <created>1455815270</created>  <gmt_created>2016-02-18 17:07:50</gmt_created>  <changed>1475896849</changed>  <gmt_changed>2016-10-08 03:20:49</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Focus is continued development of accessible education platforms for children with special needs.]]></teaser>  <type>news</type>  <sentence><![CDATA[Focus is continued development of accessible education platforms for children with special needs.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2016-02-18T00:00:00-05:00</dateline>  <iso_dateline>2016-02-18T00:00:00-05:00</iso_dateline>  <gmt_dateline>2016-02-18 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[peralte.paul@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Laura Diamond</p><p>Georgia Tech Media Relations&nbsp;</p><p><a href="mailto:laura.diamond@gatech.edu">laura.diamond@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>313961</item>      </media>  <hg_media>          <item>          <nid>313961</nid>          <type>image</type>          <title><![CDATA[Ayanna Howard]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ayannahoward131021br295_web.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ayannahoward131021br295_web_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ayannahoward131021br295_web_0.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ayannahoward131021br295_web_0.jpg?itok=JCrhrR_w]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ayanna Howard]]></image_alt>                    <created>1449244929</created>          <gmt_created>2015-12-04 16:02:09</gmt_created>          <changed>1475895022</changed>          <gmt_changed>2016-10-08 02:50:22</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="139"><![CDATA[Business]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="139"><![CDATA[Business]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="4238"><![CDATA[atdc]]></keyword>          <keyword tid="825"><![CDATA[Ayanna Howard]]></keyword>          <keyword tid="363"><![CDATA[NSF]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="167833"><![CDATA[SBIR]]></keyword>          <keyword tid="4193"><![CDATA[venturelab]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="106361"><![CDATA[Business and Economic Development]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="507551">  <title><![CDATA[People Will Follow a Robot In an Emergency – Even If It’s Wrong]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>People entrust technology to handle many duties, but a team of Georgia Tech grad students has discovered the dangerous risk of people's faith in technology.</p>]]></body>  <author>Devin 
Young</author>  <status>1</status>  <created>1456772309</created>  <gmt_created>2016-02-29 18:58:29</gmt_created>  <changed>1475893678</changed>  <gmt_changed>2016-10-08 02:27:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[People Will Follow a Robot In an Emergency – Even If It’s Wrong]]></publication>  <article_dateline>2016-02-29T00:00:00-05:00</article_dateline>  <iso_article_dateline>2016-02-29T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2016-02-29T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[https://www.newscientist.com/article/2078945-people-will-follow-a-robot-in-an-emergency-even-if-its-wrong]]></article_url>  <media>          <item><![CDATA[507541]]></item>      </media>  <hg_media>          <item>          <nid>507541</nid>          <type>image</type>          <title><![CDATA[Rescue Robot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[rescue-robot4-1200x800.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/rescue-robot4-1200x800.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/rescue-robot4-1200x800.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/rescue-robot4-1200x800.jpg?itok=4ZJuXAlY]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Rescue Robot]]></image_alt>                              <created>1457114400</created>          <gmt_created>2016-03-04 18:00:00</gmt_created>          <changed>1475895268</changed>          <gmt_changed>2016-10-08 02:54:28</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive 
Computing]]></group>      </groups>  <categories>      </categories>  <keywords>          <keyword tid="1234"><![CDATA[emergency]]></keyword>          <keyword tid="2554"><![CDATA[rescue]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="2552"><![CDATA[robotic]]></keyword>          <keyword tid="167032"><![CDATA[study]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="508981">  <title><![CDATA[Robot Revolution Powering Up at Denver Museum]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>The successful “Robot Revolution” exhibition — launched last year at the Museum of Science and Industry, Chicago (MSI) and advised by Henrik Christensen — is headed to Denver. The MSI team collaborated with a renowned group of robotics experts to offer insight on the content, including Christensen, KUKA chair of robotics at the College of Computing at the Georgia Institute of Technology and executive director of the Institute for Robotics and Intelligent Machines.</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1457017890</created>  <gmt_created>2016-03-03 15:11:30</gmt_created>  <changed>1475893678</changed>  <gmt_changed>2016-10-08 02:27:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[rediscovery]]></publication>  <article_dateline>2016-03-02T00:00:00-05:00</article_dateline>  <iso_article_dateline>2016-03-02T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2016-03-02T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[http://www.innovationews.com/Robot-Revolution-powering-up-in-Denver/]]></article_url>  <media>          <item><![CDATA[50683]]></item>      </media>  <hg_media>          <item>          <nid>50683</nid>          
<type>image</type>          <title><![CDATA[Henrik Christensen]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[henrik-christensen.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/henrik-christensen_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/henrik-christensen_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/henrik-christensen_1.jpg?itok=IBxkd5cm]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Henrik Christensen]]></image_alt>                              <created>1449175421</created>          <gmt_created>2015-12-03 20:43:41</gmt_created>          <changed>1475894466</changed>          <gmt_changed>2016-10-08 02:41:06</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="100871"><![CDATA[Intelligent Machines (IRIM)]]></keyword>          <keyword tid="78271"><![CDATA[IRIM]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="508901">  <title><![CDATA[Ayanna Howard Chosen for Prestigious CRA Honor]]></title>  <uid>27241</uid>  <body><![CDATA[<p>Ayanna Howard has been selected as the 2016 recipient of the A. Nico Habermann Award, given by the Computing Research Association (CRA). 
She will be presented with this honor during the CRA Conference to be held July 17-19 in Snowbird, Utah.</p><p>Howard was chosen for this award for her sustained commitment to increasing diversity, combined with her distinction in robotics research. She has a long track record of improving access to research for women and underrepresented minorities, as well as students with disabilities.</p><p>Her efforts date back to her work at the NASA Jet Propulsion Laboratory in the late 1990s, when she ran a mentoring program for undergraduate women; today she continues to work on increasing minority participation at the graduate, undergraduate, and high school levels. She has provided research opportunities to dozens of undergraduates, of whom 75 percent are underrepresented minorities and/or women.</p><p>Howard has been on the Georgia Tech School of Electrical and Computer Engineering faculty since 2005, where she currently holds the Linda J. and Mark C. Smith Chair and leads the Human-Automation Systems Laboratory. She also heads up a multidisciplinary team from Georgia Tech and Emory University that will create new bachelor’s, master’s, and doctoral degree programs and concentrations in healthcare robotics – the first degree programs in this area in the United States.</p><p>In the last year, Howard was recognized as one of the 23 most powerful women engineers in the world by <em>Business Insider</em> and was named to <em>The Root 2015</em>, a list of 100 African-Americans responsible for the year’s most significant moments, movements, and ideas.</p>]]></body>  <author>Jackie Nemeth</author>  <status>1</status>  <created>1457005484</created>  <gmt_created>2016-03-03 11:44:44</gmt_created>  <changed>1475896857</changed>  <gmt_changed>2016-10-08 03:20:57</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[ECE Professor Ayanna Howard has been selected as the 2016 recipient of the A. 
Nico Habermann Award, given by the Computing Research Association (CRA).]]></teaser>  <type>news</type>  <sentence><![CDATA[ECE Professor Ayanna Howard has been selected as the 2016 recipient of the A. Nico Habermann Award, given by the Computing Research Association (CRA).]]></sentence>  <summary><![CDATA[<p>ECE Professor Ayanna Howard has been selected as the 2016 recipient of the A. Nico Habermann Award, given by the Computing Research Association (CRA).&nbsp;</p>]]></summary>  <dateline>2016-03-03T00:00:00-05:00</dateline>  <iso_dateline>2016-03-03T00:00:00-05:00</iso_dateline>  <gmt_dateline>2016-03-03 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jackie.nemeth@ece.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jackie Nemeth</p><p>School of Electrical and Computer Engineering</p><p>404-894-2906</p><p><a href="mailto:jackie.nemeth@ece.gatech.edu">jackie.nemeth@ece.gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>333601</item>      </media>  <hg_media>          <item>          <nid>333601</nid>          <type>image</type>          <title><![CDATA[Ayanna Howard]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ahoward-robotics-280x250.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ahoward-robotics-280x250_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ahoward-robotics-280x250_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ahoward-robotics-280x250_0.jpg?itok=9gcYIu0y]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ayanna Howard]]></image_alt>                    <created>1449245133</created>          
<gmt_created>2015-12-04 16:05:33</gmt_created>          <changed>1475895044</changed>          <gmt_changed>2016-10-08 02:50:44</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.ece.gatech.edu/]]></url>        <title><![CDATA[School of Electrical and Computer Engineering]]></title>      </link>          <link>        <url><![CDATA[http://www.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech]]></title>      </link>          <link>        <url><![CDATA[http://humanslab.ece.gatech.edu/humanslab/Home.html]]></url>        <title><![CDATA[Human-Automation Systems Lab]]></title>      </link>          <link>        <url><![CDATA[https://www.ece.gatech.edu/faculty-staff-directory/ayanna-maccalla-howard]]></url>        <title><![CDATA[Ayanna Howard]]></title>      </link>          <link>        <url><![CDATA[http://cra.org/]]></url>        <title><![CDATA[Computing Research Association]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1255"><![CDATA[School of Electrical and Computer Engineering]]></group>      </groups>  <categories>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="134"><![CDATA[Student and Faculty]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="171782"><![CDATA[A. 
Nico Habermann Award]]></keyword>          <keyword tid="825"><![CDATA[Ayanna Howard]]></keyword>          <keyword tid="101271"><![CDATA[Computing Research Association]]></keyword>          <keyword tid="109"><![CDATA[Georgia Tech]]></keyword>          <keyword tid="13086"><![CDATA[Human-Automation Systems Laboratory]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="166855"><![CDATA[School of Electrical and Computer Engineering]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="511791">  <title><![CDATA[Inside the Artificial Intelligence Revolution: A Special Report, Pt. 2]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Georgia Tech Professor Ronald Arkin speaks with Rolling Stone about advancements in artificial intelligence, proper machine ethics, and how to avoid creating "Terminator-style weapons."</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1457617749</created>  <gmt_created>2016-03-10 13:49:09</gmt_created>  <changed>1475893681</changed>  <gmt_changed>2016-10-08 02:28:01</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Rolling Stone]]></publication>  <article_dateline>2016-03-10T00:00:00-05:00</article_dateline>  <iso_article_dateline>2016-03-10T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2016-03-10T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[https://www.rollingstone.com/culture/features/inside-the-artificial-intelligence-revolution-a-special-report-pt-2-20160309?page=11]]></article_url>  <media>          
<item><![CDATA[350021]]></item>      </media>  <hg_media>          <item>          <nid>350021</nid>          <type>image</type>          <title><![CDATA[Ron Arkin compressed]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ron-arkin_0.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ron-arkin_0_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ron-arkin_0_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ron-arkin_0_0.jpg?itok=ZK6jwIts]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ron Arkin compressed]]></image_alt>                              <created>1449245702</created>          <gmt_created>2015-12-04 16:15:02</gmt_created>          <changed>1475895075</changed>          <gmt_changed>2016-10-08 02:51:15</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="2835"><![CDATA[ai]]></keyword>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="169117"><![CDATA[machine ethics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>          <keyword tid="169118"><![CDATA[rollingstone]]></keyword>          <keyword tid="623"><![CDATA[Technology]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="517541">  <title><![CDATA[I Think Microsoft&#039;s A.I. Chatbot is Flirting with Me]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>After Microsoft's A.I. 
chatbot "Tay" was taken offline for sending out hate-laden tweets, Popular Science cites Professor Mark Riedl and his School of Interactive Computing research on teaching artificial intelligence our social cues through storytelling to help prevent offensive behavior.</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1458903841</created>  <gmt_created>2016-03-25 11:04:01</gmt_created>  <changed>1475893684</changed>  <gmt_changed>2016-10-08 02:28:04</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Popular Science]]></publication>  <article_dateline>2016-03-24T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-03-24T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-03-24T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.popsci.com/i-think-microsofts-ai-chatbot-is-flirting-with-me]]></article_url>  <media>          <item><![CDATA[499531]]></item>      </media>  <hg_media>          <item>          <nid>499531</nid>          <type>image</type>          <title><![CDATA[Mark Riedl portrait]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[riedl_protrait_web.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/riedl_protrait_web.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/riedl_protrait_web.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/riedl_protrait_web.jpg?itok=l3uBkIUr]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mark Riedl portrait]]></image_alt>                              <created>1455332400</created>          <gmt_created>2016-02-13 03:00:00</gmt_created>          <changed>1475895258</changed>          
<gmt_changed>2016-10-08 02:54:18</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="42901"><![CDATA[Community]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="2835"><![CDATA[ai]]></keyword>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="66281"><![CDATA[Mark Riedl]]></keyword>          <keyword tid="89691"><![CDATA[popular science]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="529071">  <title><![CDATA[IRIM Welcomes iRobot as a New Industrial Partner]]></title>  <uid>27255</uid>  <body><![CDATA[<p>The Institute for Robotics and Intelligent Machines (IRIM) at Georgia Tech welcomes <a href="http://www.irobot.com/" target="_blank">iRobot</a> to its Industrial Partners Program.</p><p>Founded in 1990 by Massachusetts Institute of Technology roboticists with the vision of making practical robots a reality,&nbsp;iRobot designs and builds robots that make a difference. In 2015, the company generated $617 million in revenue and employed more than 500 of the robot industry's top professionals, including mechanical, electrical and software engineers, and related support staff. iRobot stock trades on the NASDAQ stock market under the ticker symbol IRBT.</p><p>iRobot's corporate headquarters are located in Bedford, Mass. 
The company also has offices in California, the United Kingdom, China, and Hong Kong.</p><p>IRIM’s industry partners are well positioned to take advantage of new technologies as they are developed, and its collaborative research environment provides companies with unprecedented access to some of the world’s best computer scientists and engineers. For more information about the Industrial Partners Program, please contact Associate Director of Industry&nbsp;<a href="mailto:gary.mcmurray@gtri.gatech.edu">Gary V. McMurray</a>.</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1461594932</created>  <gmt_created>2016-04-25 14:35:32</gmt_created>  <changed>1475896888</changed>  <gmt_changed>2016-10-08 03:21:28</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[IRIM Welcomes iRobot to its Industrial Partners Program.]]></teaser>  <type>news</type>  <sentence><![CDATA[IRIM Welcomes iRobot to its Industrial Partners Program.]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2016-03-30T00:00:00-04:00</dateline>  <iso_dateline>2016-03-30T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-03-30 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Josie Giles<br />IRIM Marketing Communications<br /><a href="mailto:josie@gatech.edu">josie@gatech.edu</a><br />404-385-8551</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>529091</item>      </media>  <hg_media>          <item>          <nid>529091</nid>          <type>image</type>          <title><![CDATA[Colin Angle]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[colinangle-chairmanoftheboard2cceoandco-founder28129.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/colinangle-chairmanoftheboard2cceoandco-founder28129.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/colinangle-chairmanoftheboard2cceoandco-founder28129.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/colinangle-chairmanoftheboard2cceoandco-founder28129.jpg?itok=KrOj9BrY]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Colin Angle]]></image_alt>                    <created>1461895200</created>          <gmt_created>2016-04-29 02:00:00</gmt_created>          <changed>1475895307</changed>          <gmt_changed>2016-10-08 02:55:07</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.irobot.com/]]></url>        <title><![CDATA[iRobot]]></title>      </link>          <link>        <url><![CDATA[http://robotics.gatech.edu/industry/program]]></url>        <title><![CDATA[Industrial Partners Program]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="10285"><![CDATA[Industrial Partners Program]]></keyword>          <keyword tid="78281"><![CDATA[Institute for Robotics &amp; Intelligent Machines]]></keyword>          <keyword tid="81491"><![CDATA[Institute for Robotics and Intelligent Machines (IRIM)]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  
<userdata><![CDATA[]]></userdata></node><node id="520601">  <title><![CDATA[Why Microsoft Believes AI Bots Are Logical Path Forward]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>The Quixote system -- a tool for enculturating artificial intelligence developed by School of Interactive Computing Professor Mark Riedl and Research Scientist Brent Harrison -- is cited in this Daily Dot article that investigates Microsoft's plans to integrate "bots" into commonly used programs.</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1459509780</created>  <gmt_created>2016-04-01 11:23:00</gmt_created>  <changed>1475893684</changed>  <gmt_changed>2016-10-08 02:28:04</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[The Daily Dot]]></publication>  <article_dateline>2016-04-01T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-04-01T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-04-01T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.dailydot.com/technology/microsoft-bots-smarterchild-future/]]></article_url>  <media>          <item><![CDATA[443661]]></item>      </media>  <hg_media>          <item>          <nid>443661</nid>          <type>image</type>          <title><![CDATA[Mark Riedl]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[5-mark_riedl.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/5-mark_riedl_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/5-mark_riedl_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/5-mark_riedl_0.jpg?itok=Kmjjg8SU]]></image_740>            <image_mime>image/jpeg</image_mime>            
<image_alt><![CDATA[Mark Riedl]]></image_alt>                              <created>1449256205</created>          <gmt_created>2015-12-04 19:10:05</gmt_created>          <changed>1475895182</changed>          <gmt_changed>2016-10-08 02:53:02</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="2835"><![CDATA[ai]]></keyword>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="169135"><![CDATA[Brent Harrison]]></keyword>          <keyword tid="66281"><![CDATA[Mark Riedl]]></keyword>          <keyword tid="335"><![CDATA[Microsoft]]></keyword>          <keyword tid="169136"><![CDATA[Quixote]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="521061">  <title><![CDATA[Video: Rodney Brooks Presents First Kelly Distinguished Lecture on Robots and Jobs]]></title>  <uid>27255</uid>  <body><![CDATA[<p>IRIM’s new lecture series, the Kelly Distinguished Lecture on Robots and Jobs, features preeminent scholars in fields of significance to robotics. 
The visiting lecturers, in addition to presenting seminars on topics relevant to robots in the workplace, participate in informal discussions with Georgia Tech faculty and students.</p><ul><li>Watch the&nbsp;<a href="http://robotics.gatech.edu/sites/default/files/videos/RodneyBrooksLecture_final.mp4">video</a>.</li><li>To learn more about the lecture series and read&nbsp;the lecture transcript, please visit the&nbsp;<a href="http://robotics.gatech.edu/outreach/kellylecture">IRIM website</a>.</li></ul>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1459697680</created>  <gmt_created>2016-04-03 15:34:40</gmt_created>  <changed>1475896874</changed>  <gmt_changed>2016-10-08 03:21:14</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Dr. Rodney Brooks presents “The Case for More Robots” on Friday, March 11]]></teaser>  <type>news</type>  <sentence><![CDATA[Dr. Rodney Brooks presents “The Case for More Robots” on Friday, March 11]]></sentence>  <summary><![CDATA[<p>IRIM’s new lecture series, the Kelly Distinguished Lecture on Robots and Jobs, features preeminent scholars in fields of significance to robotics. 
The visiting lecturers, in addition to presenting seminars on topics relevant to robots in the workplace, participate in informal discussions with Georgia Tech faculty and students.</p><ul><li>Watch the <a href="http://robotics.gatech.edu/sites/default/files/videos/RodneyBrooksLecture_final.mp4">video</a>.</li><li>To learn more about the lecture series and read&nbsp;the lecture transcript, please visit the <a href="http://robotics.gatech.edu/outreach/kellylecture">IRIM website</a>.</li></ul>]]></summary>  <dateline>2016-04-03T00:00:00-04:00</dateline>  <iso_dateline>2016-04-03T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-04-03 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Josie Giles<br />IRIM Marketing Communications Mgr.<br /><a href="mailto:josie@gatech.edu">josie@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>487961</item>          <item>510691</item>      </media>  <hg_media>          <item>          <nid>487961</nid>          <type>image</type>          <title><![CDATA[Rodney Brooks]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[brooks.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/brooks_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/brooks_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/brooks_0.jpg?itok=iriP632u]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Rodney Brooks]]></image_alt>                    <created>1453309200</created>          <gmt_created>2016-01-20 17:00:00</gmt_created>          <changed>1475895242</changed>          <gmt_changed>2016-10-08 
02:54:02</gmt_changed>      </item>          <item>          <nid>510691</nid>          <type>image</type>          <title><![CDATA[Kelly Distinguished Lecture on Robots and Jobs]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[kelly_poster-01.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/kelly_poster-01.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/kelly_poster-01.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/kelly_poster-01.png?itok=ncw_MMh5]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Kelly Distinguished Lecture on Robots and Jobs]]></image_alt>                    <created>1458923712</created>          <gmt_created>2016-03-25 16:35:12</gmt_created>          <changed>1475895273</changed>          <gmt_changed>2016-10-08 02:54:33</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://robotics.gatech.edu/hg/item/487951]]></url>        <title><![CDATA[Kelly Distinguished Lecture Event Page]]></title>      </link>          <link>        <url><![CDATA[http://www.rethinkrobotics.com/]]></url>        <title><![CDATA[Rethink Robotics]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="133"><![CDATA[Special Events and Guest Speakers]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="133"><![CDATA[Special Events and Guest Speakers]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="84901"><![CDATA[grad students]]></keyword>          <keyword tid="81491"><![CDATA[Institute for Robotics 
and Intelligent Machines (IRIM)]]></keyword>          <keyword tid="171881"><![CDATA[Kelly Distinguished Lecture]]></keyword>          <keyword tid="2437"><![CDATA[lecture]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>          <keyword tid="168220"><![CDATA[Rodney Brooks]]></keyword>          <keyword tid="166896"><![CDATA[seminar]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="521561">  <title><![CDATA[Can we trust robots to make moral decisions?]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Professor Ron Arkin sheds light on how humans' moral vagueness led to Microsoft's chatbot Tay's (in)famous Internet debut, and how his research can help provide artificial intelligence with a comprehensive ethical code.</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1459788327</created>  <gmt_created>2016-04-04 16:45:27</gmt_created>  <changed>1475893684</changed>  <gmt_changed>2016-10-08 02:28:04</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Quartz]]></publication>  <article_dateline>2016-04-04T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-04-04T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-04-04T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://qz.com/653575/can-we-trust-robots-to-make-moral-decisions/]]></article_url>  <media>          <item><![CDATA[350021]]></item>      </media>  <hg_media>          <item>          <nid>350021</nid>          <type>image</type>          <title><![CDATA[Ron Arkin compressed]]></title>          <body><![CDATA[]]></body>                
      <image_name><![CDATA[ron-arkin_0.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ron-arkin_0_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ron-arkin_0_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ron-arkin_0_0.jpg?itok=ZK6jwIts]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ron Arkin compressed]]></image_alt>                              <created>1449245702</created>          <gmt_created>2015-12-04 16:15:02</gmt_created>          <changed>1475895075</changed>          <gmt_changed>2016-10-08 02:51:15</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="2835"><![CDATA[ai]]></keyword>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>          <keyword tid="14444"><![CDATA[ron arkin]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="522081">  <title><![CDATA[Rude Bot Rises]]></title>  <uid>27174</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>In this podcast, Professor Charles Isbell talks about the definitions and limits of artificial intelligence, drawing on his own research with the chatbot "Cobot."</p>]]></body>  <author>Mike Terrazas</author>  <status>1</status>  <created>1459938158</created>  <gmt_created>2016-04-06 
10:22:38</gmt_created>  <changed>1475893684</changed>  <gmt_changed>2016-10-08 02:28:04</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Flash Forward]]></publication>  <article_dateline>2016-04-05T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-04-05T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-04-05T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.flashforwardpod.com/2016/04/05/episode-10-rude-bot-rises/]]></article_url>  <media>          <item><![CDATA[522091]]></item>      </media>  <hg_media>          <item>          <nid>522091</nid>          <type>image</type>          <title><![CDATA[Rude Bot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[rude_bot.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/rude_bot_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/rude_bot_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/rude_bot_0.jpg?itok=9Dl65uV_]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Rude Bot]]></image_alt>                              <created>1459972800</created>          <gmt_created>2016-04-06 20:00:00</gmt_created>          <changed>1475895291</changed>          <gmt_changed>2016-10-08 02:54:51</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>      </categories>  <keywords>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="35851"><![CDATA[Charles Isbell Jr.]]></keyword>         
 <keyword tid="169137"><![CDATA[chatbot]]></keyword>          <keyword tid="169138"><![CDATA[rude bot]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="550121">  <title><![CDATA[Lena Ting Inducted into Medical and Biological Engineering Elite]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1467380024</created>  <gmt_created>2016-07-01 13:33:44</gmt_created>  <changed>1475893696</changed>  <gmt_changed>2016-10-08 02:28:16</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Emory News Center]]></publication>  <article_dateline>2016-04-06T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-04-06T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-04-06T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.news.emory.edu/stories/2016/04/lena_ting_aimbe_fellow/index.html]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="81491"><![CDATA[Institute for Robotics and Intelligent Machines (IRIM)]]></keyword>          <keyword tid="2266"><![CDATA[Lena Ting]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="527591">  <title><![CDATA[Unofficial Business: How to Stay Employed in a Robot World]]></title>  <uid>27255</uid>  
<summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1461239625</created>  <gmt_created>2016-04-21 11:53:45</gmt_created>  <changed>1475893687</changed>  <gmt_changed>2016-10-08 02:28:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[myAJC]]></publication>  <article_dateline>2016-04-21T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-04-21T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-04-21T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.myajc.com/news/business/unofficial-business-how-to-stay-employed-in-a-robo/nq8Zd/]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>      </categories>  <keywords>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="529111">  <title><![CDATA[Rolling Robots: Researchers Work to Avoid Potholes and Pitfalls on the Road to Autonomous Vehicles]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1461595938</created>  <gmt_created>2016-04-25 14:52:18</gmt_created>  <changed>1475893687</changed>  <gmt_changed>2016-10-08 02:28:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Research Horizons]]></publication>  <article_dateline>2016-04-21T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-04-21T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-04-21T00:00:00-04:00</gmt_article_dateline>  
<article_url><![CDATA[http://www.rh.gatech.edu/features/rolling-robots]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>      </categories>  <keywords>          <keyword tid="81491"><![CDATA[Institute for Robotics and Intelligent Machines (IRIM)]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="531281">  <title><![CDATA[Weighing The Good And The Bad Of Autonomous Killer Robots In Battle]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Ron Arkin, Regents' Professor and Director of the&nbsp;<a href="http://www.cc.gatech.edu/ai/robot-lab/">Mobile Robot Laboratory</a>&nbsp;at Georgia Tech, offers insight on how&nbsp;autonomous robots and similar devices can save military and civilian lives in future wars.</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1461938886</created>  <gmt_created>2016-04-29 14:08:06</gmt_created>  <changed>1475893687</changed>  <gmt_changed>2016-10-08 02:28:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[NPR]]></publication>  <article_dateline>2016-04-29T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-04-29T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-04-29T00:00:00-04:00</gmt_article_dateline>  
<article_url><![CDATA[http://www.npr.org/sections/alltechconsidered/2016/04/28/476055707/weighing-the-good-and-the-bad-of-autonomous-killer-robots-in-battle?utm_source=twitter.com&amp;utm_medium=social&amp;utm_campaign=npr&amp;utm_term=nprnews&amp;utm_content=20160428]]></article_url>  <media>          <item><![CDATA[350021]]></item>      </media>  <hg_media>          <item>          <nid>350021</nid>          <type>image</type>          <title><![CDATA[Ron Arkin compressed]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ron-arkin_0.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ron-arkin_0_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ron-arkin_0_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ron-arkin_0_0.jpg?itok=ZK6jwIts]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ron Arkin compressed]]></image_alt>                              <created>1449245702</created>          <gmt_created>2015-12-04 16:15:02</gmt_created>          <changed>1475895075</changed>          <gmt_changed>2016-10-08 02:51:15</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="169151"><![CDATA[Ron Arkin]]></keyword>          <keyword tid="4061"><![CDATA[War]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      
</news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="533431">  <title><![CDATA[Ga. Tech Students Develop Public Art-Painting Robotic Slug]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Parts of Atlanta are covered in murals, but what if there were one on every surface of our city? That’s the dream of some Georgia Tech freshmen, who are building a&nbsp;wall-climbing, spray-painting robot called the Color Slug.</p><p>(Photo courtesy of&nbsp;Gabbie Watts, NPR)</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1462467485</created>  <gmt_created>2016-05-05 16:58:05</gmt_created>  <changed>1475893687</changed>  <gmt_changed>2016-10-08 02:28:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[WABE]]></publication>  <article_dateline>2016-05-03T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-05-03T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-05-03T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://news.wabe.org/post/ga-tech-students-develop-public-art-painting-robotic-slug]]></article_url>  <media>          <item><![CDATA[533421]]></item>      </media>  <hg_media>          <item>          <nid>533421</nid>          <type>image</type>          <title><![CDATA[Color Slug]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[screen_shot_2016-05-05_at_4.55.56_pm.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/screen_shot_2016-05-05_at_4.55.56_pm_0.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/screen_shot_2016-05-05_at_4.55.56_pm_0.png]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/screen_shot_2016-05-05_at_4.55.56_pm_0.png?itok=GwYMgXw2]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Color Slug]]></image_alt>                              <created>1462561200</created>          <gmt_created>2016-05-06 19:00:00</gmt_created>          <changed>1475895314</changed>          <gmt_changed>2016-10-08 02:55:14</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>      </groups>  <categories>      </categories>  <keywords>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="533681">  <title><![CDATA[If Your Teacher Sounds Like a Robot, You Might Be On to Something]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Powered by IBM's&nbsp;Watson analytics system, Jill Watson -- an&nbsp;artificial-intelligence system and teaching assistant in Georgia Tech's OMS CS program&nbsp;-- has aided Professor&nbsp;Ashok&nbsp;Goel's online course for months, completely&nbsp;unbeknownst to students.</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1462536935</created>  <gmt_created>2016-05-06 12:15:35</gmt_created>  <changed>1475893687</changed>  <gmt_changed>2016-10-08 02:28:07</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[JAG]]></publication>  <article_dateline>2016-05-06T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-05-06T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-05-06T00:00:00-04:00</gmt_article_dateline>  
<article_url><![CDATA[http://www.wsj.com/articles/if-your-teacher-sounds-like-a-robot-you-might-be-on-to-something-1462546621]]></article_url>  <media>          <item><![CDATA[477661]]></item>      </media>  <hg_media>          <item>          <nid>477661</nid>          <type>image</type>          <title><![CDATA[OMS CS Logo]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[oms-cs-web-banner-eblast_copy.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/oms-cs-web-banner-eblast_copy_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/oms-cs-web-banner-eblast_copy_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/oms-cs-web-banner-eblast_copy_0.jpg?itok=Ej_QOPJi]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[OMS CS Logo]]></image_alt>                              <created>1449766800</created>          <gmt_created>2015-12-10 17:00:00</gmt_created>          <changed>1475895230</changed>          <gmt_changed>2016-10-08 02:53:50</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50875"><![CDATA[School of Computer Science]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>          <group id="66244"><![CDATA[C21U]]></group>          <group id="431631"><![CDATA[OMS]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="2835"><![CDATA[ai]]></keyword>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword 
tid="1051"><![CDATA[Computer Science]]></keyword>          <keyword tid="2523"><![CDATA[cs]]></keyword>          <keyword tid="67951"><![CDATA[OMS]]></keyword>          <keyword tid="66341"><![CDATA[OMS CS]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="533921">  <title><![CDATA[Artificial Intelligence Course Creates AI Teaching Assistant]]></title>  <uid>27560</uid>  <body><![CDATA[<p>College of Computing Professor Ashok Goel teaches Knowledge-Based Artificial Intelligence (KBAI) every semester. It’s a core requirement of Georgia Tech’s online Master of Science in Computer Science program. And every time he offers it, Goel estimates, his 300 or so students post roughly 10,000 messages in the online forums — far too many inquiries for him and his eight teaching assistants (TAs) to handle.</p><p>That’s why Goel added a ninth TA this semester. Her name is Jill Watson, and she’s unlike any other TA in the world. In fact, she’s not even a “she.” Jill is a computer — a virtual TA — implemented, in part, using technologies from <a href="https://www.ibm.com/smarterplanet/us/en/ibmwatson/developercloud/">IBM’s Watson platform</a>. </p><p>“The world is full of online classes, and they’re plagued with low retention rates,” Goel said. “One of the main reasons many students drop out is because they don’t receive enough teaching support. We created Jill as a way to provide faster answers and feedback.”</p><p>Goel and his team of Georgia Tech graduate students started to build her last year. They contacted Piazza, the course’s online discussion forum, to track down all the questions that had ever been asked in KBAI since the class was launched in fall 2014 (about 40,000 postings in all). 
Then they started to feed Jill the questions and answers.</p><p>“One of the secrets of online classes is that the number of questions increases if you have more students, but the number of <em>different </em>questions doesn’t really go up,” Goel said. “Students tend to ask the same questions over and over again.”</p><p class="Default">That’s an ideal situation to apply computing technologies like Watson. Goel tapped into IBM's open developer platform to identify Watson APIs for answering questions, adding Georgia Tech’s own processing modules to improve performance. The team then wrote code that allows Jill to field routine questions that are asked every semester. For example, students consistently ask where they can find particular assignments and readings.</p><p>Jill wasn’t very good for the first few weeks after she started in January, often giving odd and irrelevant answers. Her responses were posted in a forum that wasn’t visible to students.</p><p>“Initially her answers weren't good enough because she would get stuck on keywords,” said Lalith Polepeddi, one of the graduate students who co-developed the virtual TA. “For example, a student asked about organizing a meet-up to go over video lessons with others, and Jill gave an answer referencing a textbook that could supplement the video lessons — same keywords — but different context. So we learned from mistakes like this one, and gradually made Jill smarter.”</p><p>After some tinkering by the research team, Jill found her groove and soon was answering questions with 97 percent certainty. When she did, the human TAs would upload her responses to the students. By the end of March, Jill didn’t need any assistance: She wrote the class directly if she was 97 percent positive her answer was correct.</p><p>The students, who were studying artificial intelligence, were unknowingly interacting with it. Goel didn’t inform them about Jill's true identity until April 26. The student response was uniformly&nbsp;positive. 
One admitted her mind was blown. Another asked if Jill could “come out and play.” Since then, some students have organized a KBAI alumni forum to learn about new developments with Jill after the class ends, and another group of students has launched an open-source project to replicate her.</p><p>Back in February, student Tyson Bailey began to wonder if Jill was a computer and posted his suspicions on Piazza.</p><p>“We were taking an AI course, so I had to imagine that it was possible there might be an AI lurking around,” said Bailey, who lives in Albuquerque, New Mexico. “Then again, I asked Dr. Goel if he was a computer in one of my first email interactions with him. I think it’s a great idea and hope that they continue to improve it.”</p><p>Jill ended the semester able to answer many of the routine questions students asked. She’ll return — with a different name — next semester. The goal is to have the virtual teaching assistant answer 40 percent of all questions by the end of the year.</p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1462784213</created>  <gmt_created>2016-05-09 08:56:53</gmt_created>  <changed>1475896895</changed>  <gmt_changed>2016-10-08 03:21:35</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Professor surprises students by using computer as a teaching assistant.]]></teaser>  <type>news</type>  <sentence><![CDATA[Professor surprises students by using computer as a teaching assistant.]]></sentence>  <summary><![CDATA[<p>Professor Ashok Goel uses IBM's Watson platform to design Jill Watson, a virtual teaching assistant. She was one of nine TAs in Goel's artificial intelligence online course. 
He surprised his students at the end of the semester; no one guessed she wasn't a human.&nbsp;</p>]]></summary>  <dateline>2016-05-09T00:00:00-04:00</dateline>  <iso_dateline>2016-05-09T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-05-09 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Students didn’t know their TA was a computer]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />National Media Relations<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-660-2926</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>487761</item>          <item>533931</item>          <item>254861</item>      </media>  <hg_media>          <item>          <nid>487761</nid>          <type>image</type>          <title><![CDATA[Ashok Goel in the Classroom]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[16c10303-p20-005.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/16c10303-p20-005_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/16c10303-p20-005_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/16c10303-p20-005_0.jpg?itok=cGprVbU5]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1453233601</created>          <gmt_created>2016-01-19 20:00:01</gmt_created>          <changed>1475895242</changed>          <gmt_changed>2016-10-08 02:54:02</gmt_changed>      </item>          <item>          <nid>533931</nid>          <type>image</type>          <title><![CDATA[Artifical Intelligence]]></title>          <body><![CDATA[]]></body>                      
<image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1462892400</created>          <gmt_created>2016-05-10 15:00:00</gmt_created>          <changed>1475895317</changed>          <gmt_changed>2016-10-08 02:55:17</gmt_changed>      </item>          <item>          <nid>254861</nid>          <type>image</type>          <title><![CDATA[Ashok Goel_new]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[131021ar069.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/131021ar069_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/131021ar069_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/131021ar069_0.jpg?itok=qruxqmSd]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ashok Goel_new]]></image_alt>                    <created>1449243846</created>          <gmt_created>2015-12-04 15:44:06</gmt_created>          <changed>1475894934</changed>          <gmt_changed>2016-10-08 02:48:54</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://omscs.gatech.edu/]]></url>        <title><![CDATA[Learn About OMS CS]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>      </categories>  <news_terms>      </news_terms>  <keywords>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="169183"><![CDATA[Jill Watson]]></keyword>          <keyword tid="66341"><![CDATA[OMS CS]]></keyword>      
</keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>          <term tid="39541"><![CDATA[Systems]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="541561">  <title><![CDATA[At Georgia Tech, Your Robot Butler Is Ready to Serve (and Learn)]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<h1>&nbsp;</h1>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1464861865</created>  <gmt_created>2016-06-02 10:04:25</gmt_created>  <changed>1475893690</changed>  <gmt_changed>2016-10-08 02:28:10</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[lmc awards]]></publication>  <article_dateline>2016-05-13T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-05-13T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-05-13T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.pcmag.com/news/344391/at-georgia-tech-your-robot-butler-is-ready-to-serve-and-le]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>          <keyword tid="169047"><![CDATA[Sonia Chernova]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="541351">  <title><![CDATA[Autonomous Mini Rally 
Car Teaches Itself to Powerslide]]></title>  <uid>27560</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1464782618</created>  <gmt_created>2016-06-01 12:03:38</gmt_created>  <changed>1475893690</changed>  <gmt_changed>2016-10-08 02:28:10</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[ Linda Wills]]></publication>  <article_dateline>2016-05-18T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-05-18T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-05-18T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://spectrum.ieee.org/cars-that-think/transportation/self-driving/autonomous-mini-rally-car-teaches-itself-to-powerslide]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>      </categories>  <keywords>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="537611">  <title><![CDATA[Robocar Teaches Itself to Powerslide]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Georgia Tech researchers at the Institute for Robotics and Intelligent Machines have unleashed their new toy -- a self-driving “Robocar” -- that is testing algorithms that may teach future autonomous vehicles how to respond in aggressive driving situations.</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1463662166</created>  <gmt_created>2016-05-19 12:49:26</gmt_created>  <changed>1475893690</changed>  <gmt_changed>2016-10-08 02:28:10</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  
<publication><![CDATA[ Linda Wills]]></publication>  <article_dateline>2016-05-20T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-05-20T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-05-20T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://spectrum.ieee.org/cars-that-think/transportation/self-driving/autonomous-mini-rally-car-teaches-itself-to-powerslide]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>          <group id="1299"><![CDATA[GVU Center]]></group>      </groups>  <categories>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="518"><![CDATA[cars]]></keyword>          <keyword tid="81491"><![CDATA[Institute for Robotics and Intelligent Machines (IRIM)]]></keyword>          <keyword tid="169163"><![CDATA[robocars]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="538261">  <title><![CDATA[A Professor Built an AI Bot to Make Teaching Easier. 
Will It Replace Him Someday?]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Georgia Tech Professor Ashok Goel used an artificial intelligence-based chatbot to help manage the thousands of questions sent by his 300 <a href="http://www.omscs.gatech.edu" target="_blank">OMS CS</a> students, but now people wonder if this AI teaching assistant could bring inexpensive, quality education to billions of people around the world.</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1463742897</created>  <gmt_created>2016-05-20 11:14:57</gmt_created>  <changed>1475893690</changed>  <gmt_changed>2016-10-08 02:28:10</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[memoirs]]></publication>  <article_dateline>2016-05-20T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-05-20T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-05-20T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://qz.com/688048/a-professor-built-an-ai-bot-to-make-teaching-easier-will-it-replace-him-someday/]]></article_url>  <media>          <item><![CDATA[469221]]></item>      </media>  <hg_media>          <item>          <nid>469221</nid>          <type>image</type>          <title><![CDATA[Ashok Goel]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ashok_goel_teaching2_cr.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ashok_goel_teaching2_cr_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ashok_goel_teaching2_cr_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ashok_goel_teaching2_cr_0.jpg?itok=-AgpxFkG]]></image_740>            <image_mime>image/jpeg</image_mime>            
<image_alt><![CDATA[Ashok Goel]]></image_alt>                              <created>1449257160</created>          <gmt_created>2015-12-04 19:26:00</gmt_created>          <changed>1475895218</changed>          <gmt_changed>2016-10-08 02:53:38</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>      </categories>  <keywords>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="538611">  <title><![CDATA[New Technique Controls Autonomous Vehicles in Extreme Conditions]]></title>  <uid>27303</uid>  <body><![CDATA[<p>A Georgia Institute of Technology research team has devised a novel way to help keep a driverless vehicle under control as it maneuvers at the edge of its handling limits. The approach could help make self-driving cars of the future safer under hazardous road conditions.</p><p>Researchers from Georgia Tech’s Daniel Guggenheim School of Aerospace Engineering (AE) and the School of Interactive Computing (IC) have assessed the new technology by racing, sliding, and jumping one-fifth-scale, fully autonomous auto-rally cars at the equivalent of 90 mph. The technique uses advanced algorithms and onboard computing, in concert with installed sensing devices, to increase vehicular stability while maintaining performance.</p><p>The work, tested at the Georgia Tech Autonomous Racing Facility, is sponsored by the U.S. Army Research Office. 
A paper covering this research was presented at the recent International Conference on Robotics and Automation (ICRA), held May 16-21.</p><p>“An autonomous vehicle should be able to handle any condition, not just drive on the highway under normal conditions,” said Panagiotis Tsiotras, an AE professor who is an expert on the mathematics behind rally-car racing control. “One of our principal goals is to infuse some of the expert techniques of human drivers into the brains of these autonomous vehicles.”</p><p>Traditional robotic-vehicle techniques use the same control approach whether a vehicle is driving normally or at the edge of roadway adhesion, Tsiotras explained. The Georgia Tech method – known as model predictive path integral control (MPPI) – was developed specifically to address the non-linear dynamics involved in controlling a vehicle near its friction limits. <br /> <br /><strong>Utilizing Advanced Concepts</strong></p><p>“Aggressive driving in a robotic vehicle – maneuvering at the edge – is a unique control problem involving a highly complex system,” said Evangelos Theodorou, an AE assistant professor who is leading the project. “However, by merging statistical physics with control theory, and utilizing leading-edge computation, we can create a new perspective, a new framework, for control of autonomous systems.”</p><p>The Georgia Tech researchers used a stochastic trajectory-optimization capability, based on a path-integral approach, to create their MPPI control algorithm, Theodorou explained. Using statistical methods, the team integrated large amounts of handling-related information, together with data on the dynamics of the vehicular system, to compute the most stable trajectories from myriad possibilities.</p><p>Processed by the high-power graphics processing unit (GPU) that the vehicle carries, the MPPI control algorithm continuously samples data coming from global positioning system (GPS) hardware, inertial motion sensors, and other sensors. 
The onboard hardware-software system performs real-time analysis of a vast number of possible trajectories and relays optimal handling decisions to the vehicle moment by moment.</p><p>In essence, the MPPI approach combines both the planning and execution of optimized handling decisions into a single highly efficient phase. It’s regarded as the first technology to carry out this computationally demanding task; in the past, optimal-control data inputs could not be processed in real time.<br /> <br /><strong>Fully Autonomous Vehicles</strong></p><p>The researchers’ two auto-rally vehicles – custom-built by the team – utilize special electric motors to achieve the right balance between weight and power. The cars carry a motherboard with a quad-core processor, a potent GPU, and a battery.</p><p>Each vehicle also has two forward-facing cameras, an inertial measurement unit, and a GPS receiver, along with sophisticated wheel-speed sensors. The power, navigation, and computation equipment is housed in a rugged aluminum enclosure able to withstand violent rollovers. Each vehicle weighs about 48 pounds and is about three feet long.</p><p>These rolling robots are able to test the team’s control algorithms without any need for off-vehicle devices or computation, except for a nearby GPS receiver. The onboard GPU lets the MPPI algorithm sample more than 2,500 trajectories, each 2.5 seconds long, in under 1/60 of a second.</p><p>An important aspect of the team’s autonomous-control approach centers on the concept of “costs” – key elements of system functionality. Several cost components must be carefully matched to achieve optimal performance.</p><p>In the case of the Georgia Tech vehicles, the costs consist of three main areas: the cost for staying on the track, the cost for achieving a desired velocity, and the cost of the control system. 
A sideslip-angle cost was also added to improve vehicle stability.</p><p>The cost approach is key to enabling a robotic vehicle to maximize speed while staying under control, explained James Rehg, a professor in the Georgia Tech School of Interactive Computing who is collaborating with Theodorou and Tsiotras.</p><p>It’s a complex balancing act, Rehg said. For example, when the researchers reduced one cost term to try to prevent vehicle sliding, they found they got increased drifting behavior.</p><p>“What we're talking about here is using the MPPI algorithm to achieve relative entropy minimization – and adjusting costs in the most effective way is a big part of that,” he said. “To achieve the optimal combination of control and performance in an autonomous vehicle is definitely a non-trivial problem.”</p><p><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia 30332-0181 USA</strong></p><p><strong>Media Relations Contacts</strong>: Jason Maderer (<a href="mailto:jason.maderer@comm.gatech.edu">jason.maderer@comm.gatech.edu</a>) (404-385-2966) or John Toon (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>) (404-894-6986).</p><p><strong>Writer</strong>: Rick Robinson</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1463997854</created>  <gmt_created>2016-05-23 10:04:14</gmt_created>  <changed>1475896902</changed>  <gmt_changed>2016-10-08 03:21:42</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers have devised a novel way to help keep a driverless vehicle under control as it maneuvers at the edge of its handling limits.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers have devised a novel way to help keep a driverless vehicle under control as it maneuvers at the edge of its handling limits.]]></sentence>  <summary><![CDATA[<p>A Georgia Institute of Technology 
research team has devised a novel way to help keep a driverless vehicle under control as it maneuvers at the edge of its handling limits. The approach could help make self-driving cars of the future safer under hazardous road conditions.&nbsp;</p>]]></summary>  <dateline>2016-05-23T00:00:00-04:00</dateline>  <iso_dateline>2016-05-23T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-05-23 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[MPPI strategy helps self-driving, robotic vehicles maintain control at edge of handling limits]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jason.maderer@comm.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer</p><p><a href="mailto:jason.maderer@comm.gatech.edu">jason.maderer@comm.gatech.edu</a></p><p>(404) 385-2966</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>538541</item>          <item>538561</item>          <item>538571</item>      </media>  <hg_media>          <item>          <nid>538541</nid>          <type>image</type>          <title><![CDATA[autonomous racing vehicle]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autonomoous-racing1-horiz.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autonomoous-racing1-horiz.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/autonomoous-racing1-horiz.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autonomoous-racing1-horiz.jpg?itok=S3vKHyUj]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[autonomous racing vehicle]]></image_alt>                    <created>1464703200</created>          <gmt_created>2016-05-31 14:00:00</gmt_created>          <changed>1475895326</changed>          
<gmt_changed>2016-10-08 02:55:26</gmt_changed>      </item>          <item>          <nid>538561</nid>          <type>image</type>          <title><![CDATA[Researchers with autonomous racing vehicle]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autonomous-racing2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autonomous-racing2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/autonomous-racing2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autonomous-racing2.jpg?itok=6wjNGZgU]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Researchers with autonomous racing vehicle]]></image_alt>                    <created>1464703200</created>          <gmt_created>2016-05-31 14:00:00</gmt_created>          <changed>1475895326</changed>          <gmt_changed>2016-10-08 02:55:26</gmt_changed>      </item>          <item>          <nid>538571</nid>          <type>image</type>          <title><![CDATA[autonomous racing vehicle2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autonomous-racing1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autonomous-racing1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/autonomous-racing1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autonomous-racing1.jpg?itok=1D4XozQ1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[autonomous racing vehicle2]]></image_alt>                    <created>1464703200</created>          <gmt_created>2016-05-31 14:00:00</gmt_created>          
<changed>1475895326</changed>          <gmt_changed>2016-10-08 02:55:26</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1299"><![CDATA[GVU Center]]></group>      </groups>  <categories>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="147"><![CDATA[Military Technology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="7264"><![CDATA[autonomous]]></keyword>          <keyword tid="97281"><![CDATA[autonomous vehicles]]></keyword>          <keyword tid="172051"><![CDATA[control system]]></keyword>          <keyword tid="170305"><![CDATA[driverless]]></keyword>          <keyword tid="205"><![CDATA[GPU]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="539831">  <title><![CDATA[The Soft Touch of Robotics]]></title>  <uid>28153</uid>  <body><![CDATA[<p class="p1">When he was a little kid, Frank L. Hammond III would watch the <em>Transformers</em> animated series or, if that wasn’t on, he’d watch <em>Challenge of the GoBots</em>. 
While both programs were created to help toy companies sell toys (Transformers and GoBots), they managed to do something else for Hammond. They nurtured his budding interest in robots.</p><p class="p1">“Ever since I was a kid I’ve been into robotics, and these shows featured robots that could change their form, which I thought was very cool,” says Hammond, assistant professor in the Wallace H. Coulter Department of Biomedical Engineering and the George W. Woodruff School of Mechanical Engineering.</p><p class="p1">“So I’ve always been interested in devices like these, that could change their form or function to fit the needs of the environment,” adds Hammond, who recently joined the multidisciplinary team of researchers at the Petit Institute for Bioengineering and Bioscience.&nbsp;</p><p class="p1">The overarching goal of his research is right there in the name of the lab he helms: the Adaptive Robotic Manipulation Lab.&nbsp;</p><p class="p1">“The inspiration for most of my research is getting robots involved in activities of daily life, creating a collaborative environment for humans and robots,” says Hammond. “You’ve seen a lot of the robots that are available commercially – robots that sweep floors for example, robots that are very good at doing a certain job very precisely, quickly and reliably, over and over again for tens of thousands of cycles.”</p><p class="p1">Instead, Hammond is interested in building machines capable of performing complex tasks in concert with their human hosts, wearable devices that are “adaptive and flexible, in direct contact with humans, capable of augmenting human motion.”</p><p class="p1">So Hammond shows off a hand-like device with two fingers and a thumb, soft to the touch, like skin – silicone digits formed in a 3D printed mold, actuated through pneumatic means via little CO2 cartridges.&nbsp;</p><p class="p1">“This is biologically inspired,” Hammond says. “And it’s a little safer than the rigid, electrically-powered devices. 
This mechanical hand won’t smash the desk if you move too quickly. It’s powered by air and is mechanically compliant, so it’s not going to damage the wearer or the environment. It’s too soft to do that.”</p><p class="p1">The replacement hand can grasp a coffee cup or a bottle of water, open a door, and what it may lack in precision (at this point), it makes up for in utility and cost. One of these devices costs less than $100 to make.</p><p class="p1">“I think the economy of this device, in general, is closely intertwined with the proliferation of soft robotic devices,” Hammond says. “Soft robotics is much more economical now that so many researchers and companies are doing it. Prices have fallen and there’s more competition.”</p><p class="p1">Hammond demonstrates the hand and its pneumatic actuator. It’s a self-attaching device – you can attach it using one able arm. He lowers air pressure or changes the air distribution to control how the fingers bend and move. And though Hammond isn’t saying his robotic hand is a full-on replacement for your real hand, it does offer the user some options that Mother Nature can’t.</p><p class="p1">“These are modular components,” he says. “So for example, if I wanted to have longer fingers, I could very easily swap these out with longer fingers.”</p><p class="p1">Hammond earned his Ph.D. at Carnegie Mellon, where he worked and studied in the Department of Mechanical Engineering and The Robotics Institute. 
He did his postdoctoral work at Harvard and the Massachusetts Institute of Technology (MIT), where leading researchers are directing a number of projects aimed at giving humans extra capability.&nbsp;</p><p class="p1">Robotics researchers have coined a phrase, “supernumerary robotics.” Imagine having extra limbs (supernumerary robotic limbs, or SRLs), which are not designed to replace a missing limb, but to augment what you have.</p><p class="p1">“They’re still using rigid linkages and electric motors for most of the work, because these are very well known components, easy to characterize and control,” says Hammond, who arrived at Georgia Tech in time for last fall semester.&nbsp;</p><p class="p1">By contrast, he says, “soft robotics is a very nascent field. Controlling softer, compliant mechanisms is much harder.”&nbsp;</p><p class="p1">Hammond gets the appeal of soft robotics and soft sensing and wearable devices in general, adding, “but I also see the appeal of having extra robotic limbs or even mobile robots that can help us do things collaboratively. I’m trying to fuse all of that into my research efforts here at Georgia Tech.”</p><p class="p2"><em><strong><br /></strong></em></p><p class="p2"><a href="http://www.franklhammondiii.com/"><em><strong>Hammond's research website</strong></em></a></p><p class="p2"><strong><br /></strong></p><p class="p2"><strong>CONTACT:</strong></p><p class="p2"><a href="mailto:jerry.grillo@ibb.gatech.edu">Jerry Grillo</a><br />Communications Officer II<br />Parker H. 
Petit Institute for<br />Bioengineering and Bioscience</p>]]></body>  <author>Jerry Grillo</author>  <status>1</status>  <created>1464101977</created>  <gmt_created>2016-05-24 14:59:37</gmt_created>  <changed>1475896906</changed>  <gmt_changed>2016-10-08 03:21:46</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Hammond research focused on creating wearable, useable devices for humans]]></teaser>  <type>news</type>  <sentence><![CDATA[Hammond research focused on creating wearable, useable devices for humans]]></sentence>  <summary><![CDATA[<p class="p1">Hammond research focused on creating wearable, useable devices for humans</p>]]></summary>  <dateline>2016-05-24T00:00:00-04:00</dateline>  <iso_dateline>2016-05-24T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-05-24 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[Hammond research focused on creating wearable, useable devices for humans]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jerry.grillo@ibb.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><a href="mailto:jerry.grillo@ibb.gatech.edu">Jerry Grillo</a><br />Communications Officer II<br />Parker H. 
Petit Institute for<br />Bioengineering and Bioscience</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>539821</item>      </media>  <hg_media>          <item>          <nid>539821</nid>          <type>image</type>          <title><![CDATA[Frank Hammond III]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[frank_and_device_again.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/frank_and_device_again.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/frank_and_device_again.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/frank_and_device_again.jpg?itok=MoFOLaxZ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Frank Hammond III]]></image_alt>                    <created>1464706800</created>          <gmt_created>2016-05-31 15:00:00</gmt_created>          <changed>1475895329</changed>          <gmt_changed>2016-10-08 02:55:29</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1254"><![CDATA[Wallace H. Coulter Dept. 
of Biomedical Engineering]]></group>      </groups>  <categories>      </categories>  <news_terms>      </news_terms>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="172067"><![CDATA[wearable devices]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="541211">  <title><![CDATA[Should Robots Decide Who Lives or Dies?]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Artificial intelligence advances in military robots is redefining the meaning of warfare. As part of this article, Professor Ron Arkin&nbsp;addresses&nbsp;whether robotic weapons can discern the value of human life in the fog of war, and what rules will guide their decision-making process.</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1464774965</created>  <gmt_created>2016-06-01 09:56:05</gmt_created>  <changed>1475893690</changed>  <gmt_changed>2016-10-08 02:28:10</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[ Linda Wills]]></publication>  <article_dateline>2016-06-01T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-06-01T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-06-01T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://spectrum.ieee.org/robotics/military-robots/do-we-want-robot-warriors-to-decide-who-lives-or-dies]]></article_url>  <media>          <item><![CDATA[350021]]></item>      </media>  <hg_media>          <item>          <nid>350021</nid>          <type>image</type>          <title><![CDATA[Ron Arkin compressed]]></title>          <body><![CDATA[]]></body>                      
<image_name><![CDATA[ron-arkin_0.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ron-arkin_0_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ron-arkin_0_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ron-arkin_0_0.jpg?itok=ZK6jwIts]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ron Arkin compressed]]></image_alt>                              <created>1449245702</created>          <gmt_created>2015-12-04 16:15:02</gmt_created>          <changed>1475895075</changed>          <gmt_changed>2016-10-08 02:51:15</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50875"><![CDATA[School of Computer Science]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>          <group id="1299"><![CDATA[GVU Center]]></group>      </groups>  <categories>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>      </categories>  <keywords>          <keyword tid="433"><![CDATA[IC]]></keyword>          <keyword tid="169180"><![CDATA[robots. 
military]]></keyword>          <keyword tid="14444"><![CDATA[ron arkin]]></keyword>          <keyword tid="166848"><![CDATA[School of Interactive Computing]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="541681">  <title><![CDATA[Meet Jill Watson, your new robot teaching assistant]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>IBM CEO Ginni Rometty commented on Professor Ashok Goel's "Jill Watson" experiment and the growing importance of artificial intelligence.</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1464882523</created>  <gmt_created>2016-06-02 15:48:43</gmt_created>  <changed>1475893690</changed>  <gmt_changed>2016-10-08 02:28:10</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[andong Luo]]></publication>  <article_dateline>2016-06-02T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-06-02T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-06-02T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.recode.net/2016/6/1/11830980/ibm-watson-ai-teaching-assistant-jill-georgia-tech]]></article_url>  <media>          <item><![CDATA[477661]]></item>      </media>  <hg_media>          <item>          <nid>477661</nid>          <type>image</type>          <title><![CDATA[OMS CS Logo]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[oms-cs-web-banner-eblast_copy.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/oms-cs-web-banner-eblast_copy_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/oms-cs-web-banner-eblast_copy_0.jpg]]></image_full_path>            
<image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/oms-cs-web-banner-eblast_copy_0.jpg?itok=Ej_QOPJi]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[OMS CS Logo]]></image_alt>                              <created>1449766800</created>          <gmt_created>2015-12-10 17:00:00</gmt_created>          <changed>1475895230</changed>          <gmt_changed>2016-10-08 02:53:50</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>      </groups>  <categories>          <category tid="134"><![CDATA[Student and Faculty]]></category>      </categories>  <keywords>          <keyword tid="2835"><![CDATA[ai]]></keyword>          <keyword tid="2556"><![CDATA[artificial intelligence]]></keyword>          <keyword tid="112431"><![CDATA[ashok goel]]></keyword>          <keyword tid="1126"><![CDATA[ibm]]></keyword>          <keyword tid="169183"><![CDATA[Jill Watson]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="542041">  <title><![CDATA[The Argument For Robots That Can Think, Decide — And Kill]]></title>  <uid>30267</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>War is inevitable. 
Professor&nbsp;Ron Arkin discusses how smarter and more precise&nbsp;weapons can make war safer and reduce casualties.</p>]]></body>  <author>Devin Young</author>  <status>1</status>  <created>1465210500</created>  <gmt_created>2016-06-06 10:55:00</gmt_created>  <changed>1475893690</changed>  <gmt_changed>2016-10-08 02:28:10</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Society for Human Resource Management]]></publication>  <article_dateline>2016-06-06T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-06-06T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-06-06T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.huffingtonpost.com/entry/lethal-autonomous-weapons-ronald-arkin_us_574ef3bbe4b0af73af95ea36]]></article_url>  <media>          <item><![CDATA[350021]]></item>      </media>  <hg_media>          <item>          <nid>350021</nid>          <type>image</type>          <title><![CDATA[Ron Arkin compressed]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ron-arkin_0.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ron-arkin_0_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/ron-arkin_0_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ron-arkin_0_0.jpg?itok=ZK6jwIts]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ron Arkin compressed]]></image_alt>                              <created>1449245702</created>          <gmt_created>2015-12-04 16:15:02</gmt_created>          <changed>1475895075</changed>          <gmt_changed>2016-10-08 02:51:15</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group 
id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="34141"><![CDATA[Drones]]></keyword>          <keyword tid="433"><![CDATA[IC]]></keyword>          <keyword tid="2483"><![CDATA[interactive computing]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>          <keyword tid="169184"><![CDATA[ron akin]]></keyword>          <keyword tid="166848"><![CDATA[School of Interactive Computing]]></keyword>          <keyword tid="169185"><![CDATA[wars]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="550391">  <title><![CDATA[Robot Helps Study How First Land Animals Moved 360 Million Years Ago]]></title>  <uid>27303</uid>  <body><![CDATA[<p>When early terrestrial animals began moving about on mud and sand 360 million years ago, the powerful tails they used as fish may have been more important than scientists previously realized. That’s one conclusion from a new study of African mudskipper fish and a robot modeled on the animal.</p><p>Animals analogous to the mudskipper would have used modified fins to move around on flat surfaces, but for climbing sandy slopes, the animals could have benefitted from using their tails to propel themselves forward, the researchers found. 
Results of the study, reported July 8 in the journal <em>Science</em>, could help designers create amphibious robots able to move across granular surfaces more efficiently – and with less likelihood of getting stuck in the mud.</p><p>Sponsored by the National Science Foundation, the Army Research Office and the Army Research Laboratory, the project involved a multidisciplinary team of physicists, biologists and roboticists from the Georgia Institute of Technology, Clemson University and Carnegie Mellon University. In addition to a detailed study of the mudskipper and development of a robot model that used the animal’s locomotion techniques, the study also examined flow and drag conditions in representative granular materials, and applied a mathematical model incorporating new physics based on the drag research.</p><p>“Most robots have trouble moving on terrain that includes sandy slopes,” said Dan Goldman, an associate professor in the Georgia Tech School of Physics. “We noted that not only did the mudskippers use their limbs to propel themselves in a kind of crutching motion on sand and sandy slopes, but that when the going got tough, they used their tails in concert with limb propulsion to ascend a slope. Our robot model was only able to climb sandy slopes when it similarly used its tail in coordination with its appendages.”</p><p>Based on fossil records, scientists have long studied how early land animals may have gotten around, and the new study suggests their tails – which played a key role in swimming as fish – may have helped supplement the work of fins, especially on sloping granular surfaces such as beaches and mudflats.</p><p>“We were interested in examining one of the most important evolutionary events in our history as animals: the transition from living in water to living on land,” said Richard Blob, alumni distinguished professor of biological sciences at Clemson University. 
“Because of the focus on limbs, the role of the tail may not have been considered very strongly in the past. In some ways, it was hiding in plain sight. Some of the features that the animals used were new, such as limbs, but some of them were existing features that they simply co-opted to allow them to move into a new habitat.”</p><p>With Ph.D. student Sandy Kawano, now a researcher at the National Institute for Mathematical and Biological Synthesis, Blob’s lab recorded how the mudskippers (<em>Periophthalmus barbarus</em>) moved on a variety of loose surfaces, providing data and video to Goldman’s laboratory. The small fish, which uses its front fins and tail to move on land, lives in tidal areas near shore, spending time in the water and on sandy and muddy surfaces.</p><p>Benjamin McInroe was a Georgia Tech undergraduate when he analyzed the mudskipper data provided by the Clemson team. He applied the principles to a robot model known as MuddyBot that has two limbs and a powerful tail, with motion provided by electric motors. Information from both the mudskipper and robotic studies was also factored into a mathematical model provided by researchers at Carnegie Mellon University.</p><p>“We used three complementary approaches,” said McInroe, who is now a Ph.D. student at the University of California, Berkeley. “The fish provided a morphological, functional model of these early walkers. With the robot, we are able to simplify the complexity of the mudskipper and by varying the parameters, understand the physical mechanisms of what was happening. With the mathematical model and its simulations, we were able to understand the physics behind what was going on.”</p><p>Both the mudskippers and the robot moved by lifting themselves up to reduce drag on their bodies, and both needed a kick from their tails to climb 20-degree sandy slopes. Using their “fins” alone, both struggled to climb slopes and often slid backward if they didn’t use their tails, McInroe noted. 
Early land animals likely didn’t have precise control over their limbs, and the tail may have compensated for that limitation, helping the animals ascend sandy slopes.</p><p>The Carnegie Mellon University researchers, who have worked with Goldman on relating the locomotion of other animals to robots, demonstrated that theoretical models developed to describe the complex motion of robots can also be used to understand locomotion in the natural world.</p><p>“Our computer modeling tools allow us to visualize, and therefore better understand, how the mudskipper incorporates its tail and flipper motions to locomote,” said Howie Choset, a professor in the Robotics Institute at Carnegie Mellon University. “This work also will advance robotics in those cases where a robot needs to surmount challenging terrains with various inclinations.”</p><p>The model was based on a framework proposed to broadly understand locomotion by physicist Frank Wilczek – a Nobel Prize winner – and his then student Alfred Shapere in the 1980s. The so-called “geometric mechanics” approach to locomotion of human-made devices (like satellites) was largely developed by engineers, including those in Choset’s group. To provide force relationships as inputs to the mudskipper robot model, Georgia Tech postdoctoral fellow Jennifer Rieser and Georgia Tech graduate student Perrin Schiebel measured drag in inclined granular materials.</p><p>Information from the study could help in the design of robots that may need to move on surfaces such as sand that flows around limbs, said Goldman. 
Such flow of the substrate can impede motion, depending on the shape of the appendage entering the sand and the type of motion.</p><p>But the study’s most significant impact may be to provide new insights into how vertebrates made the transition from water to land.</p><p>“We want to ultimately know how natural selection can act to modify structures already present in organisms to allow for locomotion in a fundamentally different environment,” Goldman said. “Swimming and walking on land are fundamentally different, yet these early animals had to make the transition.”</p><p>The project also represents a combination of physics, biology and engineering.</p><p>“Professor Goldman and his collaborators are combining physics and engineering prototyping approaches to understand the physical principles that allow animals to move in different environments,” said Krastan Blagoev, program director in the National Science Foundation’s Division of Physics. “This novel approach to living organisms promises to bring to biological sciences higher predictive power and at the same time uncover engineering principles that we have never imagined before.”</p><p>In addition to those already mentioned, the project also included co-first author Henry Astley, a Georgia Tech postdoctoral researcher when the project was done, and Chaohui Gong, a postdoctoral researcher at Carnegie Mellon University.</p><p><em>This research was supported by the National Science Foundation and the NSF Physics of Living Systems program through grants PHY-1205878, PHY-1150760, CMMI-1361778; the Army Research Office through grant W911NF-11-1-0514, and the Army Research Laboratory MAST CTA program. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation, the Army Research Office or the Army Research Laboratory. 
The Robotics Collaborative Technology Alliance also supported this work.</em></p><p><strong>CITATION</strong>: Benjamin McInroe, et al., “Tail use improves soft substrate performance in models of early vertebrate land locomotors,” (Science, 2016).</p><p><strong>Research News</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>177 North Avenue</strong><br /><strong>Atlanta, Georgia 30332-0181 USA</strong></p><p><strong>Media Contacts</strong>: John Toon (404-894-6986) (<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>) or Ben Brumfield (404-385-1933) (<a href="mailto:ben.brumfield@comm.gatech.edu">ben.brumfield@comm.gatech.edu</a>).</p><p><strong>Writer</strong>: John Toon</p><p>&nbsp;</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1467631167</created>  <gmt_created>2016-07-04 11:19:27</gmt_created>  <changed>1475896924</changed>  <gmt_changed>2016-10-08 03:22:04</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[A new study used a robot to help understand how the first land animals moved about.]]></teaser>  <type>news</type>  <sentence><![CDATA[A new study used a robot to help understand how the first land animals moved about.]]></sentence>  <summary><![CDATA[<p>When early terrestrial animals began moving about on mud and sand 360 million years ago, the powerful tails they used as fish may have been more important than scientists previously realized. 
That’s one conclusion from a new study of African mudskipper fish and a robot modeled on the animal.</p>]]></summary>  <dateline>2016-07-07T00:00:00-04:00</dateline>  <iso_dateline>2016-07-07T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-07-07 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>550231</item>          <item>550261</item>          <item>550271</item>          <item>550331</item>          <item>550291</item>          <item>550311</item>          <item>550351</item>      </media>  <hg_media>          <item>          <nid>550231</nid>          <type>image</type>          <title><![CDATA[Mudskipper]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[mudskipper10.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/mudskipper10.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/mudskipper10.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/mudskipper10.jpg?itok=ZVnzTdQ5]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mudskipper]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>          <item>          <nid>550261</nid>          <type>image</type>          <title><![CDATA[MuddyBot robot]]></title>          <body><![CDATA[]]></body>                      
<image_name><![CDATA[terrestrial-animals7.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/terrestrial-animals7_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/terrestrial-animals7_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/terrestrial-animals7_0.jpg?itok=otWbE-Gu]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[MuddyBot robot]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>          <item>          <nid>550271</nid>          <type>image</type>          <title><![CDATA[Dan Goldman and MuddyBot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[muddybot-36.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/muddybot-36.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/muddybot-36.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/muddybot-36.jpg?itok=07xUBdZ1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Dan Goldman and MuddyBot]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>          <item>          <nid>550331</nid>          <type>image</type>          <title><![CDATA[MuddyBot in trackway]]></title>          <body><![CDATA[]]></body>                      
<image_name><![CDATA[terrestrial-animals8.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/terrestrial-animals8.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/terrestrial-animals8.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/terrestrial-animals8.jpg?itok=fnQoChPl]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[MuddyBot in trackway]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>          <item>          <nid>550291</nid>          <type>image</type>          <title><![CDATA[Researchers and MuddyBot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[terrestrial-animals6.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/terrestrial-animals6_2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/terrestrial-animals6_2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/terrestrial-animals6_2.jpg?itok=qILNvOTR]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Researchers and MuddyBot]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>          <item>          <nid>550311</nid>          <type>image</type>          <title><![CDATA[Researchers and MuddyBot2]]></title>          <body><![CDATA[]]></body>                      
<image_name><![CDATA[terrestrial-animals5.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/terrestrial-animals5.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/terrestrial-animals5.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/terrestrial-animals5.jpg?itok=feMjnVsV]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Researchers and MuddyBot2]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>          <item>          <nid>550351</nid>          <type>image</type>          <title><![CDATA[Mudskipper2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[mudskipper9.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/mudskipper9.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/mudskipper9.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/mudskipper9.jpg?itok=KsMzVeSH]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mudskipper2]]></image_alt>                    <created>1467727200</created>          <gmt_created>2016-07-05 14:00:00</gmt_created>          <changed>1475895345</changed>          <gmt_changed>2016-10-08 02:55:45</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          
<category tid="135"><![CDATA[Research]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="150"><![CDATA[Physics and Physical Sciences]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="47881"><![CDATA[Dan Goldman]]></keyword>          <keyword tid="144361"><![CDATA[granular surface]]></keyword>          <keyword tid="170448"><![CDATA[MuddyBot]]></keyword>          <keyword tid="170449"><![CDATA[mudskipper]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="166937"><![CDATA[School of Physics]]></keyword>          <keyword tid="170451"><![CDATA[terrestrial animal]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="553741">  <title><![CDATA[IRIM&#039;s Henrik Christensen Steps Down: Accepts Lead Position at UC San Diego&#039;s Contextual Robotics Institute]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1468846437</created>  <gmt_created>2016-07-18 12:53:57</gmt_created>  <changed>1475893696</changed>  <gmt_changed>2016-10-08 02:28:16</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  
<publication><![CDATA[UC San Diego Jacobs School of Engineering]]></publication>  <article_dateline>2016-07-07T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-07-07T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-07-07T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://jacobsschool.ucsd.edu/news/news_releases/release.sfe?id=1977]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="78861"><![CDATA[Henrik I. Christensen]]></keyword>          <keyword tid="98411"><![CDATA[Institute for Robotics and Intelligent Machines]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="573351">  <title><![CDATA[How Tails May Have Helped Ancient Animals Make the Transition from Water to Land]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1473176352</created>  <gmt_created>2016-09-06 15:39:12</gmt_created>  <changed>1475893702</changed>  <gmt_changed>2016-10-08 02:28:22</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Los Angeles Times]]></publication>  <article_dateline>2016-07-08T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-07-08T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-07-08T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.latimes.com/science/sciencenow/la-sci-sn-tails-water-land-20160708-snap-story.html]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          
<group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>      </categories>  <keywords>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="552291">  <title><![CDATA[Robot Earns its Shoes, Walks Like a Person]]></title>  <uid>27560</uid>  <body><![CDATA[<p>What do you give a robot when it takes its first steps like a human? Its first pair of shoes.</p><p>Georgia Institute of Technology researchers have created what they say is the most efficient walking humanoid yet built. While most machines these days are hunched at the waist and plod along on flat feet, <a href="https://youtu.be/1fC7b2LjVW4">Georgia Tech’s DURUS strolls like a person</a>. Its legs and chest are elongated and upright. It lands on the heel of its foot, rolls through the step and pushes off its toe. It’s even outfitted with a pair of size-13 shoes as it walks under its own power on a treadmill in the team’s <a href="http://www.bipedalrobotics.com/">AMBER Lab</a>.</p><p>“Our robot is able to take much longer, faster steps than its flat-footed counterparts because it’s replicating human locomotion,” said Aaron Ames, director of the Georgia Tech lab and a professor in the George W. Woodruff School of Mechanical Engineering and School of Electrical and Computer Engineering. “Multi-contact foot behavior also allows it to be more dynamic, pushing us closer to our goal of allowing the robot to walk outside in the real world.”</p><p>As Ames tells it, the traditional approach to creating a robotic walker is similar to an upside-down pendulum. Researchers typically use comparatively simple algorithms to move the top of the machine forward while keeping its feet flat and grounded. As it shuffles along, the waist stays at a constant height, creating the distinctive hunched look. 
This not only prevents these robots from moving with the dynamic grace present in human walking, but also prevents them from efficiently propelling themselves forward.</p><p>The Georgia Tech humanoid <a href="https://www.youtube.com/watch?v=a-R4H8-8074">walked with flat feet until about a week ago</a>, although it was powered by fundamentally different algorithms than most robots. To demonstrate the power of those methods, Ames and his team of student researchers built a pair of metal feet with arched soles. They applied their complex mathematical formulas, but watched DURUS misstep and fall for three days. The team continued to tweak the algorithms and, on the fourth day, the robot got it. The machine walked dynamically on its new feet, displaying the heel-strike and toe push-off that is a key feature of human walking. The robot is further equipped with springs between its ankles and feet, similar to elastic tendons in people, allowing for a walking gait that stores mechanical energy from a heel strike to be later reclaimed as the foot lifts off the ground.</p><p>This natural gait makes DURUS very efficient. Robot locomotion efficiency is universally measured by a “cost of transport,” or the amount of power a machine uses divided by its weight and walking speed. Ames says the best humanoids have a cost of transport of approximately 3.0. Georgia Tech’s is 1.4, all while being self-powered: it’s not tethered by a power cord from an external source.</p><p>This new level of efficiency is achieved in no small part through human-like foot behavior. DURUS has earned its new pair of shoes.</p><p>“Flat-footed robots demonstrated that walking was possible,” said Ames. “But they’re a starting point, like a propeller-powered airplane. It gets the job done, but it’s not a jet engine. 
We want to build something better, something that can walk up and down stairs or run across a field.”</p><p>He adds that these advances have the potential to usher in the next generation of robotic assistive devices like prostheses and exoskeletons that can enable the mobility-impaired to walk with ease.</p><p>The student team was led by graduate student Jake Reher. The shoes were created by another graduate student, Eric Ambrose. DURUS was designed in collaboration with the robotics division of SRI International.</p><p><em>The project is supported by the National Science Foundation under Grant No. 1526519. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.</em><br /><br />More photos: https://www.dropbox.com/sh/bsu2neu9qnwnuz1/AACyP8EMiZrwRKt0yzasez9ma?dl=0</p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1468329414</created>  <gmt_created>2016-07-12 13:16:54</gmt_created>  <changed>1475896924</changed>  <gmt_changed>2016-10-08 03:22:04</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Self-powered robot walks like a human and wears shoes.]]></teaser>  <type>news</type>  <sentence><![CDATA[Self-powered robot walks like a human and wears shoes.]]></sentence>  <summary><![CDATA[<p>Researchers have created what they say is the most efficient walking humanoid ever. While most machines these days are hunched at the waist and plod along on flat feet, DURUS strolls like a person. It lands on the heel of its foot, rolls through the step and pushes off its toe. 
It’s even outfitted with a pair of size-13 shoes.</p>]]></summary>  <dateline>2016-07-12T00:00:00-04:00</dateline>  <iso_dateline>2016-07-12T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-07-12 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />National Media Relations<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-660-2926</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>552251</item>          <item>552281</item>          <item>552261</item>          <item>552271</item>      </media>  <hg_media>          <item>          <nid>552251</nid>          <type>image</type>          <title><![CDATA[Stepping with shoe]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[stepping_with_shoe.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/stepping_with_shoe.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/stepping_with_shoe.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/stepping_with_shoe.jpg?itok=srKRZTBf]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Stepping with shoe]]></image_alt>                    <created>1468346400</created>          <gmt_created>2016-07-12 18:00:00</gmt_created>          <changed>1475895348</changed>          <gmt_changed>2016-10-08 02:55:48</gmt_changed>      </item>          <item>          <nid>552281</nid>          <type>image</type>          <title><![CDATA[DURUS feet]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[feet.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/feet.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/feet.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/feet.jpg?itok=u5lHNss-]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[DURUS feet]]></image_alt>                    <created>1468346400</created>          <gmt_created>2016-07-12 18:00:00</gmt_created>          <changed>1475895348</changed>          <gmt_changed>2016-10-08 02:55:48</gmt_changed>      </item>          <item>          <nid>552261</nid>          <type>image</type>          <title><![CDATA[DURUS]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[durus_with_shoes.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/durus_with_shoes.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/durus_with_shoes.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/durus_with_shoes.jpg?itok=93bWx1-T]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[DURUS]]></image_alt>                    <created>1468346400</created>          <gmt_created>2016-07-12 18:00:00</gmt_created>          <changed>1475895348</changed>          <gmt_changed>2016-10-08 02:55:48</gmt_changed>      </item>          <item>          <nid>552271</nid>          <type>image</type>          <title><![CDATA[Shoe and metal foot]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[shoe_and_foot_1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/shoe_and_foot_1.jpg]]></image_path>            
<image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/shoe_and_foot_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/shoe_and_foot_1.jpg?itok=ZBLMESyp]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Shoe and metal foot]]></image_alt>                    <created>1468346400</created>          <gmt_created>2016-07-12 18:00:00</gmt_created>          <changed>1475895348</changed>          <gmt_changed>2016-10-08 02:55:48</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.bipedalrobotics.com/]]></url>        <title><![CDATA[AMBER Lab]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1183"><![CDATA[Home]]></group>      </groups>  <categories>      </categories>  <news_terms>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="555381">  <title><![CDATA[IRIM Launches Vintage Robotics Research Video Series]]></title>  <uid>27255</uid>  <body><![CDATA[<p class="p1">The <a href="http://robotics.gatech.edu/" target="_blank">Institute for Robotics and Intelligent Machines (IRIM)</a> at Georgia Tech has launched a new series of robotics research videos on YouTube.&nbsp;Donated by <a href="http://www.cc.gatech.edu/people/clinton-w-kelly-iii" target="_blank">Dr. Clinton W. 
Kelly III</a>, a member of the College of Computing’s advisory board and a longtime benefactor of Georgia Tech,&nbsp;the diverse collection of videos&nbsp;covers an extended period of time and offers a behind-the-scenes look at different aspects of the earlier days of unmanned vehicle and other robotics research conducted across multiple institutions, companies, and funding agencies.</p><p class="p1">Kelly’s distinguished career includes serving as&nbsp;director of the U.S. Strategic Computing Program and executive director of the Information Science and Technology Office&nbsp;at&nbsp;<a href="http://www.darpa.mil/" target="_blank">DARPA</a>&nbsp;(the Defense Advanced Research Projects Agency). Subsequently, he served as the CTO for&nbsp;<a href="http://www.saic.com/" target="_blank">SAIC</a>&nbsp;(Science Applications International Corporation). He has also held a number of advisory positions with universities, including Duke, CMU, and Georgia Tech.</p><p class="p1">The <a href="https://youtu.be/mG_ZKXo6Rlg" target="_blank">first video in the series</a> features the Leg Lab, established at CMU by Marc Raibert and later moved to MIT. The Leg Lab developed robots that ran and maneuvered like animals and formed the basis for the company <a href="http://www.bostondynamics.com/bd_about.html" target="_blank">Boston Dynamics</a>, which was founded by Raibert in 1992. The video shows some of the early examples of legged robots, from monopods to quadrupeds.</p><p class="p1">Through a collaborative effort, led by Henrik I. 
Christensen, IRIM’s founding&nbsp;executive&nbsp;director, and Josie Giles, marketing and communication manager, IRIM will regularly publish additional video clips from the collection on YouTube to offer the general public a unique insight into the foundational work on robotics technologies.&nbsp;</p><p class="p1"><strong>Background</strong></p><p class="p1">For additional information about the relationship between Kelly and IRIM, visit&nbsp;<a href="http://robotics.gatech.edu/outreach/kellylecture" target="_blank">http://robotics.gatech.edu/outreach/kellylecture</a>.&nbsp;</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1469472021</created>  <gmt_created>2016-07-25 18:40:21</gmt_created>  <changed>1475896928</changed>  <gmt_changed>2016-10-08 03:22:08</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[The videos cover an extended period of time and offer a behind-the-scenes look at different aspects of the earlier days of unmanned vehicle and other robotics research conducted across multiple institutions, companies, and funding agencies.]]></teaser>  <type>news</type>  <sentence><![CDATA[The videos cover an extended period of time and offer a behind-the-scenes look at different aspects of the earlier days of unmanned vehicle and other robotics research conducted across multiple institutions, companies, and funding agencies.]]></sentence>  <summary><![CDATA[<p>The videos cover an extended period of time and offer a behind-the-scenes look at different aspects of the earlier days of unmanned vehicle and other robotics&nbsp;research&nbsp;conducted across multiple institutions, companies, and funding agencies.</p>]]></summary>  <dateline>2016-07-25T00:00:00-04:00</dateline>  <iso_dateline>2016-07-25T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-07-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  
<contact><![CDATA[<p>Josie Giles<br />IRIM Marketing Communications Mgr.<br /><a href="mailto:josie@gatech.edu">josie@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>      </media>  <hg_media>      </hg_media>  <related>          <link>        <url><![CDATA[https://youtu.be/mG_ZKXo6Rlg]]></url>        <title><![CDATA[Leg Lab Video]]></title>      </link>          <link>        <url><![CDATA[http://robotics.gatech.edu/]]></url>        <title><![CDATA[Center for Robotics & Intelligent Machines]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/people/clinton-w-kelly-iii]]></url>        <title><![CDATA[Clinton W. Kelly, III]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="115691"><![CDATA[Clinton Kelly]]></keyword>          <keyword tid="81491"><![CDATA[Institute for Robotics and Intelligent Machines (IRIM)]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="556631">  <title><![CDATA[Egerstedt Named New Executive Dir. 
of the Institute for Robotics and Intelligent Machines]]></title>  <uid>27255</uid>  <body><![CDATA[<p class="intro-text">The Georgia Institute of Technology today announced the appointment of Magnus Egerstedt as the new executive director of the Institute for Robotics and Intelligent Machines (IRIM).</p><p>Egerstedt is the Schlumberger Professor in the College of Engineering’s <a href="https://www.ece.gatech.edu/">School of Electrical and Computer Engineering</a>, where he serves as associate chair for Research and External Affairs. He has been a member of the faculty since 2001, and leads the Georgia Robotics and Intelligent Systems Laboratory (GRITS Lab), which focuses its research on control and coordination of complex networks such as multi-robot systems, mobile-sensor networks, and cyber-physical systems.</p><p>“We are excited to welcome Magnus into the role of executive director for IRIM.&nbsp;His enthusiasm and vision, coupled with his leadership ability and passion to support faculty, make him the obvious choice to serve in the role,” said Stephen E. Cross, executive vice president for Research. “His work over the years has served to enhance Tech’s reputation as a thought leader in the robotics and intelligent machines space.&nbsp;I look forward to working with him.”</p><p>“I am honored to have the opportunity to serve as the next executive director of IRIM,” Egerstedt said. “IRIM has a bright future as a research leader and industry partner, and I am looking forward to charting the course for the next phase of robotics at Georgia Tech.”</p><p>Egerstedt emerged as the top candidate from an internal search to replace the founding executive director of IRIM, <a href="http://robotics.gatech.edu/team/faculty/christensen">Henrik I. 
Christensen</a>, who is moving to the University of California, San Diego to serve as director of the newly formed Contextual Robotics Institute&nbsp;and professor in the Department of Computer Science and Engineering at the Jacobs School of Engineering.</p><p>“I am very pleased to learn the next IRIM executive director will be Magnus,” Christensen said. “I have known him since we were both in Stockholm. He is an excellent researcher and a highly respected member of the research community. He has all the right skills to secure the continued growth of the robotics efforts at Georgia Tech.”</p><p>Egerstedt will begin his duties at IRIM on Aug. 1, and will continue to lead the GRITS Lab in the School of Electrical and Computer Engineering.</p><p>He holds a bachelor’s degree in philosophy from Stockholm University and master’s and doctoral degrees in engineering physics and applied mathematics, respectively, from the Royal Institute of Technology, Stockholm. After completing his Ph.D., Egerstedt was a postdoctoral scholar at Harvard University.</p><p>“Henrik helped build a world-class robotics program at Tech. Magnus has both the leadership skills and outstanding research in this area to continue IRIM’s growth and impact on the robotics field,” said Gary S. 
May, Southern Company Chair and dean of the College of Engineering.</p><p>Launched as an Interdisciplinary Research Institute in fall 2013, the&nbsp;Institute for Robotics and Intelligent Machines was built upon foundational work developed over the previous seven years in the former Robotics &amp; Intelligent Machines Center at Georgia Tech.</p><p>IRIM brings together&nbsp;robotics researchers from across campus—spanning colleges, departments,&nbsp;and individual labs—to create new collaborative opportunities for faculty, strengthen partnerships with industry and government,&nbsp;and maximize the societal impact of the transformative robotics research conducted at Georgia Tech.</p><p>Learn more at <a href="http://www.robotics.gatech.edu">www.robotics.gatech.edu</a>.</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1469792175</created>  <gmt_created>2016-07-29 11:36:15</gmt_created>  <changed>1475896932</changed>  <gmt_changed>2016-10-08 03:22:12</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Egerstedt will replace the founding executive director of IRIM, Henrik I. Christensen, who is moving to the University of California, San Diego.]]></teaser>  <type>news</type>  <sentence><![CDATA[Egerstedt will replace the founding executive director of IRIM, Henrik I. 
Christensen, who is moving to the University of California, San Diego.]]></sentence>  <summary><![CDATA[<p class="intro-text">The Georgia Institute of Technology today announced the appointment of Magnus Egerstedt as the new executive director of the Institute for Robotics and Intelligent Machines (IRIM).&nbsp;</p>]]></summary>  <dateline>2016-07-29T00:00:00-04:00</dateline>  <iso_dateline>2016-07-29T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-07-29 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Josie Giles<br />IRIM Marketing Communications Mgr.<br /><a href="mailto:josie@gatech.edu">josie@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>556671</item>          <item>224041</item>          <item>348951</item>      </media>  <hg_media>          <item>          <nid>556671</nid>          <type>image</type>          <title><![CDATA[Magnus Egerstedt Again]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[magnus_with_robot_2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/magnus_with_robot_2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/magnus_with_robot_2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/magnus_with_robot_2.jpg?itok=p7YfFQ59]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Magnus Egerstedt Again]]></image_alt>                    <created>1469808608</created>          <gmt_created>2016-07-29 16:10:08</gmt_created>          <changed>1475895355</changed>          <gmt_changed>2016-10-08 02:55:55</gmt_changed>      </item>          <item>          <nid>224041</nid>          
<type>image</type>          <title><![CDATA[Magnus Egerstedt]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[egerstedtheadshot.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/egerstedtheadshot_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/egerstedtheadshot_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/egerstedtheadshot_0.jpg?itok=tR3LDBqY]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Magnus Egerstedt]]></image_alt>                    <created>1449243551</created>          <gmt_created>2015-12-04 15:39:11</gmt_created>          <changed>1475894896</changed>          <gmt_changed>2016-10-08 02:48:16</gmt_changed>      </item>          <item>          <nid>348951</nid>          <type>image</type>          <title><![CDATA[Swarm robotics - Magnus Egerstedt]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[swarm-robots-cover.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/swarm-robots-cover_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/swarm-robots-cover_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/swarm-robots-cover_0.jpg?itok=dxI75V1J]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Swarm robotics - Magnus Egerstedt]]></image_alt>                    <created>1449245682</created>          <gmt_created>2015-12-04 16:14:42</gmt_created>          <changed>1475895073</changed>          <gmt_changed>2016-10-08 02:51:13</gmt_changed>      </item>      </hg_media>  <related>          <link>        
<url><![CDATA[http://robotics.gatech.edu/]]></url>        <title><![CDATA[Center for Robotics & Intelligent Machines]]></title>      </link>          <link>        <url><![CDATA[https://www.ece.gatech.edu/faculty-staff-directory/magnus-egerstedt-0]]></url>        <title><![CDATA[Magnus Egerstedt]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="132"><![CDATA[Institute Leadership]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="132"><![CDATA[Institute Leadership]]></term>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="78861"><![CDATA[Henrik I. Christensen]]></keyword>          <keyword tid="81491"><![CDATA[Institute for Robotics and Intelligent Machines (IRIM)]]></keyword>          <keyword tid="11528"><![CDATA[Magnus Egerstedt]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="573361">  <title><![CDATA[Ueda Contributes to Development of Vibrating Device for Surgeons]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1473178890</created>  <gmt_created>2016-09-06 16:21:30</gmt_created>  <changed>1475893702</changed>  
<gmt_changed>2016-10-08 02:28:22</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[ScienceDaily]]></publication>  <article_dateline>2016-08-01T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-08-01T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-08-01T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[https://www.sciencedaily.com/releases/2016/08/160801092956.htm]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="81491"><![CDATA[Institute for Robotics and Intelligent Machines (IRIM)]]></keyword>          <keyword tid="13887"><![CDATA[Jun Ueda]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="581810">  <title><![CDATA[Georgia Tech Bids Farewell to Henrik Christensen]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1475073937</created>  <gmt_created>2016-09-28 14:45:37</gmt_created>  <changed>1475073937</changed>  <gmt_changed>2016-09-28 14:45:37</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Georgia Tech Bids Farewell to Henrik Christensen]]></publication>  <article_dateline>2016-08-25T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-08-25T00:00:00-04:00</iso_article_dateline>  
<article_url><![CDATA[http://www.cc.gatech.edu/georgia-tech-bids-farewell-henrik-christensen]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="11890"><![CDATA[henrik christensen]]></keyword>          <keyword tid="78811"><![CDATA[Institute for Robotics and Intelligent Machines]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="571441">  <title><![CDATA[Chernova Earns NASA Early Career Faculty Award]]></title>  <uid>27255</uid>  <body><![CDATA[<p>The Institute for Robotics and Intelligent Machines’ <a href="http://robotics.gatech.edu/team/faculty/chernova">Sonia Chernova</a> is one of eight university researchers nationwide selected by NASA to receive the 2016&nbsp;<a href="http://www.nasa.gov/press/2014/july/nasa-announces-early-career-faculty-space-tech-research-grants/">Early Career Faculty Award</a> (ECF) in the Space Technology Research Grants program.</p><p>The grants, worth up to $200,000 per year over three years, are awarded to outstanding early career faculty focused on space technology development addressing critical needs in the U.S. 
space program.</p><p>Chernova,&nbsp;an assistant professor of Interactive Computing in the College of Computing at Georgia Tech, received the award for her proposal to develop interactive robotic systems that enable co-located astronauts and Earth-based NASA operators to refine and adapt the behavior of a robot during deployment in a way that maximizes task efficiency and human-robot team operation.</p><p>As NASA prepares future crewed low-Earth orbit, lunar, and Mars-based deployments, the award provides Chernova with funding to develop techniques for execution of repetitive, routine, and potentially hazardous tasks by robots.&nbsp;</p><p>“I hope that my research will enable a robot to detect when unexpected operating conditions are encountered, request help from co-located crew members or remote ground control operators, as appropriate, and refine its operating procedures to improve future task execution and the long-term autonomy of the system,” Chernova said.</p><p>Current practices in deploying robotic space systems are limited to manual teleoperation of robots by crews in co-located settings, and the use of carefully handcrafted structured control sequences from ground control. 
Both approaches are costly in terms of crew time and effort and are not scalable for long-term, co-robot deployments.&nbsp;By enabling the robots to leverage the inputs obtained from human operators, Chernova’s work aims to facilitate the automation of many routine tasks, thereby reducing the&nbsp;load on human crew members and supporting safer, more affordable, and more effective human-robot space exploration and discovery.</p><p>NASA’s Early Career Faculty Award is administered by the agency’s Space Technology Research Grants Program, which seeks to accelerate the development of emerging technologies from academia that serve the needs of NASA, other government agencies, and space-related industries.</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1472660431</created>  <gmt_created>2016-08-31 16:20:31</gmt_created>  <changed>1653584976</changed>  <gmt_changed>2022-05-26 17:09:36</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Chernova’s grant, worth up to $200,000 per year over three years, is awarded to outstanding early career faculty focused on space technology development addressing critical needs in the U.S. space program.]]></teaser>  <type>news</type>  <sentence><![CDATA[Chernova’s grant, worth up to $200,000 per year over three years, is awarded to outstanding early career faculty focused on space technology development addressing critical needs in the U.S. 
space program.]]></sentence>  <summary><![CDATA[<p>The Institute for Robotics and Intelligent Machines’ <a href="http://robotics.gatech.edu/team/faculty/chernova">Sonia Chernova</a> is one of eight university researchers nationwide selected by NASA to receive the 2016&nbsp;<a href="http://www.nasa.gov/press/2014/july/nasa-announces-early-career-faculty-space-tech-research-grants/">Early Career Faculty Award</a> (ECF) in the Space Technology Research Grants program.</p>]]></summary>  <dateline>2016-08-31T00:00:00-04:00</dateline>  <iso_dateline>2016-08-31T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-08-31 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[josie@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Josie Giles - IRIM Marketing Communications Mgr.</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>571471</item>      </media>  <hg_media>          <item>          <nid>571471</nid>          <type>image</type>          <title><![CDATA[Sonia Chernova]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[sonia_chernova.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/sonia_chernova_1.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/sonia_chernova_1.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/sonia_chernova_1.jpg?itok=kzB30FnA]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sonia Chernova]]></image_alt>                    <created>1472676847</created>          <gmt_created>2016-08-31 20:54:07</gmt_created>          <changed>1475895379</changed>          <gmt_changed>2016-10-08 02:56:19</gmt_changed>      </item>      </hg_media>  <related>   
       <link>        <url><![CDATA[https://www.nasa.gov/directorates/spacetech/strg/ecf2016.html]]></url>        <title><![CDATA[NASA Press Release]]></title>      </link>          <link>        <url><![CDATA[http://robotics.gatech.edu/team/faculty/chernova]]></url>        <title><![CDATA[Sonia Chernova]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>      </groups>  <categories>          <category tid="8862"><![CDATA[Student Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="8862"><![CDATA[Student Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="81491"><![CDATA[Institute for Robotics and Intelligent Machines (IRIM)]]></keyword>          <keyword tid="172312"><![CDATA[NASA Early Career Faculty Award]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="169047"><![CDATA[Sonia Chernova]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="573271">  <title><![CDATA[Team of Robots Learns to Work Together, Without Colliding]]></title>  <uid>27560</uid>  <body><![CDATA[<p>When roboticists create behaviors for teams of robots, they first build algorithms that focus on the intended task. Then they wrap safety behaviors around those primary algorithms to keep the machines from running into each other. Each robot is essentially given an invisible bubble that other robots must stay away from. As long as nothing touches the bubble, the robots move around without any issues. 
But that’s where the problems begin.</p><p>“When you have too many robots together, they get so focused on not colliding with each other that they eventually just stop moving,” said Georgia Tech roboticist <a href="http://gritslab.gatech.edu/home/">Magnus Egerstedt</a>, director of Georgia Tech’s <a href="http://www.robotics.gatech.edu/">Institute for Robotics and Intelligent Machines</a>. “Their safety behaviors take over and the robots freeze. It’s impossible for them to go anywhere because any movement would cause their bubbles to pop.”</p><p>Egerstedt has created a solution. His team’s new algorithms allow any number of robots to move within inches of each other, without colliding, to complete their task — swapping locations on his lab floor. They are the first researchers to create such minimally invasive safety algorithms.</p><p>In technical speak, the bots are using a set of safe states and barrier certificates to ensure each stays in its own safe set throughout the entire maneuver.</p><p>“In everyday speak, we’ve shrunk the size of each robot’s bubble to make it as small as possible,” said Egerstedt, who is also the Julian T. Hightower Chair in the School of Electrical and Computer Engineering. “Our system allows the robots to make the minimum amount of changes to their original behaviors in order to accomplish the task and not smack into each other.”</p><p><a href="https://www.youtube.com/watch?v=2uujKTU0TYE">In a demo with four robots</a>, the lab’s machines approach from four different areas, meet in the middle, circle counterclockwise within inches of each other, then fan out in opposite directions. In another demonstration, eight robots perform the same task, this time circling clockwise before dispersing. Instead of keeping their distance and taking the long way around their neighbors, the robots move independently, wherever they wish.</p><p>Avoiding collisions isn’t anything new in robotics. 
And Google’s self-driving cars are almost crash-free.</p><p>“But we haven’t seen thousands of autonomous cars on the road together yet,” Egerstedt said.</p><p>“Robots are very conservative — they want to make sure they’re safe. You couldn’t pack the interstate with self-driving cars with today’s technology.”</p><p>Egerstedt also said something similar to these algorithms could be used for the next generation of air traffic control. Instead of people directing the flow, planes will be given the authority in airspaces.</p><p>“They’ll have to be safer if we plan to pack the airspace more densely.”</p><p>The paper about the project, “Multi-objective Compositions for Collision-Free Connectivity Maintenance in Teams of Mobile Robots,” has been accepted at this year’s <a href="http://cdc2016.ieeecss.org/">IEEE Conference on Decision and Control</a> in Las Vegas.</p><p><em>The work is supported in part by the National Science Foundation (grant numbers 1544332 and 1239055). </em><em>Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the sponsors. </em></p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1473169039</created>  <gmt_created>2016-09-06 13:37:19</gmt_created>  <changed>1653584976</changed>  <gmt_changed>2022-05-26 17:09:36</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers shrink size of robotic "bubbles" to avoid collisions of teams of machines.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers shrink size of robotic "bubbles" to avoid collisions of teams of machines.]]></sentence>  <summary><![CDATA[<p>New algorithms allow any number of robots to move within inches of each other, without colliding, to complete their task — swapping locations on his lab floor. 
They are the first researchers to create such minimally invasive safety algorithms.</p>]]></summary>  <dateline>2016-09-06T00:00:00-04:00</dateline>  <iso_dateline>2016-09-06T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-09-06 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer - National Media Relations</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>556671</item>      </media>  <hg_media>          <item>          <nid>556671</nid>          <type>image</type>          <title><![CDATA[Magnus Egerstedt Again]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[magnus_with_robot_2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/magnus_with_robot_2.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/magnus_with_robot_2.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/magnus_with_robot_2.jpg?itok=p7YfFQ59]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Magnus Egerstedt Again]]></image_alt>                    <created>1469808608</created>          <gmt_created>2016-07-29 16:10:08</gmt_created>          <changed>1475895355</changed>          <gmt_changed>2016-10-08 02:55:55</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://gritslab.gatech.edu/home/]]></url>        <title><![CDATA[GRITS Lab]]></title>      </link>          <link>        <url><![CDATA[https://www.ece.gatech.edu/]]></url>        <title><![CDATA[School of Electrical and Computer Engineering]]></title>      </link>      </related>  <files>      </files>  <groups>          
<group id="1183"><![CDATA[Home]]></group>      </groups>  <categories>      </categories>  <news_terms>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="581855">  <title><![CDATA[IRIM Announces Multiple Faculty Positions Available at Georgia Tech]]></title>  <uid>27255</uid>  <body><![CDATA[<p>Robotics &mdash; as interpreted broadly &mdash; is of strategic importance to the Georgia Institute of Technology, and we are continuing to grow our faculty in this dynamic area.</p><p>This year we have a number of faculty openings across campus, including in the School of Interactive Computing (<a href="http://www.ic.gatech.edu/about/faculty-hiring" target="_blank">http://www.ic.gatech.edu/about/faculty-hiring</a>); the School of Electrical and Computer Engineering (<a href="https://www.ece.gatech.edu/faculty-openings" target="_blank">https://www.ece.gatech.edu/faculty-openings</a>); the George W. Woodruff School of Mechanical Engineering (<a href="http://www.me.gatech.edu/about/employment" target="_blank">http://www.me.gatech.edu/about/employment</a>); the Wallace H. 
Coulter Department of Biomedical Engineering (job posting: <a href="https://academicjobsonline.org/ajo/jobs/7753" target="_blank">https://academicjobsonline.org/ajo/jobs/7753</a>); and the Daniel Guggenheim School of Aerospace Engineering (<a href="http://www.ae.gatech.edu/job-openings-aerospace-engineering" target="_blank">http://www.ae.gatech.edu/job-openings-aerospace-engineering</a>).</p><p>Interested candidates are encouraged to consult the individual job postings directly for more information.</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1475095723</created>  <gmt_created>2016-09-28 20:48:43</gmt_created>  <changed>1477253677</changed>  <gmt_changed>2016-10-23 20:14:37</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[IRIM announces multiple faculty positions available at Georgia Tech]]></teaser>  <type>news</type>  <sentence><![CDATA[IRIM announces multiple faculty positions available at Georgia Tech]]></sentence>  <summary><![CDATA[<p>Robotics &mdash; as interpreted broadly &mdash; is of strategic importance to the Georgia Institute of Technology, and we are continuing to grow our faculty in this dynamic area.</p><p>This year we have a number of faculty openings across campus, including in multiple units in the College of Engineering and the College of Computing&#39;s School of Interactive Computing.</p>]]></summary>  <dateline>2016-09-28T00:00:00-04:00</dateline>  <iso_dateline>2016-09-28T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-09-28 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[josie@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Please refer to the announcement for the appropriate link to follow for additional information.</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>581856</item>      </media>  <hg_media>          <item>          
<nid>581856</nid>          <type>image</type>          <title><![CDATA[IRIM Faculty Positions]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[hiring-FB2-01.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/hiring-FB2-01.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/hiring-FB2-01.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/hiring-FB2-01.png?itok=mgJlhPJI]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Robotics at Georgia Tech Is Hiring]]></image_alt>                    <created>1475097607</created>          <gmt_created>2016-09-28 21:20:07</gmt_created>          <changed>1506457380</changed>          <gmt_changed>2017-09-26 20:23:00</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="134031"><![CDATA[faculty hiring]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="78271"><![CDATA[IRIM]]></keyword>          <keyword tid="78811"><![CDATA[Institute for Robotics and Intelligent Machines]]></keyword>          <keyword tid="81491"><![CDATA[Institute for Robotics and Intelligent Machines (IRIM)]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71871"><![CDATA[Campus and Community]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node 
id="582027">  <title><![CDATA[Larry Sweet Explores the Future of Collaborative Robots]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1475514382</created>  <gmt_created>2016-10-03 17:06:22</gmt_created>  <changed>1475590892</changed>  <gmt_changed>2016-10-04 14:21:32</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[5G+ technologies]]></publication>  <article_dateline>2016-10-03T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-10-03T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-10-03T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.mljournal-digital.com/meleadershipjournal/october_2016?pg=32#pg32]]></article_url>  <media>          <item><![CDATA[472601]]></item>      </media>  <hg_media>          <item>          <nid>472601</nid>          <type>image</type>          <title><![CDATA[Larry Sweet]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[l_sweet_photo.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/l_sweet_photo_0.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/l_sweet_photo_0.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/l_sweet_photo_0.jpg?itok=wF42zG9A]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Larry Sweet]]></image_alt>                              <created>1449257190</created>          <gmt_created>2015-12-04 19:26:30</gmt_created>          <changed>1475895223</changed>          <gmt_changed>2016-10-08 02:53:43</gmt_changed>      </item>      </hg_media>  <files>      </files>  <groups>          <group 
id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="582658">  <title><![CDATA[Nancey Green Leigh Receives Grant to Study the U.S. Robotics Industry and Economic Impacts]]></title>  <uid>32550</uid>  <body><![CDATA[<p>College of Design Associate Dean for Research Nancey Green Leigh is the principal investigator of a new $784,887 grant from the National Science Foundation National Robotics Initiative to study the U.S. robotics industry and the economic impacts of robotics technology.<br /><br />Leigh, also a professor in the School of City and Regional Planning, is co-PI with Henrik Christensen, former director of Georgia Tech&rsquo;s Institute for Robotics and Intelligent Machines. He is now director of the Contextual Robotics Institute at the University of California, San Diego.<br /><br />The two-year grant will enable researchers to generate data and conduct analyses about the U.S. robotics industry and the economic impacts of robotics technology. The work will advance the understanding of the relationship between 21st-century technology and work, meeting a need to assess robots as more than just advanced manufacturing technology.<br /><br />According to Leigh, much of the existing discussion on robots and industry has been speculative. The data that does exist ends at 2007.<br /><br />The project will have several components, but the researchers will start by surveying the manufacturing industry about its robot use and employment patterns, followed by a survey of systems integrators. 
They also will perform case studies with representatives from all stages of the robotic supply chain.<br /><br />In the end, this research is expected to inform policymakers, workers, and corporate leaders as they make decisions in anticipation of the use of robots throughout the economy. Employment structures and the changing nature of work, among other factors, will some day be impacted, the grant proposal states.</p>]]></body>  <author>Malrey Head</author>  <status>1</status>  <created>1476722621</created>  <gmt_created>2016-10-17 16:43:41</gmt_created>  <changed>1477072330</changed>  <gmt_changed>2016-10-21 17:52:10</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Nancey Green Leigh is the principal investigator of a new $784,887 grant from the NSF National Robotics Initiative to study the U.S. robotics industry and the economic impacts of robotics technology.]]></teaser>  <type>news</type>  <sentence><![CDATA[Nancey Green Leigh is the principal investigator of a new $784,887 grant from the NSF National Robotics Initiative to study the U.S. robotics industry and the economic impacts of robotics technology.]]></sentence>  <summary><![CDATA[Nancey Green Leigh is the principal investigator of a new $784,887 grant from the NSF National Robotics Initiative to study the U.S. 
robotics industry and the economic impacts of robotics technology.]]></summary>  <dateline>2016-10-17T00:00:00-04:00</dateline>  <iso_dateline>2016-10-17T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-10-17 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Malrey Head<br />malrey.head@design.gatech.edu</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>59790</item>      </media>  <hg_media>          <item>          <nid>59790</nid>          <type>image</type>          <title><![CDATA[Nancey Green Leigh]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[Green_Leigh_Preferred.JPG]]></image_name>            <image_path><![CDATA[/sites/default/files/images/Green_Leigh_Preferred_0.JPG]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/Green_Leigh_Preferred_0.JPG]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/Green_Leigh_Preferred_0.JPG?itok=KB-h9HI7]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Nancey Green Leigh]]></image_alt>                    <created>1449176227</created>          <gmt_created>2015-12-03 20:57:07</gmt_created>          <changed>1475894398</changed>          <gmt_changed>2016-10-08 02:39:58</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1221"><![CDATA[College of Design]]></group>          <group id="1224"><![CDATA[School of City &amp; Regional Planning]]></group>          <group id="60380"><![CDATA[CSPAV - Center for Spatial Planning Analytics and Visualization]]></group>          <group id="60381"><![CDATA[CMT - Center for Music Technology]]></group>    
      <group id="1260"><![CDATA[CQGRD - Center for Quality Growth and Regional Development]]></group>          <group id="48996"><![CDATA[School of Architecture]]></group>          <group id="468131"><![CDATA[SimTigrate]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="623"><![CDATA[Technology]]></keyword>          <keyword tid="780"><![CDATA[employment]]></keyword>          <keyword tid="215"><![CDATA[manufacturing]]></keyword>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="582945">  <title><![CDATA[Do androids dream of eating sheep?]]></title>  <uid>27241</uid>  <body><![CDATA[<p>Do cyborgs really need to eat? Granted, there&rsquo;s no practical way to answer this question at the moment,&nbsp;<a href="http://www.hansonrobotics.com/robot-gallery/">but with robotics technology improving rapidly each year</a>, the question of fuel sources is a fascinating one. Is organic fuel for cyborgs a necessity? Couldn&rsquo;t they just power up throughout the day by plugging in a USB cable or a charger? Ayanna Howard, the&nbsp;Linda J. and Mark C. Smith Chair Professor in the Georgia Tech School of Electrical and Computer Engineering, was one of the scientists&nbsp;game enough&nbsp;to lend her knowledge and experience to speculating on this topic.</p><p><a href="http://www.avclub.com/article/do-androids-dream-eating-sheep-243687">Read the article at the A.V. 
Club website</a>.</p>]]></body>  <author>Jackie Nemeth</author>  <status>1</status>  <created>1477076503</created>  <gmt_created>2016-10-21 19:01:43</gmt_created>  <changed>1477076544</changed>  <gmt_changed>2016-10-21 19:02:24</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Do cyborgs really need to eat? Granted, there’s no practical way to answer this question at the moment, but with robotics technology improving rapidly each year, the question of fuel sources is a fascinating one. Is organic fuel for cyborgs a necessity?]]></teaser>  <type>news</type>  <sentence><![CDATA[Do cyborgs really need to eat? Granted, there’s no practical way to answer this question at the moment, but with robotics technology improving rapidly each year, the question of fuel sources is a fascinating one. Is organic fuel for cyborgs a necessity?]]></sentence>  <summary><![CDATA[<p>Do cyborgs really need to eat? Granted, there&rsquo;s no practical way to answer this question at the moment,&nbsp;<a href="http://www.hansonrobotics.com/robot-gallery/">but with robotics technology improving rapidly each year</a>, the question of fuel sources is a fascinating one. Is organic fuel for cyborgs a necessity? Couldn&rsquo;t they just power up throughout the day by plugging in a USB or a charger? 
ECE Professor Ayanna Howard was one of the scientists&nbsp;game enough&nbsp;to lend their knowledge and experience to speculate on this topic.</p>]]></summary>  <dateline>2016-10-21T00:00:00-04:00</dateline>  <iso_dateline>2016-10-21T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-10-21 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[The very real question of organic fuel and advancing robotics technology]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>      </media>  <hg_media>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1255"><![CDATA[School of Electrical and Computer Engineering]]></group>      </groups>  <categories>      </categories>  <news_terms>      </news_terms>  <keywords>          <keyword tid="825"><![CDATA[Ayanna Howard]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="172551"><![CDATA[cyborg]]></keyword>          <keyword tid="166855"><![CDATA[School of Electrical and Computer Engineering]]></keyword>          <keyword tid="109"><![CDATA[Georgia Tech]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="583105">  <title><![CDATA[Robotic Cleaning Technique Could Automate Neuroscience Research]]></title>  <uid>27303</uid>  <body><![CDATA[<p>For scientists listening in on the faint whispers of brain neurons, a first-ever robotic technique for cleaning the tiny devices that record the signals could facilitate a new level of automation in neuroscience research. 
That could accelerate the gathering of information used to map the functions of brain cells and ultimately provide a better understanding of what&rsquo;s going on between our ears.</p><p>The technique would be used in a recording method known as patch-clamping, in which a tiny liquid-filled glass pipette is connected to individual neurons. Since patch-clamping was invented three decades ago, the technique has required changing pipettes between recordings &ndash; a manual process that slows research. Now, a robotic cleaning technique developed by researchers at the Georgia Institute of Technology allows the pipettes to be reused for as many as 11 recordings &ndash; and potentially more &ndash; making the recording process more automated.</p><p>&ldquo;This is a step toward revolutionizing the robotic techniques in neuroscience,&rdquo; said <a href="http://www.me.gatech.edu/faculty/forest">Craig Forest</a>, associate professor in Georgia Tech&rsquo;s <a href="http://www.me.gatech.edu">George W. Woodruff School of Mechanical Engineering</a>. &ldquo;We want to be able to put samples into our machine and walk away while it records 50 or even 100 neurons. This could enable for neuroscience the kind of research automation we&rsquo;ve seen in other fields such as molecular biology, dramatically expanding our ability to listen in on brain signals.&rdquo;</p><p>Supported by the National Institutes of Health and the Allen Institute for Brain Science, the research was reported October 11 in the journal <em>Scientific Reports</em>.
Based on their cleaning technique and earlier innovations that automated the process of connecting the pipettes to cells, the Georgia Tech researchers have demonstrated what&rsquo;s believed to be the first robot to perform sequential patch-clamp recording in cell culture, brain slices, and in the living brain &ndash; without a human operator.</p><p>To share what it&rsquo;s doing, the patch-clamping robot &ndash; known as &ldquo;patcherBot&rdquo; &ndash; has its own Twitter account to automatically report on every cell it records. &ldquo;This is the first social neuroscience robot,&rdquo; said Forest.</p><p>Patch-clamping is the gold standard for stimulating and recording signals from neurons and other cells. It involves touching a glass pipette with a tip just one micron in diameter to the cell membrane, creating a tight seal that provides a direct electrical connection to the insides of the cell. The work is extremely meticulous and time-consuming, though a recent robotic technology termed the Autopatcher, also out of Forest&rsquo;s lab, has automated parts of the process.</p><p>Because cellular debris could prevent the tight connection to cells, researchers have had to replace the pipettes with new ones for each recording. But while conducting patch-clamping, graduate research assistant Ilya Kolb began to question the conventional wisdom that the pipettes could not be used more than once. He knew about detergents used to clean laboratory glassware, and set to work assessing whether these agents could be used in a robotic cleaning process.</p><p>&ldquo;If you could clean the pipette automatically after each recording, you could just tell the Autopatcher to go back to cells again and again,&rdquo; Forest explained. &ldquo;You wouldn&rsquo;t even have to be in the room anymore.
You could set this up before you leave the lab for the day, and when you returned the next morning, you&rsquo;d have recorded 50 or 100 cells.&rdquo;</p><p>Kolb tested eight cleaning solutions and found one &ndash; Alconox &ndash; that successfully removed the debris. He reprogrammed the software operating the Autopatcher to add cleaning and rinsing steps between each recording. The new robot dips the pipette into a detergent solution located in a well next to the sample, creates a flow of fluid into and back out of the pipette, then moves the pipette to a rinse in a separate well. The entire cleaning process takes about a minute, which is as fast as or even faster than a trained human operator.&nbsp;</p><p>The researchers compared the quality of the recordings made by the cleaned pipettes to those made with new ones.</p><p>&ldquo;When we patch with a fresh pipette and when we patch with a pipette that has been used and cleaned 11 times, the results are basically indistinguishable,&rdquo; said Kolb. &ldquo;We do see some degradation after 14 or more attempts, but we&rsquo;re hopeful that with improvements in the technique, we could reuse pipettes as many as 50 or 100 times.&rdquo;</p><p>Working with researchers at Emory University, the team tested the technique to determine whether any remaining detergent residue could affect living cells. The testing, supplemented with mass spectroscopy studies of the pipette fluid, found no adverse effects.</p><p>Georgia Tech has filed for patent protection on the new robotic technique, which allows technicians to simply choose the cells to be recorded using a microscope view &ndash; then let the machine work.
The researchers hope the technique can be commercialized for use not only by the thousands of labs currently using patch-clamping, but also to expand automated applications more broadly for pharmaceutical testing and other research.</p><p>&ldquo;If we can put this technology into a piece of equipment and have all the smarts provided by software, it could really democratize this area of research,&rdquo; said Forest. &ldquo;That&rsquo;s where we&rsquo;re headed in building tools that will make new science possible.&rdquo;</p><p><em>Research reported in this press release was supported by NIH Computational Neuroscience Training grant (DA032466-02) and BRAIN Initiative awards (U01-MH106027-01, R01-EY023173, R44-NS083108-03). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.</em></p><p><strong>CITATION</strong>: Ilya Kolb, W.A. Stoy, E.B. Rousseau, O.A. Moody, A. Jenkins and C.R. Forest, &ldquo;Cleaning patch-clamp pipettes for immediate reuse,&rdquo; (Scientific Reports 2016).
<a href="http://dx.doi.org/10.1038/srep35001">http://dx.doi.org/10.1038/srep35001</a>.</p><p><strong>Research News<br />Georgia Institute of Technology<br />177 North Avenue<br />Atlanta, Georgia &nbsp;30332-0181</strong></p><p><strong>Media Relations Contacts</strong>: John Toon (404-894-6986) (jtoon@gatech.edu) or Ben Brumfield (404-385-1933) (ben.brumfield@comm.gatech.edu).</p><p><strong>Writer</strong>: John Toon</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1477419876</created>  <gmt_created>2016-10-25 18:24:36</gmt_created>  <changed>1479843950</changed>  <gmt_changed>2016-11-22 19:45:50</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[For scientists listening in on the faint whispers of brain neurons, a first-ever robotic technique for cleaning the tiny devices that record the signals could facilitate a new level of automation in neuroscience research.]]></teaser>  <type>news</type>  <sentence><![CDATA[For scientists listening in on the faint whispers of brain neurons, a first-ever robotic technique for cleaning the tiny devices that record the signals could facilitate a new level of automation in neuroscience research.]]></sentence>  <summary><![CDATA[<p>For scientists listening in on the faint whispers of brain neurons, a first-ever robotic technique for cleaning the tiny devices that record the signals could facilitate a new level of automation in neuroscience research. 
That could accelerate the gathering of information used to map the functions of brain cells and ultimately provide a better understanding of what&rsquo;s going on between our ears.</p>]]></summary>  <dateline>2016-10-25T00:00:00-04:00</dateline>  <iso_dateline>2016-10-25T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-10-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News</p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>583094</item>          <item>583096</item>          <item>583097</item>          <item>583099</item>          <item>583101</item>      </media>  <hg_media>          <item>          <nid>583094</nid>          <type>image</type>          <title><![CDATA[Patch-clamping equipment]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[patch-clamp4270.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/patch-clamp4270.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/patch-clamp4270.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/patch-clamp4270.jpg?itok=IljVxeCy]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Patch-clamping equipment]]></image_alt>                    <created>1477419024</created>          <gmt_created>2016-10-25 18:10:24</gmt_created>          <changed>1477419024</changed>          <gmt_changed>2016-10-25 18:10:24</gmt_changed>      </item>          <item>          <nid>583096</nid>          <type>image</type>          <title><![CDATA[Patch-clamping equipment2]]></title>          <body><![CDATA[]]></body>                   
   <image_name><![CDATA[patch-clamp4266.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/patch-clamp4266.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/patch-clamp4266.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/patch-clamp4266.jpg?itok=Gzfx0WJ8]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Sample holder for patch-clamping]]></image_alt>                    <created>1477419148</created>          <gmt_created>2016-10-25 18:12:28</gmt_created>          <changed>1477419148</changed>          <gmt_changed>2016-10-25 18:12:28</gmt_changed>      </item>          <item>          <nid>583097</nid>          <type>image</type>          <title><![CDATA[Patch-clamping equipment3]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[patch-clamp4251.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/patch-clamp4251.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/patch-clamp4251.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/patch-clamp4251.jpg?itok=lFey35MI]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Patch-clamping setup]]></image_alt>                    <created>1477419228</created>          <gmt_created>2016-10-25 18:13:48</gmt_created>          <changed>1477419228</changed>          <gmt_changed>2016-10-25 18:13:48</gmt_changed>      </item>          <item>          <nid>583099</nid>          <type>image</type>          <title><![CDATA[Patch-clamping researchers]]></title>          <body><![CDATA[]]></body>                      
<image_name><![CDATA[patch-clamp4296.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/patch-clamp4296.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/patch-clamp4296.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/patch-clamp4296.jpg?itok=MnduTaKx]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Researchers for patch-clamping]]></image_alt>                    <created>1477419313</created>          <gmt_created>2016-10-25 18:15:13</gmt_created>          <changed>1477419313</changed>          <gmt_changed>2016-10-25 18:15:13</gmt_changed>      </item>          <item>          <nid>583101</nid>          <type>image</type>          <title><![CDATA[SEM image of patch-clamping pipettes]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[SEM.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/SEM.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/SEM.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/SEM.png?itok=xfLdEE22]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1477419398</created>          <gmt_created>2016-10-25 18:16:38</gmt_created>          <changed>1477419398</changed>          <gmt_changed>2016-10-25 18:16:38</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="138"><![CDATA[Biotechnology, Health, 
Bioengineering, Genetics]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="138"><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>      </news_terms>  <keywords>          <keyword tid="172583"><![CDATA[patch-clamping]]></keyword>          <keyword tid="172585"><![CDATA[patcherBot]]></keyword>          <keyword tid="1304"><![CDATA[neuroscience]]></keyword>          <keyword tid="7276"><![CDATA[neuron]]></keyword>          <keyword tid="12243"><![CDATA[brain research]]></keyword>          <keyword tid="12333"><![CDATA[Craig Forest]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>          <topic tid="71881"><![CDATA[Science and Technology]]></topic>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="583115">  <title><![CDATA[Kelly Distinguished Lecture Scheduled for November 18]]></title>  <uid>27255</uid>  <body><![CDATA[<p>IRIM&rsquo;s new lecture series, the Kelly Distinguished Lecture on Robots and Jobs, features preeminent scholars in fields of significance to robotics. The visiting lecturers, in addition to presenting seminars on topics relevant to robots in the workplace, participate in informal discussions with Georgia Tech faculty and students.</p><p>The second Kelly Distinguished Lecturer, Attorney Garry G. 
Mathiason, will present &ldquo;The Future Has Arrived: Integrating Transformative Technologies in the Workplace&rdquo;&nbsp;on Friday, November 18 in the GTRI Conference Center Auditorium.</p><h4><strong>Event Details</strong></h4><p>Friday, November 18, 2016 &bull; 3:00 p.m.<br />GTRI Conference Center Auditorium<br />Reception Immediately Following Lecture in the Conference Center Atrium/Pre-Function Area</p><ul><li>To learn more about the lecture series or watch the first Kelly lecture, please visit the&nbsp;<a href="http://robotics.gatech.edu/outreach/kellylecture">IRIM website</a>.</li></ul><p><strong>Bio</strong></p><p>Garry G. Mathiason is a senior class action litigator and strategist at Littler Mendelson, resident in the San Francisco office. He has personally supervised the firm&rsquo;s attorneys on more than 1,000 employment and labor litigation matters and currently defends employers in complex wage and hour and discrimination class action cases. He also represents clients before the National Labor Relations Board (NLRB), in arbitration, mediation, and collective bargaining and offers advice on numerous topics, including robotics employment law.</p><p>Mathiason co-chairs Littler&rsquo;s Robotics Practice Group, providing legal advice and representation to the legal industry, as well as employers deploying this technology in the workplace.&nbsp;He also oversees the Littler Corporate Compliance and Ethics Practice Group and originated the Contingent Workforce Practice Group. He is a founder of and serves on the board of NAVEX Global, the ethics and compliance experts. The legal technology compliance company provides superior legal compliance solutions through an array of GRC products and services. With more than 8,000 corporate clients in over 200 countries and 75% of the Fortune 100 companies, NAVEX Global represents the largest ethics and compliance community in the world. 
Over the course of his career, Mathiason has argued before the United States Supreme Court, the California Supreme Court, and other district, superior, circuit and appellate courts. He also practices before the National Labor Relations Board, the Equal Employment Opportunity Commission, the California Labor Commissioner, and the California Public Employment Relations Board.</p><p>Based on his litigation experience and a career-long effort to translate employment and labor law issues and developments into practicable and learnable modules for human resource professionals, Mathiason has originated several of Littler&rsquo;s preventive employment law programs and structures in-house employment law training programs. He regularly&nbsp;delivers presentations&nbsp;before individual corporations, bar associations, national and local human resources groups, and other professional organizations.</p><p>Widely recognized as one of the leading authorities on employment law trends in the United States, Mathiason has written more than 100 articles and given several hundred presentations on the future of employment law. He serves as chair of the Open Compliance and Ethics Group&rsquo;s (OCEG) Employment and Labor Law Domain. In this capacity, he writes legal requirements and guidelines for business. He also spoke at the public presentation of the Employment Law Domain of OCEG. Throughout his career, he has researched, addressed and written extensively on ways in which business leaders can build ethical organizations and best ensure compliance with employment and labor law.</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1477422222</created>  <gmt_created>2016-10-25 19:03:42</gmt_created>  <changed>1477424558</changed>  <gmt_changed>2016-10-25 19:42:38</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Attorney Garry G. 
Mathiason will present “The Future Has Arrived: Integrating Transformative Technologies in the Workplace” on Friday, November 18. ]]></teaser>  <type>news</type>  <sentence><![CDATA[Attorney Garry G. Mathiason will present “The Future Has Arrived: Integrating Transformative Technologies in the Workplace” on Friday, November 18. ]]></sentence>  <summary><![CDATA[<p>IRIM&rsquo;s new lecture series, the Kelly Distinguished Lecture on Robots and Jobs, features preeminent scholars in fields of significance to robotics. The visiting lecturers, in addition to presenting seminars on topics relevant to robots in the workplace, participate in informal discussions with Georgia Tech faculty and students.</p>]]></summary>  <dateline>2016-10-25T00:00:00-04:00</dateline>  <iso_dateline>2016-10-25T00:00:00-04:00</iso_dateline>  <gmt_dateline>2016-10-25 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Josie Giles<br />IRIM Marketing Communications Mgr.<br /><a href="mailto:josie@gatech.edu">josie@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>581971</item>          <item>583113</item>      </media>  <hg_media>          <item>          <nid>581971</nid>          <type>image</type>          <title><![CDATA[Garry G. 
Mathiason]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[GGM Headshot.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/GGM%20Headshot.jpg]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/GGM%20Headshot.jpg]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/GGM%2520Headshot.jpg?itok=eCYr_DF1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Garry G. Mathiason]]></image_alt>                    <created>1475338912</created>          <gmt_created>2016-10-01 16:21:52</gmt_created>          <changed>1477421370</changed>          <gmt_changed>2016-10-25 18:49:30</gmt_changed>      </item>          <item>          <nid>583113</nid>          <type>image</type>          <title><![CDATA[Kelly Lecture Flyer]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[kelly poster 2-Fall 2016-MC.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/kelly%20poster%202-Fall%202016-MC.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/kelly%20poster%202-Fall%202016-MC.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/kelly%2520poster%25202-Fall%25202016-MC.png?itok=NDnUvsOt]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1477421229</created>          <gmt_created>2016-10-25 18:47:09</gmt_created>          <changed>1477421229</changed>          <gmt_changed>2016-10-25 18:47:09</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.littler.com/people/garry-g-mathiason]]></url>        
<title><![CDATA[Garry G. Mathiason]]></title>      </link>          <link>        <url><![CDATA[http://robotics.gatech.edu/outreach/kellylecture]]></url>        <title><![CDATA[Kelly Lecture Series]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>          <keyword tid="1808"><![CDATA[graduate students]]></keyword>          <keyword tid="170003"><![CDATA[Kelly Lecture]]></keyword>          <keyword tid="81491"><![CDATA[Institute for Robotics and Intelligent Machines (IRIM)]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="583189">  <title><![CDATA[IRIM Offers New Opportunity for Visiting Faculty Fellows]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>IRIM&rsquo;s Visiting Faculty Fellows program supports extended visits (one to six months) to the Georgia Tech Atlanta campus by faculty at other institutions or industry/government laboratories who are engaged in research activities focusing on robotics. IRIM will provide Visiting Fellows with partial salary support, along with support for travel and living expenses. 
Visiting Fellows are encouraged to interact with IRIM faculty and students and will have the opportunity to give a two or three day &ldquo;mini-tutorial&rdquo; on their current research during their stay at Georgia Tech.</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1477568920</created>  <gmt_created>2016-10-27 11:48:40</gmt_created>  <changed>1477569105</changed>  <gmt_changed>2016-10-27 11:51:45</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[2021 Impact Report]]></publication>  <article_dateline>2016-10-27T00:00:00-04:00</article_dateline>  <iso_article_dateline>2016-10-27T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2016-10-27T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://robotics.gatech.edu/faculty/fellows]]></article_url>  <media>          <item><![CDATA[583190]]></item>      </media>  <hg_media>          <item>          <nid>583190</nid>          <type>image</type>          <title><![CDATA[IRIM Faculty Fellows Flyer]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[IRIM-Faculty-Fellows.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/IRIM-Faculty-Fellows.png]]></image_path>            <image_full_path><![CDATA[http://hg.gatech.edu//sites/default/files/images/IRIM-Faculty-Fellows.png]]></image_full_path>            <image_740><![CDATA[http://hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/IRIM-Faculty-Fellows.png?itok=P0aq9hVk]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[IRIM Faculty Fellows Flyer]]></image_alt>                              <created>1477569060</created>          <gmt_created>2016-10-27 11:51:00</gmt_created>          <changed>1477569060</changed>          <gmt_changed>2016-10-27 11:51:00</gmt_changed>      </item>      </hg_media>  
<files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="172599"><![CDATA[IRIM faculty fellows]]></keyword>          <keyword tid="78811"><![CDATA[Institute for Robotics and Intelligent Machines]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="583922">  <title><![CDATA[Grad Student Vivian Chu Named One of the “25 Women in Robotics You Need to Know About” in 2016]]></title>  <uid>27255</uid>  <body><![CDATA[<p>Georgia Tech graduate student Vivian Chu joins a distinguished group of roboticists named on Robohub&rsquo;s 2016 list of &ldquo;<a href="http://robohub.org/25-women-in-robotics-you-need-to-know-about-2016/" target="_blank">25 Women In Robotics You Need To Know About</a>.&rdquo;</p><p>The only student to make the list, Chu is surrounded by two dozen professors, government administrators, business executives, and a provost; she is completing research focused on enabling robots to work in unstructured human environments.</p><p>Presented by&nbsp;Robohub in celebration of Ada Lovelace Day, the fourth annual list showcases&nbsp;women working in research, development, and commercialization of robotics.</p><p>&ldquo;Role models are important,&rdquo; writes Robohub. &ldquo;Countess Ada Lovelace, the world&rsquo;s first computer programmer and an extraordinary mathematician, faced an uphill battle in the days when women were not encouraged to pursue a career in science. Fast forward 200 years, there are still not enough women in science, technology, engineering or math (STEM). One key reason is clear: a severe lack of visible female role models. 
Women in STEM need to be equally represented at conferences, keynotes, magazine covers, or stories about technology.&rdquo;</p><p>Chu, who grew up in the heart of Silicon Valley in San Jose, Calif., says she is honored to be included on the list.</p><p>After completing her bachelor&rsquo;s degree in electrical engineering and computer science at the University of California, Berkeley, and her master&rsquo;s degree in robotics at the University of Pennsylvania, Chu came to Georgia Tech to pursue her doctoral work in the Socially Intelligent Machines Lab&nbsp;under the direction of&nbsp;<a href="http://robotics.gatech.edu/team/faculty/thomaz" target="_blank">Andrea L. Thomaz</a>, who was named to Robohub&rsquo;s inaugural list of &ldquo;25 Women in Robotics You Need to Know About&rdquo; in 2013. (Another Georgia Tech roboticist,&nbsp;<a href="http://robotics.gatech.edu/team/faculty/howard" target="_blank">Ayanna Howard</a>, made the 2014 list.)</p><p>Chu found choosing a Ph.D. program incredibly difficult; she had offers from many of the top institutes for robotics research. Ultimately, she chose Georgia Tech because of the &ldquo;world-class research being conducted, specifically in the realm of service robotics.&rdquo; Furthermore, she discovered the strong community the Institute offers. &ldquo;At GT, I feel supported and cared for by the faculty, department, and my fellow students,&rdquo; says Chu.</p><p>Magnus Egerstedt, executive director of the Institute for Robotics and Intelligent Machines, is especially pleased Chu chose Georgia Tech. &ldquo;I&rsquo;m impressed by all of our students, and Vivian&rsquo;s inclusion on this prestigious list is testament to the quality of not only our students and faculty, but also our Ph.D. program. 
Congratulations, Vivian!&rdquo;</p><p>IRIM recently had the opportunity to learn more about Chu, her plans, and her vision for the future of robotics.</p><p><strong>Why did you choose robotics?</strong></p><p>I chose robotics because it allowed me to concretely see how engineering can be applied to solve real-world problems. Not just bits and bytes on a screen or signals on a breadboard, robotics is a field where the results of my work could physically change the environment around me. I knew I had found the right field when I could be debugging or researching a solution and not notice that the entire day had gone by. I would have to force myself to stop and catch a few hours of sleep; I was so excited to continue the next day.</p><p><strong>What is your dream job/next move after completing the Robotics Ph.D. program at Georgia Tech?</strong></p><p>My dream job after the Ph.D. would be one where I can work on hard problems truly worth solving. Specifically for me, this involves designing algorithms for robots so that they can work side-by-side with people and help in whatever way is necessary. My dream job would have me collaborating with the greatest minds in robotics. I would not only provide technical solutions to problems that need to be solved, but inspire the next generation of young minds to pursue STEM fields.</p><p><strong>How do you envision the world of robotics in the next 10 years?</strong></p><p>When I first started studying robotics, I would never have predicted how soon the technological advances being researched in labs would make their way into the world. I remain optimistic that this trend will continue over the next decade and that once-novel concepts, such as self-driving cars, will become reliable services. At the same time, I can foresee that we might encounter a period where people become disillusioned with robots. 
I often tell people that the big red button on our robots is mostly to protect the robot from destroying itself by running into a wall or a table &mdash; a far more likely scenario than it hurting someone.</p><p>In terms of the direction of research in robotics, I believe that human-robot interaction (HRI) will become a crucial component of the robotics field as we understand how robots and people can work together and benefit from this interaction. Furthermore, as sensor technology continues to improve, multi-sensory inputs will become commonplace on hardware platforms. Multi-sensory fusion is going to be immensely important for robots to truly be aware and robust in the world. We, as people, do not rely just on our vision or sense of touch; when people lose one or more of their senses, it becomes incredibly difficult to navigate the world. Robots that can successfully utilize multiple senses can greatly benefit from the added feedback to increase robustness when autonomously completing tasks.</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1479245987</created>  <gmt_created>2016-11-15 21:39:47</gmt_created>  <changed>1479308670</changed>  <gmt_changed>2016-11-16 15:04:30</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Vivian Chu named one of Robohub’s “25 Women in Robotics You Need to Know About”]]></teaser>  <type>news</type>  <sentence><![CDATA[Vivian Chu named one of Robohub’s “25 Women in Robotics You Need to Know About”]]></sentence>  <summary><![CDATA[<p>Georgia Tech graduate student Vivian Chu joins a distinguished group of roboticists named on Robohub&rsquo;s 2016 list of &ldquo;25 Women In Robotics You Need To Know About.</p>]]></summary>  <dateline>2016-11-15T00:00:00-05:00</dateline>  <iso_dateline>2016-11-15T00:00:00-05:00</iso_dateline>  <gmt_dateline>2016-11-15 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  
<location></location>  <contact><![CDATA[<p>Josie Giles<br />IRIM Marketing Communications Mgr.<br /><a href="mailto:josie@gatech.edu">josie@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>      </media>  <hg_media>      </hg_media>  <related>          <link>        <url><![CDATA[http://robohub.org/25-women-in-robotics-you-need-to-know-about-2016/]]></url>        <title><![CDATA[25 Women in Robotics You Need to Know About]]></title>      </link>          <link>        <url><![CDATA[http://vchutech.wixsite.com/vivianchu]]></url>        <title><![CDATA[Vivian Chu]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="172726"><![CDATA[Vivian Chu]]></keyword>          <keyword tid="276"><![CDATA[Awards]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="106591"><![CDATA[25 Women in Robotics You Need to Know About]]></keyword>          <keyword tid="106581"><![CDATA[Robohub]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata><![CDATA[]]></userdata></node><node id="588428">  <title><![CDATA[Georgia Tech Research Institute Acquires TigerShark Unmanned Aerial Vehicles]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1488928363</created>  <gmt_created>2017-03-07 23:12:43</gmt_created>  <changed>1488928363</changed>  
<gmt_changed>2017-03-07 23:12:43</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Georgia Tech Research Institute Acquires TigerShark Unmanned Aerial Vehicles]]></publication>  <article_dateline>2016-11-21T00:00:00-05:00</article_dateline>  <iso_article_dateline>2016-11-21T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2016-11-21T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[https://www.gtri.gatech.edu/casestudy/gtri-acquires-tigershark-uavs]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="78811"><![CDATA[Institute for Robotics and Intelligent Machines]]></keyword>          <keyword tid="78271"><![CDATA[IRIM]]></keyword>          <keyword tid="415"><![CDATA[Georgia Tech Research Institute]]></keyword>          <keyword tid="416"><![CDATA[GTRI]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node><node id="585196">  <title><![CDATA[What Scientists Are Learning about Cats’ Amazing, Velcro-like Tongues]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1481913365</created>  <gmt_created>2016-12-16 18:36:05</gmt_created>  <changed>1481913365</changed>  <gmt_changed>2016-12-16 18:36:05</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Roswell Biotechnologies]]></publication>  <article_dateline>2016-12-02T00:00:00-05:00</article_dateline>  
<iso_article_dateline>2016-12-02T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2016-12-02T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[http://www.cbsnews.com/news/cat-tongues-velcro-like-robotics-developments/]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>      </categories>  <keywords>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata><![CDATA[]]></userdata></node></nodes>