{"682761":{"#nid":"682761","#data":{"type":"news","title":"Georgia Tech Team Takes Second Place at ICRA Robot Teleoperation Contest","body":[{"value":"\u003Cp\u003EAn algorithmic breakthrough from School of Interactive Computing researchers that\u0026nbsp;\u003Ca href=\u0022https:\/\/www.cc.gatech.edu\/news\/new-algorithm-teaches-robots-through-human-perspective\u0022\u003E\u003Cstrong\u003Eearned a Meta partnership\u003C\/strong\u003E\u003C\/a\u003E drew more attention at the IEEE International Conference on Robotics and Automation (ICRA).\u003C\/p\u003E\u003Cp\u003EMeta announced in February its partnership with the labs of professors\u0026nbsp;\u003Ca href=\u0022https:\/\/faculty.cc.gatech.edu\/~danfei\/\u0022\u003E\u003Cstrong\u003EDanfei Xu\u003C\/strong\u003E\u003C\/a\u003E and\u0026nbsp;\u003Ca href=\u0022https:\/\/faculty.cc.gatech.edu\/~judy\/\u0022\u003E\u003Cstrong\u003EJudy Hoffman\u003C\/strong\u003E\u003C\/a\u003E on a novel computer vision-based algorithm called EgoMimic. It enables robots to learn new skills by imitating human tasks from first-person video footage captured by Meta\u2019s Aria smart glasses.\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EXu\u2019s\u0026nbsp;\u003Ca href=\u0022https:\/\/rl2.cc.gatech.edu\/\u0022\u003E\u003Cstrong\u003ERobot Learning and Reasoning Lab (RL2)\u003C\/strong\u003E\u003C\/a\u003E displayed EgoMimic in action at ICRA, held May 19-23 at the World Congress Center in Atlanta.\u003C\/p\u003E\u003Cp\u003ELawrence Zhu, Pranav Kuppili, and Patcharapong \u201cElmo\u201d Aphiwetsa \u2014 students from Xu\u2019s lab \u2014 used EgoMimic to compete in a robot teleoperation contest at ICRA. The team finished second in the event titled What Bimanual Teleoperation and Learning from Demonstration Can Do Today, earning a $10,000 cash prize.\u003C\/p\u003E\u003Cp\u003ETeams were challenged to perform tasks by remotely controlling a robot gripper. 
The robot had to fold a tablecloth, open a vacuum-sealed container, place an object into the container, and then reseal it in succession without any errors.\u003C\/p\u003E\u003Cp\u003ETeams completed the tasks as many times as possible in 30 minutes, earning points for each successful attempt.\u003C\/p\u003E\u003Cp\u003EThe competition also offered different challenge levels that increased the points awarded. Teams could directly operate the robot with a full workstation view and receive one point for each task completion. Or, as the RL2 team chose, teams could opt for the second challenge level.\u003C\/p\u003E\u003Cp\u003EThe second level required an operator to complete the tasks with no view of the workstation except what was provided through a video feed. The RL2 team completed the task seven times and received double points for the challenge level.\u003C\/p\u003E\u003Cp\u003EThe third challenge level required teams to operate remotely from another location. At this level, teams could earn four times the number of points for each successful task completed. The fourth level challenged teams to deploy an algorithm for task performance and awarded eight points for each completion.\u003C\/p\u003E\u003Cp\u003EUsing two of Meta\u2019s Quest wireless controllers, Zhu controlled the robot under the direction of Aphiwetsa, while Kuppili monitored the coding from his laptop.\u003C\/p\u003E\u003Cp\u003E\u201cIt\u2019s physically difficult to teleoperate for half an hour,\u201d Zhu said. \u201cMy hands were shaking from holding the controllers in the air for that long.\u201d\u003C\/p\u003E\u003Cp\u003EBeing in constant communication with Aphiwetsa helped him stay focused throughout the contest.\u003C\/p\u003E\u003Cp\u003E\u201cI helped him strategize the teleoperation and noticed he could skip some of the steps in the folding,\u201d Aphiwetsa said. 
\u201cThere were many ways to do it, so I just told him what he could fix and how to do it faster.\u201d\u003C\/p\u003E\u003Cp\u003EZhu said he and his team had intended to tackle the fourth challenge level with the EgoMimic algorithm. However, due to unexpected time constraints, they decided to switch to the second level the day before the competition.\u0026nbsp;\u003C\/p\u003E\u003Cp\u003E\u201cI think we realized the day before the competition training the robot on our model would take a huge amount of time,\u201d Zhu said. \u201cWe decided to go for the teleoperation and started practicing.\u201d\u003C\/p\u003E\u003Cp\u003EHe said the team wants to tackle the highest challenge level and use a training model for next year\u2019s ICRA competition in Vienna, Austria.\u003C\/p\u003E\u003Cp\u003EICRA is the world\u2019s largest robotics conference, and\u0026nbsp;\u003Ca href=\u0022https:\/\/www.cc.gatech.edu\/news\/georgia-tech-leads-robotics-world-converges-atlanta-icra-2025\u0022\u003E\u003Cstrong\u003EAtlanta hosted the event\u003C\/strong\u003E\u003C\/a\u003E for the third time in its history, drawing a record-breaking attendance of over 7,000.\u003C\/p\u003E","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EStudents from Georgia Tech\u0027s Robot Learning and Reasoning Lab earned second place and a $10,000 cash prize in a robot teleoperation contest at the 2025 International Conference on Robotics and Automation in Atlanta. The RL2 lab announced a partnership with Meta in February on a novel computer vision-based algorithm called EgoMimic. 
It enables robots to learn new skills by imitating human tasks from first-person video footage captured by Meta\u2019s Aria smart glasses.\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"A Georgia Tech team earned second place in the ICRA Robot Teleoperation Contest for their EgoMimic algorithm, which allows robots to learn skills by mimicking human tasks from first-person video."}],"uid":"36530","created_gmt":"2025-06-11 15:24:42","changed_gmt":"2025-06-12 11:52:56","author":"Nathan Deen","boilerplate_text":"","field_publication":"","field_article_url":"","location":"Atlanta, GA","dateline":{"date":"2025-06-11T00:00:00-04:00","iso_date":"2025-06-11T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"677223":{"id":"677223","type":"image","title":"IMG_4291-2-copy.jpg","body":null,"created":"1749729142","gmt_created":"2025-06-12 11:52:22","changed":"1749729142","gmt_changed":"2025-06-12 11:52:22","alt":"ICRA","file":{"fid":"261102","name":"IMG_4291-2-copy.jpg","image_path":"\/sites\/default\/files\/2025\/06\/12\/IMG_4291-2-copy.jpg","image_full_path":"http:\/\/hg.gatech.edu\/\/sites\/default\/files\/2025\/06\/12\/IMG_4291-2-copy.jpg","mime":"image\/jpeg","size":151809,"path_740":"http:\/\/hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/2025\/06\/12\/IMG_4291-2-copy.jpg?itok=Ag2Xn9Oj"}}},"media_ids":["677223"],"groups":[{"id":"47223","name":"College of Computing"},{"id":"1188","name":"Research Horizons"},{"id":"50876","name":"School of Interactive Computing"}],"categories":[{"id":"153","name":"Computer Science\/Information Technology and Security"},{"id":"152","name":"Robotics"},{"id":"193158","name":"Student Competition Winners (academic, innovation, and research)"}],"keywords":[{"id":"181920","name":"cc-research; ic-ai-ml; ic-robotics"},{"id":"187812","name":"artificial intelligence (AI)"},{"id":"192863","name":"go-ai"},{"id":"187915","name":"go-researchnews"},{"id":"9153","name":"Research 
Horizons"},{"id":"167585","name":"student competition"}],"core_research_areas":[{"id":"193655","name":"Artificial Intelligence at Georgia Tech"},{"id":"39521","name":"Robotics"}],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}