{"623044":{"#nid":"623044","#data":{"type":"news","title":"Robot Able to Instantly Identify Household Materials Using Near-Infrared Light","body":[{"value":"\u003Cp\u003ERobots aren\u0026rsquo;t yet household fixtures, but Georgia Tech researchers have already come up with a way domestic bots might recognize materials around the home.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EUsing near-infrared light, similar to what\u0026rsquo;s used in TV remotes, the robot can identify common materials used in household objects to better inform its actions. This might allow intelligent machines to understand, for example, the right bowl (paper versus metal) to put in a microwave or how hard to grasp a cup made of glass versus plastic.\u003C\/p\u003E\r\n\r\n\u003Cp\u003ETo classify materials, the researchers first determined hundreds of light wavelengths reflected from five common materials \u0026ndash; paper, wood, plastic, metal, and fabric. With this information, they trained a neural network on 10,000 examples to create a machine-learning (ML) model that could be used by a robot to quickly identify a material.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EAccording to the researchers, a robot using their new ML model can identify materials without first having to touch an object, a useful capability for handling potentially fragile items. To do so, the robot holds a small spectrometer near an object to get a quick light measurement, which is then processed to identify the material.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;Robots currently use conventional cameras or haptic sensing \u0026ndash; the sense of touch \u0026ndash; to estimate a material type,\u0026rdquo; said \u003Cstrong\u003EZackory Erickson\u003C\/strong\u003E, a Georgia Tech robotics Ph.D. student and first author on the research paper detailing the new work.\u003C\/p\u003E\r\n\r\n
\u003Cp\u003E\u0026ldquo;This is the first time that we know of that spectroscopy and machine learning have been used for material classification in robotics research, and our accuracy is on par with existing methods.\u0026rdquo;\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe team\u0026rsquo;s new ML model yielded the best results using spectrometer measurements from near-infrared light. In fact, the accuracy was 99.9 percent with the full dataset of 10,000 measurements from 50 objects that the model had been trained on.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;While human eyes typically use three color receptors to see the world, our robot can be thought of as using hundreds of color receptors to recognize materials,\u0026rdquo; said \u003Cstrong\u003ECharlie Kemp\u003C\/strong\u003E, associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University and part of the research team. \u0026ldquo;Instead of a conventional color camera that measures red, green, and blue light, our robot uses a spectrometer that measures light at hundreds of different wavelengths, some outside of the range of human vision.\u0026rdquo;\u003C\/p\u003E\r\n\r\n\u003Cp\u003ETo see how results would compare using only a single light reading from each object, the team also trained the model on just 50 measurements, one from each object. Interestingly, accuracy in identifying the correct material dropped only to 95 percent. When using a spectrometer reading from objects the machine-learning model had never seen, the robot still achieved an 81.6 percent success rate.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;Spectroscopy presents a reliable and effective way for robots to estimate materials of household objects,\u0026rdquo; Erickson said. 
\u0026ldquo;We\u0026rsquo;ve demonstrated how a robot can use near-infrared spectroscopy to infer the materials of everyday objects like cups, bowls, and garments.\u0026rdquo;\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe research is published in the Proceedings of the 2019 International Conference on Robotics and Automation (ICRA) in the paper titled \u003Cem\u003EClassification of Household Materials via Spectroscopy\u003C\/em\u003E, co-authored by \u003Ca href=\u0022http:\/\/zackory.com\/\u0022 target=\u0022_blank\u0022\u003E\u003Cstrong\u003EZackory Erickson\u003C\/strong\u003E\u003C\/a\u003E, \u003Cstrong\u003ENathan Luskey\u003C\/strong\u003E, \u003Ca href=\u0022https:\/\/www.cc.gatech.edu\/~chernova\/\u0022 target=\u0022_blank\u0022\u003E\u003Cstrong\u003ESonia Chernova\u003C\/strong\u003E\u003C\/a\u003E, and \u003Ca href=\u0022http:\/\/charliekemp.com\u0022 target=\u0022_blank\u0022\u003E\u003Cstrong\u003ECharlie Kemp\u003C\/strong\u003E\u003C\/a\u003E.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EFor more Georgia Tech research published at ICRA, as well as the entire conference program,\u0026nbsp;explore this \u003Ca href=\u0022https:\/\/public.tableau.com\/shared\/J22YXRJXM?:display_count=yes\u0026amp;:origin=viz_share_link\u0026amp;:showVizHome=no\u0022 target=\u0022_blank\u0022\u003Einteractive visualization\u003C\/a\u003E\u0026nbsp;from the GVU Center at Georgia Tech.\u003C\/p\u003E\r\n","summary":null,"format":"limited_html"}],"field_subtitle":[{"value":"No Contact is Required with Objects by Using Inexpensive, Handheld \u0027Light-Reading\u0027 Device"}],"field_summary":"","field_summary_sentence":[{"value":"Robots aren\u2019t yet household fixtures, but Georgia Tech researchers have already come up with a way domestic bots might recognize materials around the home."}],"uid":"27592","created_gmt":"2019-07-08 17:32:11","changed_gmt":"2019-07-17 20:55:36","author":"Joshua 
Preston","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2019-07-08T00:00:00-04:00","iso_date":"2019-07-08T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"623045":{"id":"623045","type":"image","title":"Robot Classifies Materials of Household Objects Using \u0027Light-Reading\u0027 Device","body":null,"created":"1562609057","gmt_created":"2019-07-08 18:04:17","changed":"1562609089","gmt_changed":"2019-07-08 18:04:49","alt":"","file":{"fid":"237265","name":"Robot classifies materials of household objects.png","image_path":"\/sites\/default\/files\/images\/Robot%20classifies%20materials%20of%20household%20objects.png","image_full_path":"http:\/\/hg.gatech.edu\/\/sites\/default\/files\/images\/Robot%20classifies%20materials%20of%20household%20objects.png","mime":"image\/png","size":1829642,"path_740":"http:\/\/hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/images\/Robot%20classifies%20materials%20of%20household%20objects.png?itok=9t0CWmic"}}},"media_ids":["623045"],"related_links":[{"url":"https:\/\/www.youtube.com\/watch?v=fBv_xEai2AU","title":"VIDEO: Watch how GT researchers are bringing domestic bots one step closer to reality"},{"url":"https:\/\/www.spreaker.com\/user\/10751784\/tu-ep5-robot-instantly-identifies-materials","title":"Tech Unbound Podcast EP5: Robot Able to Instantly Identify Household Materials Without Touching Objects"}],"groups":[{"id":"1299","name":"GVU Center"},{"id":"576481","name":"ML@GT"},{"id":"50876","name":"School of Interactive Computing"}],"categories":[],"keywords":[{"id":"667","name":"robotics"}],"core_research_areas":[{"id":"39521","name":"Robotics"}],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003E\u003Ca href=\u0022mailto:jpreston@cc.gatech.edu\u0022\u003EJoshua Preston\u003C\/a\u003E\u003Cbr 
\/\u003E\r\nResearch Communications Manager, GVU Center\u003Cbr \/\u003E\r\n678.231.0787\u003C\/p\u003E\r\n","format":"limited_html"}],"email":["jpreston@cc.gatech.edu"],"slides":[],"orientation":[],"userdata":""}}}