{"688391":{"#nid":"688391","#data":{"type":"news","title":"Robot Pollinator Could Produce More, Better Crops for Indoor Farms","body":[{"value":"\u003Cp\u003EA new robot could solve one of the biggest challenges facing indoor farmers: manual pollination.\u003C\/p\u003E\u003Cp\u003EIndoor farms, also known as vertical farms, are popular among agricultural researchers and are expanding across the agricultural industry. Some benefits they have over outdoor farms include:\u003C\/p\u003E\u003Cul\u003E\u003Cli\u003EYear-round production of food crops\u003C\/li\u003E\u003Cli\u003ELess water and land requirements\u003C\/li\u003E\u003Cli\u003ENot needing pesticides\u003C\/li\u003E\u003Cli\u003EReducing carbon emissions from shipping\u003C\/li\u003E\u003Cli\u003EReducing food waste\u003C\/li\u003E\u003C\/ul\u003E\u003Cp\u003EAdditionally,\u0026nbsp;\u003Ca href=\u0022https:\/\/www.agritecture.com\/blog\/2021\/7\/20\/5-ways-vertical-farming-is-improving-nutrition\u0022\u003E\u003Cstrong\u003Esome studies\u003C\/strong\u003E\u003C\/a\u003E indicate that indoor farms produce more nutritious food for urban communities.\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EHowever, these farms are often inaccessible to birds, bees, and other natural pollinators, leaving the pollination process to humans. 
The tedious process must be completed by hand for each flower to ensure the indoor crop flourishes.\u003C\/p\u003E\u003Cp\u003E\u003Ca href=\u0022https:\/\/research.gatech.edu\/people\/ai-ping-hu\u0022\u003E\u003Cstrong\u003EAi-Ping Hu\u003C\/strong\u003E\u003C\/a\u003E, a principal research engineer at the Georgia Tech Research Institute (GTRI), has spent years exploring methods to efficiently pollinate flowering plants and food crops in indoor farms.\u003C\/p\u003E\u003Cp\u003EHu,\u0026nbsp;\u003Ca href=\u0022https:\/\/research.gatech.edu\/people\/shreyas-kousik\u0022\u003E\u003Cstrong\u003EAssistant Professor Shreyas Kousik of the George W. Woodruff School of Mechanical Engineering\u003C\/strong\u003E\u003C\/a\u003E, and a rotating group of student interns have developed a robot prototype that may be up to the task.\u003C\/p\u003E\u003Cp\u003EThe robot can efficiently pollinate plants that have both male and female reproductive parts. These plants only require pollen to be transferred from one part to the other rather than externally from another flower.\u003C\/p\u003E\u003Cp\u003ENatural pollinators perform this task outdoors, but Hu said indoor farmers often use a paintbrush or electric toothbrush to ensure these flowers are pollinated.\u0026nbsp;\u003C\/p\u003E\u003Ch4\u003E\u003Cstrong\u003EKnowing the Pose\u003C\/strong\u003E\u003C\/h4\u003E\u003Cp\u003EAn early challenge the research team addressed was teaching the robot to identify the \u201cpose\u201d of each flower. Pose refers to a flower\u2019s orientation, shape, and symmetry. 
Knowing these details ensures precise delivery of the pollen to maximize reproductive success.\u0026nbsp;\u003C\/p\u003E\u003Cp\u003E\u201cIt\u2019s crucial to know exactly which way the flowers are facing,\u201d Hu said.\u003C\/p\u003E\u003Cp\u003E\u201cYou want to approach the flower from the front because that\u2019s where all the biological structures are. Knowing the pose tells you where the stem is. Our device grasps the stem and shakes it to dislodge the pollen.\u003C\/p\u003E\u003Cp\u003E\u201cEvery flower is going to have its own pose, and you need to know what that is within at least 10 degrees.\u201d\u003C\/p\u003E\u003Ch4\u003E\u003Cstrong\u003EComputer Vision Breakthrough\u003C\/strong\u003E\u003C\/h4\u003E\u003Cp\u003E\u003Cstrong\u003EHarsh Muriki\u003C\/strong\u003E is a robotics master\u2019s student at Georgia Tech\u2019s School of Interactive Computing, who used computer vision to solve the pose problem while interning for Hu and GTRI.\u003C\/p\u003E\u003Cp\u003EMuriki attached a camera to a FarmBot to capture images of strawberry plants from dozens of angles in a small garden in front of Georgia Tech\u2019s Food Processing Technology Building. The\u0026nbsp;\u003Ca href=\u0022https:\/\/farm.bot\/?srsltid=AfmBOoqh1Z8vSs3WflZisgw5DsOUSo8shD4VtY0Y8_VmVpVyt0Iwalxo\u0022\u003E\u003Cstrong\u003EFarmBot\u003C\/strong\u003E\u003C\/a\u003E is an XYZ-axis robot that waters and sprays pesticides on outdoor gardens, though it is not capable of pollination.\u003C\/p\u003E\u003Cp\u003E\u201cWe reconstruct the images of the flower into a 3D model and use a technique that converts the 3D model into multiple 2D images with depth information,\u201d Muriki said. \u201cThis enables us to send them to object detectors.\u201d\u003C\/p\u003E\u003Cp\u003EMuriki said he used a real-time object detection system called YOLO (You Only Look Once) to classify objects. 
YOLO is known for identifying and classifying objects in a single pass.\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EVed Sengupta\u003C\/strong\u003E, a computer engineering major who interned with Muriki, fine-tuned the algorithms that converted 3D images into 2D.\u003C\/p\u003E\u003Cp\u003E\u201cThis was a crucial part of making robot pollination possible,\u201d Sengupta said. \u201cThere is a big gap between 3D and 2D image processing.\u003C\/p\u003E\u003Cp\u003E\u201cThere\u2019s not a lot of data on the internet for 3D object detection, but there\u2019s a ton for 2D. We were able to get great results from the converted images, and I think any sector of technology can take advantage of that.\u201d\u003C\/p\u003E\u003Cp\u003ESengupta, Muriki, and Hu co-authored a paper about their work that was accepted to the 2025 International Conference on Robotics and Automation (ICRA) in Atlanta.\u003C\/p\u003E\u003Ch4\u003E\u003Cstrong\u003EMeasuring Success\u003C\/strong\u003E\u003C\/h4\u003E\u003Cp\u003EThe pollination robot, built in Kousik\u2019s Safe Robotics Lab, is now in the prototype phase.\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EHu said the robot can do more than pollinate. It can also analyze each flower to determine how well it was pollinated and whether the chances for reproduction are high.\u003C\/p\u003E\u003Cp\u003E\u201cIt has an additional capability of microscopic inspection,\u201d Hu said. 
\u201cIt\u2019s the first device we know of that provides visual feedback on how well a flower was pollinated.\u201d\u003C\/p\u003E\u003Cp\u003EFor more information about the robot, visit the\u0026nbsp;\u003Ca href=\u0022https:\/\/saferoboticslab.me.gatech.edu\/research\/towards-robotic-pollination\/\u0022\u003E\u003Cstrong\u003ESafe Robotics Lab project page\u003C\/strong\u003E\u003C\/a\u003E.\u003C\/p\u003E","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EManual pollination is one of the biggest challenges for indoor farmers. These farms are often inaccessible to birds, bees, and other natural pollinators, leaving the pollination process to humans. The tedious process must be completed by hand for each flower to ensure the indoor crop flourishes.\u003C\/p\u003E\u003Cp\u003EA Georgia Tech research team led by Ai-Ping Hu and Shreyas Kousik is working to solve that. A robot they\u0027ve developed can efficiently pollinate plants that have both male and female reproductive parts. 
These plants only require pollen to be transferred from one part to the other rather than externally from another flower.\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"A research team that spans GTRI, the College of Engineering, and the College of Computing has developed a robot capable of pollinating flowers in indoor farms."}],"uid":"36530","created_gmt":"2026-02-19 18:58:12","changed_gmt":"2026-03-20 12:54:01","author":"Nathan Deen","boilerplate_text":"","field_publication":"","field_article_url":"","location":"Atlanta, GA","dateline":{"date":"2026-02-19T00:00:00-05:00","iso_date":"2026-02-19T00:00:00-05:00","tz":"America\/New_York"},"extras":[],"hg_media":{"679370":{"id":"679370","type":"image","title":"Harsh-Muriki_86A0006.jpg","body":null,"created":"1771527500","gmt_created":"2026-02-19 18:58:20","changed":"1771527500","gmt_changed":"2026-02-19 18:58:20","alt":"Harsh Muriki","file":{"fid":"263520","name":"Harsh-Muriki_86A0006.jpg","image_path":"\/sites\/default\/files\/2026\/02\/19\/Harsh-Muriki_86A0006.jpg","image_full_path":"http:\/\/hg.gatech.edu\/\/sites\/default\/files\/2026\/02\/19\/Harsh-Muriki_86A0006.jpg","mime":"image\/jpeg","size":140654,"path_740":"http:\/\/hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/2026\/02\/19\/Harsh-Muriki_86A0006.jpg?itok=rd0rv1Yt"}}},"media_ids":["679370"],"groups":[{"id":"47223","name":"College of Computing"},{"id":"1188","name":"Research Horizons"},{"id":"50876","name":"School of Interactive Computing"}],"categories":[{"id":"194606","name":"Artificial Intelligence"},{"id":"153","name":"Computer Science\/Information Technology and Security"},{"id":"145","name":"Engineering"},{"id":"135","name":"Research"},{"id":"152","name":"Robotics"}],"keywords":[{"id":"9153","name":"Research Horizons"},{"id":"187991","name":"go-robotics"},{"id":"192863","name":"go-ai"},{"id":"11506","name":"computer vision"},{"id":"180840","name":"computer vision 
systems"},{"id":"669","name":"agriculture"},{"id":"194392","name":"AI in Agriculture"},{"id":"170254","name":"urban gardening"},{"id":"94111","name":"farming"},{"id":"14913","name":"urban farming"},{"id":"23911","name":"bees"},{"id":"6660","name":"flowers"},{"id":"187915","name":"go-researchnews"}],"core_research_areas":[{"id":"193655","name":"Artificial Intelligence at Georgia Tech"},{"id":"193653","name":"Georgia Tech Research Institute"},{"id":"39521","name":"Robotics"}],"news_room_topics":[{"id":"71911","name":"Earth and Environment"}],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003E\u003Ca href=\u0022mailto:ndeen6@gatech.edu\u0022\u003ENathan Deen\u003C\/a\u003E\u003Cbr\u003ECollege of Computing\u003Cbr\u003EGeorgia Tech\u003C\/p\u003E","format":"limited_html"}],"email":[],"slides":[],"orientation":[],"userdata":""}}}