{"686049":{"#nid":"686049","#data":{"type":"news","title":"A Flexible Lens Controlled By Light-Activated Artificial Muscles Promises to Let Soft Machines\u00a0See","body":[{"value":"\u003Cdiv class=\u0022theconversation-article-body\u0022\u003E\u003Cp\u003EInspired by the human eye, our biomedical engineering \u003Ca href=\u0022https:\/\/sites.google.com\/site\/thejialab\/home?authuser=2\u0022\u003Elab at Georgia Tech\u003C\/a\u003E has designed an \u003Ca href=\u0022https:\/\/doi.org\/10.1126\/scirobotics.adw8905\u0022\u003Eadaptive lens\u003C\/a\u003E made of soft, light-responsive, tissuelike materials.\u003C\/p\u003E\u003Cp\u003EAdjustable camera systems usually require a set of bulky, moving, solid lenses and a pupil in front of a camera chip to adjust focus and intensity. In contrast, human eyes perform these same functions using soft, flexible tissues in a highly compact form.\u003C\/p\u003E\u003Cp\u003EOur lens, called the photo-responsive hydrogel soft lens, or PHySL, replaces rigid components with soft polymers acting as artificial muscles. The polymers are composed of a \u003Ca href=\u0022https:\/\/www.snexplores.org\/article\/explainer-what-is-a-hydrogel\u0022\u003Ehydrogel\u003C\/a\u003E \u2212 a water-based polymer material. This hydrogel muscle changes the shape of a soft lens to alter the lens\u2019s focal length, a mechanism analogous to the \u003Ca href=\u0022https:\/\/www.ncbi.nlm.nih.gov\/books\/NBK470669\/figure\/myopia.F7\/\u0022\u003Eciliary muscles\u003C\/a\u003E in the human eye.\u003C\/p\u003E\u003Cp\u003EThe hydrogel material contracts in response to light, allowing us to control the lens without touching it by projecting light onto its surface. This property also allows us to finely control the shape of the lens by selectively illuminating different parts of the hydrogel. 
By eliminating rigid optics and structures, our system is flexible and compliant, making it more durable and safer in contact with the body.\u003C\/p\u003E\u003Ch2\u003EWhy It Matters\u003C\/h2\u003E\u003Cp\u003EArtificial vision using cameras is commonplace in a variety of technological systems, including robots and medical tools. The optics needed to form a visual system, however, are still typically restricted to rigid materials that require electric power. This limitation presents a challenge for emerging fields, including \u003Ca href=\u0022https:\/\/doi.org\/10.1016\/B978-0-12-801238-3.99907-0\u0022\u003Esoft robotics\u003C\/a\u003E and biomedical tools that integrate soft materials into flexible, low-power and autonomous systems. Our soft lens is particularly suitable for these applications.\u003C\/p\u003E\u003Cp\u003ESoft robots are machines made with compliant materials and structures, taking inspiration from animals. This additional flexibility makes them more durable and adaptive. Researchers are using the technology to develop \u003Ca href=\u0022https:\/\/doi.org\/10.1002\/rcs.2010\u0022\u003Esurgical endoscopes\u003C\/a\u003E, grippers for \u003Ca href=\u0022https:\/\/doi.org\/10.1016\/j.sna.2024.115380\u0022\u003Ehandling delicate objects\u003C\/a\u003E and robots for \u003Ca href=\u0022https:\/\/doi.org\/10.1115\/1.4063669\u0022\u003Enavigating environments\u003C\/a\u003E that are difficult for rigid robots.\u003C\/p\u003E\u003Cp\u003EThe same principles apply to biomedical tools. Tissuelike materials can soften the interface between body and machine, making biomedical tools safer by allowing them to move with the body. 
These include \u003Ca href=\u0022https:\/\/doi.org\/10.1063\/5.0217328\u0022\u003Eskinlike wearable sensors\u003C\/a\u003E and \u003Ca href=\u0022https:\/\/doi.org\/10.1016\/j.cis.2024.103358\u0022\u003Ehydrogel-coated implants\u003C\/a\u003E.\u003C\/p\u003E\u003Cfigure class=\u0022align-center zoomable\u0022\u003E\u003Cp\u003E\u003Ca href=\u0022https:\/\/images.theconversation.com\/files\/697600\/original\/file-20251021-56-2geixz.png?ixlib=rb-4.1.0\u0026amp;q=45\u0026amp;auto=format\u0026amp;w=1000\u0026amp;fit=clip\u0022\u003E\u003Cimg alt=\u0022three photos showing a rubbery disk held between two hands\u0022 src=\u0022https:\/\/images.theconversation.com\/files\/697600\/original\/file-20251021-56-2geixz.png?ixlib=rb-4.1.0\u0026amp;q=45\u0026amp;auto=format\u0026amp;w=754\u0026amp;fit=clip\u0022 srcset=\u0022https:\/\/images.theconversation.com\/files\/697600\/original\/file-20251021-56-2geixz.png?ixlib=rb-4.1.0\u0026amp;q=45\u0026amp;auto=format\u0026amp;w=600\u0026amp;h=191\u0026amp;fit=crop\u0026amp;dpr=1 600w, https:\/\/images.theconversation.com\/files\/697600\/original\/file-20251021-56-2geixz.png?ixlib=rb-4.1.0\u0026amp;q=30\u0026amp;auto=format\u0026amp;w=600\u0026amp;h=191\u0026amp;fit=crop\u0026amp;dpr=2 1200w, https:\/\/images.theconversation.com\/files\/697600\/original\/file-20251021-56-2geixz.png?ixlib=rb-4.1.0\u0026amp;q=15\u0026amp;auto=format\u0026amp;w=600\u0026amp;h=191\u0026amp;fit=crop\u0026amp;dpr=3 1800w, https:\/\/images.theconversation.com\/files\/697600\/original\/file-20251021-56-2geixz.png?ixlib=rb-4.1.0\u0026amp;q=45\u0026amp;auto=format\u0026amp;w=754\u0026amp;h=240\u0026amp;fit=crop\u0026amp;dpr=1 754w, https:\/\/images.theconversation.com\/files\/697600\/original\/file-20251021-56-2geixz.png?ixlib=rb-4.1.0\u0026amp;q=30\u0026amp;auto=format\u0026amp;w=754\u0026amp;h=240\u0026amp;fit=crop\u0026amp;dpr=2 1508w, 
https:\/\/images.theconversation.com\/files\/697600\/original\/file-20251021-56-2geixz.png?ixlib=rb-4.1.0\u0026amp;q=15\u0026amp;auto=format\u0026amp;w=754\u0026amp;h=240\u0026amp;fit=crop\u0026amp;dpr=3 2262w\u0022 sizes=\u0022(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px\u0022\u003E\u003C\/a\u003E\u003C\/p\u003E\u003Cfigcaption\u003E\u003Cspan class=\u0022caption\u0022\u003EThis variable-focus soft lens, shown viewing a Rubik\u2019s Cube, can flex and twist without being damaged.\u003C\/span\u003E \u003Cspan class=\u0022attribution source\u0022\u003ECorey Zheng\/Georgia Institute of Technology\u003C\/span\u003E\u003C\/figcaption\u003E\u003C\/figure\u003E\u003Ch2\u003EWhat Other Research Is Being Done in This Field\u003C\/h2\u003E\u003Cp\u003EThis work merges concepts from \u003Ca href=\u0022https:\/\/doi.org\/10.3389\/frobt.2021.678046\u0022\u003Etunable optics\u003C\/a\u003E and \u003Ca href=\u0022https:\/\/doi.org\/10.1021\/acs.macromol.3c00967\u0022\u003Esoft \u201csmart\u201d materials\u003C\/a\u003E. While these materials are often used to create soft actuators \u2013 parts of machines that move \u2013 such as \u003Ca href=\u0022https:\/\/doi.org\/10.1021\/am507339r\u0022\u003Egrippers\u003C\/a\u003E or \u003Ca href=\u0022https:\/\/doi.org\/10.1126\/scirobotics.aax7112\u0022\u003Epropulsors\u003C\/a\u003E, their application in optical systems has faced challenges.\u003C\/p\u003E\u003Cp\u003EMany existing soft lens designs depend on liquid-filled pouches or actuators \u003Ca href=\u0022https:\/\/doi.org\/10.3389\/frobt.2021.678046\u0022\u003Erequiring electronics\u003C\/a\u003E. These factors can increase complexity or limit their use in delicate or untethered systems. Our light-activated design offers a simpler, electronics-free alternative.\u003C\/p\u003E\u003Ch2\u003EWhat\u2019s Next\u003C\/h2\u003E\u003Cp\u003EWe aim to improve the performance of the system using advances in hydrogel materials. 
\u003Ca href=\u0022https:\/\/doi.org\/10.3390\/gels11010030\u0022\u003ENew research\u003C\/a\u003E has yielded several types of stimuli-responsive hydrogels with faster and more powerful contraction abilities. We aim to incorporate the latest material developments to improve the physical capabilities of the photo-responsive hydrogel soft lens.\u003C\/p\u003E\u003Cp\u003EWe also aim to show its practical use in new types of camera systems. In our current work, we developed a proof-of-concept, electronics-free camera using our soft lens and a custom light-activated, \u003Ca href=\u0022https:\/\/theconversation.com\/microfluidics-the-tiny-beautiful-tech-hidden-all-around-you-160436\u0022\u003Emicrofluidic chip\u003C\/a\u003E. We plan to incorporate this system into a soft robot to give it electronics-free vision. This system would be a significant demonstration of our design\u2019s potential to enable new types of soft visual sensing.\u003C\/p\u003E\u003Cp\u003E\u003Cem\u003EThe \u003C\/em\u003E\u003Ca href=\u0022https:\/\/theconversation.com\/us\/topics\/research-brief-83231\u0022\u003E\u003Cem\u003EResearch Brief\u003C\/em\u003E\u003C\/a\u003E\u003Cem\u003E is a short take on interesting academic work.\u003C\/em\u003E\u003C!-- Below is The Conversation\u0027s page counter tag. Please DO NOT REMOVE. --\u003E\u003Cimg style=\u0022border-color:!important;border-style:none;box-shadow:none !important;margin:0 !important;max-height:1px !important;max-width:1px !important;min-height:1px !important;min-width:1px !important;opacity:0 !important;outline:none !important;padding:0 !important;\u0022 src=\u0022https:\/\/counter.theconversation.com\/content\/268064\/count.gif?distributor=republish-lightbox-basic\u0022 alt=\u0022The Conversation\u0022 width=\u00221\u0022 height=\u00221\u0022 referrerpolicy=\u0022no-referrer-when-downgrade\u0022\u003E\u003C!-- End of code. 
If you don\u0027t see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https:\/\/theconversation.com\/republishing-guidelines --\u003E\u003C\/p\u003E\u003Cp\u003E\u003Cem\u003EThis article is republished from \u003C\/em\u003E\u003Ca href=\u0022https:\/\/theconversation.com\u0022\u003E\u003Cem\u003EThe Conversation\u003C\/em\u003E\u003C\/a\u003E\u003Cem\u003E under a Creative Commons license. Read the \u003C\/em\u003E\u003Ca href=\u0022https:\/\/theconversation.com\/a-flexible-lens-controlled-by-light-activated-artificial-muscles-promises-to-let-soft-machines-see-268064\u0022\u003E\u003Cem\u003Eoriginal article\u003C\/em\u003E\u003C\/a\u003E\u003Cem\u003E.\u003C\/em\u003E\u003C\/p\u003E\u003C\/div\u003E","summary":"","format":"full_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EInspired by the human eye, our biomedical engineering lab at Georgia Tech has designed an adaptive lens made of soft, light-responsive, tissuelike materials.\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"Inspired by the human eye, our biomedical engineering lab at Georgia Tech has designed an adaptive lens made of soft, light-responsive, tissuelike materials."}],"uid":"27469","created_gmt":"2025-10-22 16:30:23","changed_gmt":"2026-03-19 13:10:10","author":"Kristen Bailey","boilerplate_text":"","field_publication":"","field_article_url":"","location":"Atlanta, GA","dateline":{"date":"2025-10-22T00:00:00-04:00","iso_date":"2025-10-22T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"678481":{"id":"678481","type":"image","title":"This rubbery disc is an artificial eye that could give soft robots vision. Corey Zheng\/Georgia Institute of Technology","body":"\u003Cp\u003EThis rubbery disc is an artificial eye that could give soft robots vision. 
Corey Zheng\/Georgia Institute of Technology\u003C\/p\u003E","created":"1761669214","gmt_created":"2025-10-28 16:33:34","changed":"1761669214","gmt_changed":"2025-10-28 16:33:34","alt":"This rubbery disc is an artificial eye that could give soft robots vision. Corey Zheng\/Georgia Institute of Technology","file":{"fid":"262519","name":"file-20251021-66-cq8adm.jpg","image_path":"\/sites\/default\/files\/2025\/10\/28\/file-20251021-66-cq8adm.jpg","image_full_path":"http:\/\/hg.gatech.edu\/\/sites\/default\/files\/2025\/10\/28\/file-20251021-66-cq8adm.jpg","mime":"image\/jpeg","size":226505,"path_740":"http:\/\/hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/2025\/10\/28\/file-20251021-66-cq8adm.jpg?itok=6C34XOVb"}}},"media_ids":["678481"],"related_links":[{"url":"https:\/\/theconversation.com\/a-flexible-lens-controlled-by-light-activated-artificial-muscles-promises-to-let-soft-machines-see-268064","title":"Read This Article on The Conversation"}],"groups":[{"id":"658168","name":"Experts"},{"id":"1214","name":"News Room"},{"id":"1188","name":"Research Horizons"}],"categories":[],"keywords":[{"id":"187915","name":"go-researchnews"}],"core_research_areas":[],"news_room_topics":[{"id":"71881","name":"Science and Technology"}],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Ch5\u003EAuthors:\u003C\/h5\u003E\u003Cp\u003E\u003Ca href=\u0022https:\/\/theconversation.com\/profiles\/corey-zheng-2509386\u0022\u003ECorey Zheng\u003C\/a\u003E, PhD Student in Biomedical Engineering, \u003Ca href=\u0022https:\/\/theconversation.com\/institutions\/georgia-institute-of-technology-1310\u0022\u003E\u003Cem\u003EGeorgia Institute of Technology\u003C\/em\u003E\u003C\/a\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003E\u003Ca href=\u0022https:\/\/theconversation.com\/profiles\/shu-jia-2509377\u0022\u003EShu Jia\u003C\/a\u003E, Assistant Professor 
of Biomedical Engineering, \u003Ca href=\u0022https:\/\/theconversation.com\/institutions\/georgia-institute-of-technology-1310\u0022\u003E\u003Cem\u003EGeorgia Institute of Technology\u003C\/em\u003E\u003C\/a\u003E\u003C\/p\u003E\u003Ch5\u003EMedia Contact:\u003C\/h5\u003E\u003Cp\u003EShelley Wunder-Smith\u003Cbr\u003E\u003Ca href=\u0022mailto:shelley.wunder-smith@research.gatech.edu\u0022\u003Eshelley.wunder-smith@research.gatech.edu\u003C\/a\u003E\u003C\/p\u003E","format":"limited_html"}],"email":[],"slides":[],"orientation":[],"userdata":""}}}