{"686983":{"#nid":"686983","#data":{"type":"news","title":"Gazing Into the Mind\u2019s Eye With Mice \u2013 How Neuroscientists Are Seeing Human Vision More\u00a0Clearly","body":[{"value":"\u003Cdiv class=\u0022theconversation-article-body\u0022\u003E\u003Cp\u003EDespite the nursery rhyme about three blind mice, \u003Ca href=\u0022https:\/\/doi.org\/10.7554\/eLife.31209\u0022\u003Emouse eyesight is surprisingly sensitive\u003C\/a\u003E. Studying how mice see has helped researchers discover unprecedented details about how individual brain cells communicate and work together to create a mental picture of the visual world.\u003C\/p\u003E\u003Cp\u003E\u003Ca href=\u0022https:\/\/scholar.google.com\/citations?user=P5IKL5UAAAAJ\u0026amp;hl=en\u0022\u003EI am a neuroscientist\u003C\/a\u003E who studies how brain cells drive visual perception and how these processes can fail in conditions \u003Ca href=\u0022https:\/\/doi.org\/10.1093\/cercor\/bhab025\u0022\u003Esuch as autism\u003C\/a\u003E. \u003Ca href=\u0022https:\/\/haider.gatech.edu\/\u0022\u003EMy lab\u003C\/a\u003E \u201clistens\u201d to the electrical activity of neurons in the outermost part of the brain called the cerebral cortex, a \u003Ca href=\u0022https:\/\/doi.org\/10.1523\/JNEUROSCI.17-18-07079.1997\u0022\u003Elarge portion of which\u003C\/a\u003E \u003Ca href=\u0022https:\/\/doi.org\/10.7551\/mitpress\/7131.003.0038\u0022\u003Eprocesses visual information\u003C\/a\u003E. Injuries to the visual cortex can lead to blindness and other visual deficits, even when the eyes themselves are unhurt.\u003C\/p\u003E\u003Cp\u003EUnderstanding the activity of individual neurons \u2013 and how they work together while the brain is actively using and processing information \u2013 is a \u003Ca href=\u0022https:\/\/theconversation.com\/mapping-how-the-100-billion-cells-in-the-brain-all-fit-together-is-the-brave-new-world-of-neuroscience-170182\u0022\u003Elong-standing goal of neuroscience\u003C\/a\u003E. 
Researchers have moved much closer to achieving this goal thanks to new technologies aimed at the mouse visual system. And these findings will help scientists better see how the visual systems of people work.\u003C\/p\u003E\u003Ch2\u003EThe Mind in the Blink of an Eye\u003C\/h2\u003E\u003Cp\u003EResearchers long thought that vision in mice appeared \u003Ca href=\u0022https:\/\/doi.org\/10.1016\/s0042-6989(00)00081-x\u0022\u003Esluggish with low clarity\u003C\/a\u003E. But it turns out visual cortex neurons in mice \u2013 just like \u003Ca href=\u0022https:\/\/doi.org\/10.1016\/j.pneurobio.2024.102656\u0022\u003Ethose in humans, monkeys, cats and ferrets\u003C\/a\u003E \u2013 require \u003Ca href=\u0022https:\/\/doi.org\/10.1523\/JNEUROSCI.0623-08.2008\u0022\u003Especific visual features to trigger activity\u003C\/a\u003E and are particularly \u003Ca href=\u0022https:\/\/doi.org\/10.1038\/nature11665\u0022\u003Eselective in alert and awake conditions\u003C\/a\u003E.\u003C\/p\u003E\u003Cp\u003EMy colleagues and I, along with other researchers, have found that \u003Ca href=\u0022https:\/\/doi.org\/10.1038\/s41467-021-24311-5\u0022\u003Emice are especially sensitive to\u003C\/a\u003E \u003Ca href=\u0022https:\/\/doi.org\/10.1038\/s41467-021-24311-5\u0022\u003Evisual stimuli directly in front of them\u003C\/a\u003E. This is surprising, because mouse eyes face outward rather than forward. 
Forward-facing eyes, like those of cats and primates, naturally have a larger area of focus straight ahead compared to outward-facing eyes.\u003C\/p\u003E\u003Cfigure class=\u0022align-center zoomable\u0022\u003E\u003Cp\u003E\u003Ca href=\u0022https:\/\/images.theconversation.com\/files\/708514\/original\/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0\u0026amp;rect=0%2C0%2C2048%2C1787\u0026amp;q=45\u0026amp;auto=format\u0026amp;w=1000\u0026amp;fit=clip\u0022\u003E\u003Cimg alt=\u0022Microscopy image of stacks of neurons\u0022 src=\u0022https:\/\/images.theconversation.com\/files\/708514\/original\/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0\u0026amp;rect=0%2C0%2C2048%2C1787\u0026amp;q=45\u0026amp;auto=format\u0026amp;w=754\u0026amp;fit=clip\u0022 srcset=\u0022https:\/\/images.theconversation.com\/files\/708514\/original\/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0\u0026amp;q=45\u0026amp;auto=format\u0026amp;w=600\u0026amp;h=524\u0026amp;fit=crop\u0026amp;dpr=1 600w, https:\/\/images.theconversation.com\/files\/708514\/original\/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0\u0026amp;q=30\u0026amp;auto=format\u0026amp;w=600\u0026amp;h=524\u0026amp;fit=crop\u0026amp;dpr=2 1200w, https:\/\/images.theconversation.com\/files\/708514\/original\/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0\u0026amp;q=15\u0026amp;auto=format\u0026amp;w=600\u0026amp;h=524\u0026amp;fit=crop\u0026amp;dpr=3 1800w, https:\/\/images.theconversation.com\/files\/708514\/original\/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0\u0026amp;q=45\u0026amp;auto=format\u0026amp;w=754\u0026amp;h=658\u0026amp;fit=crop\u0026amp;dpr=1 754w, https:\/\/images.theconversation.com\/files\/708514\/original\/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0\u0026amp;q=30\u0026amp;auto=format\u0026amp;w=754\u0026amp;h=658\u0026amp;fit=crop\u0026amp;dpr=2 1508w, 
https:\/\/images.theconversation.com\/files\/708514\/original\/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0\u0026amp;q=15\u0026amp;auto=format\u0026amp;w=754\u0026amp;h=658\u0026amp;fit=crop\u0026amp;dpr=3 2262w\u0022 sizes=\u0022(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px\u0022\u003E\u003C\/a\u003E\u003C\/p\u003E\u003Cfigcaption\u003E\u003Cspan class=\u0022caption\u0022\u003EThis image shows neurons in the mouse retina: cone photoreceptors (red), bipolar neurons (magenta), and a subtype of bipolar neuron (green).\u003C\/span\u003E \u003Ca class=\u0022source\u0022 href=\u0022https:\/\/www.flickr.com\/photos\/nihgov\/35882593476\/\u0022\u003E\u003Cspan class=\u0022attribution\u0022\u003EBrian Liu and Melanie Samuel\/Baylor College of Medicine\/NIH via Flickr\u003C\/span\u003E\u003C\/a\u003E\u003C\/figcaption\u003E\u003C\/figure\u003E\u003Cp\u003EThis finding suggests that the specialization of the visual system to highlight the frontal visual field appears to be \u003Ca href=\u0022https:\/\/doi.org\/10.1038\/361719a0\u0022\u003Eshared between mice and humans\u003C\/a\u003E. For mice, a visual focus on what\u2019s straight ahead may help them be more \u003Ca href=\u0022https:\/\/doi.org\/10.1016\/j.cub.2021.06.094\u0022\u003Eresponsive to shadows or edges\u003C\/a\u003E in front of them, helping them avoid looming predators or better \u003Ca href=\u0022https:\/\/doi.org\/10.1016\/j.neuron.2021.03.010\u0022\u003Ehunt and capture insects for food\u003C\/a\u003E.\u003C\/p\u003E\u003Cp\u003EImportantly, the center of view is \u003Ca href=\u0022https:\/\/doi.org\/10.3390\/jcm14155266\u0022\u003Emost affected in aging and many visual diseases\u003C\/a\u003E in people. 
Since mice also rely heavily on this part of the visual field, they may be particularly useful models to study and treat visual impairment.\u003C\/p\u003E\u003Ch2\u003EA Thousand Voices Drive Complicated Choices\u003C\/h2\u003E\u003Cp\u003EAdvances in technology have greatly accelerated scientific understanding of vision and the brain. Researchers can now routinely record the activity of thousands of neurons at the same time and pair this data with real-time video of a mouse\u2019s face, pupil and body movements. This method can \u003Ca href=\u0022https:\/\/doi.org\/10.1126\/science.aav7893\u0022\u003Eshow how behavior interacts with brain activity\u003C\/a\u003E.\u003C\/p\u003E\u003Cp\u003EIt\u2019s like spending years listening to a grainy recording of a symphony with one featured soloist, but now you have a pristine recording where you can hear every single musician with a note-by-note readout of every single finger movement.\u003C\/p\u003E\u003Cp\u003EUsing these improved methods, researchers like me are studying how specific types of neurons work together during complex visual behaviors. This involves analyzing how factors such as movement, alertness and the environment influence visual activity in the brain.\u003C\/p\u003E\u003Cp\u003EFor example, my lab and I found that the speed of visual signaling is \u003Ca href=\u0022https:\/\/doi.org\/10.1016\/j.cub.2025.02.009\u0022\u003Ehighly sensitive to what actions are possible\u003C\/a\u003E in the physical environment. If a mouse rests on a disc that permits running, visual signals travel to the cortex faster than if the mouse views the same images while resting in a stationary tube \u2013 even when the mouse is totally still in both conditions.\u003C\/p\u003E\u003Cp\u003EIn order to connect electrical activity to visual perception, researchers also have to ask a mouse what it thinks it sees. 
How have we done this?\u003C\/p\u003E\u003Cp\u003EThe last decade has seen researchers debunking long-standing \u003Ca href=\u0022https:\/\/doi.org\/10.3389\/fnsys.2014.00173\u0022\u003Emyths about mouse learning and behavior\u003C\/a\u003E. Like other rodents, mice are also \u003Ca href=\u0022https:\/\/theconversation.com\/im-a-neuroscientist-who-taught-rats-to-drive-their-joy-suggests-how-anticipating-fun-can-enrich-human-life-239029\u0022\u003Esurprisingly clever\u003C\/a\u003E and can learn how to \u201ctell\u201d researchers about the visual events they perceive through their behavior.\u003C\/p\u003E\u003Cp\u003EFor example, mice can \u003Ca href=\u0022https:\/\/doi.org\/10.1523\/jneurosci.3560-13.2013\u0022\u003Elearn to release a lever\u003C\/a\u003E to indicate they have detected that a pattern has brightened or tilted. They can \u003Ca href=\u0022https:\/\/doi.org\/10.1016\/j.celrep.2017.08.047\u0022\u003Erotate a Lego wheel left or right\u003C\/a\u003E to move a visual stimulus to the center of a screen like a video game, and they can \u003Ca href=\u0022https:\/\/doi.org\/10.7554\/eLife.50340\u0022\u003Estop running on a wheel\u003C\/a\u003E \u003Ca href=\u0022https:\/\/doi.org\/10.3389\/fnbeh.2020.00104\u0022\u003Eand lick a water spout\u003C\/a\u003E when they detect the visual scene has suddenly changed.\u003C\/p\u003E\u003Cfigure class=\u0022align-center zoomable\u0022\u003E\u003Cp\u003E\u003Ca href=\u0022https:\/\/images.theconversation.com\/files\/708526\/original\/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0\u0026amp;q=45\u0026amp;auto=format\u0026amp;w=1000\u0026amp;fit=clip\u0022\u003E\u003Cimg alt=\u0022Mouse drinking from a metal water spout\u0022 src=\u0022https:\/\/images.theconversation.com\/files\/708526\/original\/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0\u0026amp;q=45\u0026amp;auto=format\u0026amp;w=754\u0026amp;fit=clip\u0022 
srcset=\u0022https:\/\/images.theconversation.com\/files\/708526\/original\/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0\u0026amp;q=45\u0026amp;auto=format\u0026amp;w=600\u0026amp;h=400\u0026amp;fit=crop\u0026amp;dpr=1 600w, https:\/\/images.theconversation.com\/files\/708526\/original\/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0\u0026amp;q=30\u0026amp;auto=format\u0026amp;w=600\u0026amp;h=400\u0026amp;fit=crop\u0026amp;dpr=2 1200w, https:\/\/images.theconversation.com\/files\/708526\/original\/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0\u0026amp;q=15\u0026amp;auto=format\u0026amp;w=600\u0026amp;h=400\u0026amp;fit=crop\u0026amp;dpr=3 1800w, https:\/\/images.theconversation.com\/files\/708526\/original\/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0\u0026amp;q=45\u0026amp;auto=format\u0026amp;w=754\u0026amp;h=503\u0026amp;fit=crop\u0026amp;dpr=1 754w, https:\/\/images.theconversation.com\/files\/708526\/original\/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0\u0026amp;q=30\u0026amp;auto=format\u0026amp;w=754\u0026amp;h=503\u0026amp;fit=crop\u0026amp;dpr=2 1508w, https:\/\/images.theconversation.com\/files\/708526\/original\/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0\u0026amp;q=15\u0026amp;auto=format\u0026amp;w=754\u0026amp;h=503\u0026amp;fit=crop\u0026amp;dpr=3 2262w\u0022 sizes=\u0022(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px\u0022\u003E\u003C\/a\u003E\u003C\/p\u003E\u003Cfigcaption\u003E\u003Cspan class=\u0022caption\u0022\u003EMice can be trained to drink water as a way to \u2018tell\u2019 researchers they see something.\u003C\/span\u003E \u003Ca class=\u0022source\u0022 href=\u0022https:\/\/www.gettyimages.com\/detail\/photo\/mouse-drinking-from-a-spout-royalty-free-image\/178825439\u0022\u003E\u003Cspan class=\u0022attribution\u0022\u003Efelixmizioznikov\/iStock via Getty Images Plus\u003C\/span\u003E\u003C\/a\u003E\u003C\/figcaption\u003E\u003C\/figure\u003E\u003Cp\u003EMice can also use visual cues to \u003Ca 
href=\u0022https:\/\/doi.org\/10.1016\/j.cub.2018.01.038\u0022\u003Efocus their visual processing\u003C\/a\u003E on specific parts of the visual field. As a result, they can more quickly and accurately respond to visual stimuli that appear in those regions. For example, my team and I found that a faint visual image in the peripheral visual field is difficult for mice to detect. But once they do notice it \u2013 and tell us by licking a water spout \u2013 their subsequent responses are \u003Ca href=\u0022https:\/\/doi.org\/10.1038\/s41467-020-14355-4\u0022\u003Efaster and more accurate\u003C\/a\u003E.\u003C\/p\u003E\u003Cp\u003EThese improvements come at a cost: If the image unexpectedly appears in a different location, the mice are slower and less likely to respond to it. These findings resemble those found in studies on \u003Ca href=\u0022https:\/\/doi.org\/10.1080\/00335558008248231\u0022\u003Espatial attention in people\u003C\/a\u003E.\u003C\/p\u003E\u003Cp\u003EMy lab has also found that \u003Ca href=\u0022https:\/\/doi.org\/10.1038\/s41593-025-01888-4\u0022\u003Eparticular types of inhibitory neurons\u003C\/a\u003E \u2013 brain cells that prevent activity from spreading \u2013 strongly control the strength of visual signals. When we activated certain inhibitory neurons in the visual cortex of mice, we could effectively \u201cerase\u201d their perception of an image.\u003C\/p\u003E\u003Cp\u003EThese kinds of experiments are also revealing that the boundaries between perception and action in the brain are \u003Ca href=\u0022https:\/\/doi.org\/10.1038\/s41593-025-02114-x\u0022\u003Emuch less separate than once thought\u003C\/a\u003E. 
This means that visual neurons will respond differently to the same image in ways that depend on behavioral circumstances \u2013 for example, visual responses differ if the image will be \u003Ca href=\u0022https:\/\/doi.org\/10.1038\/s41586-019-1787-x\u0022\u003Esuccessfully detected\u003C\/a\u003E, if it appears \u003Ca href=\u0022https:\/\/doi.org\/10.1016\/j.neuron.2025.06.001\u0022\u003Ewhile the mouse is moving\u003C\/a\u003E, or if it appears \u003Ca href=\u0022https:\/\/doi.org\/10.1126\/science.aav3932\u0022\u003Ewhen the mouse is thirsty or hydrated\u003C\/a\u003E.\u003C\/p\u003E\u003Cp\u003EUnderstanding how different factors shape how cortical neurons rapidly respond to visual images will require advances in computational tools that can separate the contribution of these behavioral signals from the visual ones. Researchers also need technologies that can isolate how specific types of brain cells carry and communicate these signals.\u003C\/p\u003E\u003Ch2\u003EData Clouds Encircling the Globe\u003C\/h2\u003E\u003Cp\u003EThis surge of research on the mouse visual system has led to a significant increase in the amount of data that scientists can not only gather in a single experiment but also publicly share among each other.\u003C\/p\u003E\u003Cp\u003EMajor national and international research centers focused on \u003Ca href=\u0022https:\/\/brain-map.org\/\u0022\u003Eunraveling the circuitry of the mouse visual system\u003C\/a\u003E have been leading the charge in ushering in new optical, electrical and biological \u003Ca href=\u0022https:\/\/www.internationalbrainlab.com\/\u0022\u003Etools to measure large numbers of visual neurons\u003C\/a\u003E in action. Moreover, they make \u003Ca href=\u0022https:\/\/brain-map.org\/atlases#mouse\u0022\u003Eall the data publicly available\u003C\/a\u003E, inspiring \u003Ca href=\u0022https:\/\/mouse.digital-brain.cn\/projectome\/pfc\u0022\u003Esimilar efforts around the globe\u003C\/a\u003E. 
This collaboration accelerates the ability of researchers to analyze data, replicate findings and make new discoveries.\u003C\/p\u003E\u003Cp\u003ETechnological advances in data collection and sharing can make the culture of scientific discovery more efficient and transparent \u2013 a major \u003Ca href=\u0022https:\/\/doi.org\/10.3389\/fninf.2023.1276407\u0022\u003Edata informatics goal\u003C\/a\u003E of neuroscience in the years ahead.\u003C\/p\u003E\u003Cp\u003EIf the past 10 years are anything to go by, I believe such discoveries are just the tip of the iceberg, and the mighty and not-so-blind mouse will play a leading role in the continuing quest to understand the mysteries of the human brain.\u003C\/p\u003E\u003Cp\u003E\u003Cem\u003EThis article is republished from \u003C\/em\u003E\u003Ca href=\u0022https:\/\/theconversation.com\u0022\u003E\u003Cem\u003EThe Conversation\u003C\/em\u003E\u003C\/a\u003E\u003Cem\u003E under a Creative Commons license. 
Read the \u003C\/em\u003E\u003Ca href=\u0022https:\/\/theconversation.com\/gazing-into-the-minds-eye-with-mice-how-neuroscientists-are-seeing-human-vision-more-clearly-268334\u0022\u003E\u003Cem\u003Eoriginal article\u003C\/em\u003E\u003C\/a\u003E\u003Cem\u003E.\u003C\/em\u003E\u003C\/p\u003E\u003C\/div\u003E","summary":"","format":"full_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EStudying how mice see has helped researchers discover unprecedented details about how individual brain cells communicate and work together to create a mental picture of the visual world.\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"Studying how mice see has helped researchers discover unprecedented details about how individual brain cells communicate and work together to create a mental picture of the visual world."}],"uid":"27469","created_gmt":"2025-12-16 13:42:12","changed_gmt":"2026-01-21 19:21:40","author":"Kristen Bailey","boilerplate_text":"","field_publication":"","field_article_url":"","location":"Atlanta, GA","dateline":{"date":"2025-12-16T00:00:00-05:00","iso_date":"2025-12-16T00:00:00-05:00","tz":"America\/New_York"},"extras":[],"hg_media":{"678887":{"id":"678887","type":"image","title":" Mice have complex visual systems that can clarify how vision works in people. Westend61\/Getty Images","body":"\u003Cdiv\u003E\u003Cp\u003EMice have complex visual systems that can clarify how vision works in people. \u003Ca href=\u0022https:\/\/www.gettyimages.com\/detail\/photo\/germany-research-laboratory-mouse-climbing-out-of-royalty-free-image\/544546223\u0022\u003EWestend61\/Getty Images\u003C\/a\u003E\u003C\/p\u003E\u003C\/div\u003E","created":"1766065654","gmt_created":"2025-12-18 13:47:34","changed":"1766065654","gmt_changed":"2025-12-18 13:47:34","alt":" Mice have complex visual systems that can clarify how vision works in people. 
Westend61\/Getty Images","file":{"fid":"262977","name":"file-20251213-56-fdaib6.jpg","image_path":"\/sites\/default\/files\/2025\/12\/18\/file-20251213-56-fdaib6.jpg","image_full_path":"http:\/\/hg.gatech.edu\/\/sites\/default\/files\/2025\/12\/18\/file-20251213-56-fdaib6.jpg","mime":"image\/jpeg","size":80137,"path_740":"http:\/\/hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/2025\/12\/18\/file-20251213-56-fdaib6.jpg?itok=21uzzcB5"}}},"media_ids":["678887"],"related_links":[{"url":"https:\/\/theconversation.com\/gazing-into-the-minds-eye-with-mice-how-neuroscientists-are-seeing-human-vision-more-clearly-268334","title":"Read This Article on The Conversation"}],"groups":[{"id":"66220","name":"Neuro"},{"id":"1292","name":"Parker H. Petit Institute for Bioengineering and Bioscience (IBB)"},{"id":"1188","name":"Research Horizons"}],"categories":[{"id":"138","name":"Biotechnology, Health, Bioengineering, Genetics"}],"keywords":[{"id":"187915","name":"go-researchnews"},{"id":"187423","name":"go-bio"},{"id":"172970","name":"go-neuro"}],"core_research_areas":[{"id":"39441","name":"Bioengineering and Bioscience"}],"news_room_topics":[{"id":"71881","name":"Science and Technology"}],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Ch5\u003EAuthor:\u003C\/h5\u003E\u003Cp\u003E\u003Ca href=\u0022https:\/\/theconversation.com\/profiles\/bilal-haider-2512267\u0022\u003EBilal Haider\u003C\/a\u003E, Associate Professor of Biomedical Engineering, \u003Ca href=\u0022https:\/\/theconversation.com\/institutions\/georgia-institute-of-technology-1310\u0022\u003E\u003Cem\u003EGeorgia Institute of Technology\u003C\/em\u003E\u003C\/a\u003E\u003C\/p\u003E\u003Ch5\u003EMedia Contact:\u003C\/h5\u003E\u003Cp\u003EShelley Wunder-Smith\u003Cbr\u003E\u003Ca 
href=\u0022mailto:shelley.wunder-smith@research.gatech.edu\u0022\u003Eshelley.wunder-smith@research.gatech.edu\u003C\/a\u003E\u003C\/p\u003E","format":"limited_html"}],"email":[],"slides":[],"orientation":[],"userdata":""}}}