<node id="686983">
  <nid>686983</nid>
  <type>news</type>
  <uid>
    <user id="27469"><![CDATA[27469]]></user>
  </uid>
  <created>1765892532</created>
  <changed>1769023300</changed>
  <title><![CDATA[Gazing Into the Mind’s Eye With Mice – How Neuroscientists Are Seeing Human Vision More Clearly]]></title>
  <body><![CDATA[<div class="theconversation-article-body"><p>Despite the nursery rhyme about three blind mice, <a href="https://doi.org/10.7554/eLife.31209">mouse eyesight is surprisingly sensitive</a>. Studying how mice see has helped researchers discover unprecedented details about how individual brain cells communicate and work together to create a mental picture of the visual world.</p><p><a href="https://scholar.google.com/citations?user=P5IKL5UAAAAJ&amp;hl=en">I am a neuroscientist</a> who studies how brain cells drive visual perception and how these processes can fail in conditions <a href="https://doi.org/10.1093/cercor/bhab025">such as autism</a>. <a href="https://haider.gatech.edu/">My lab</a> “listens” to the electrical activity of neurons in the outermost part of the brain called the cerebral cortex, a <a href="https://doi.org/10.1523/JNEUROSCI.17-18-07079.1997">large portion of which</a> <a href="https://doi.org/10.7551/mitpress/7131.003.0038">processes visual information</a>. Injuries to the visual cortex can lead to blindness and other visual deficits, even when the eyes themselves are unhurt.</p><p>Understanding the activity of individual neurons – and how they work together while the brain is actively using and processing information – is a <a href="https://theconversation.com/mapping-how-the-100-billion-cells-in-the-brain-all-fit-together-is-the-brave-new-world-of-neuroscience-170182">long-standing goal of neuroscience</a>. Researchers have moved much closer to achieving this goal thanks to new technologies aimed at the mouse visual system. And these findings will help scientists better see how the visual systems of people work.</p><h2>The Mind in the Blink of an Eye</h2><p>Researchers long thought that vision in mice appeared <a href="https://doi.org/10.1016/s0042-6989(00)00081-x">sluggish with low clarity</a>. 
But it turns out visual cortex neurons in mice – just like <a href="https://doi.org/10.1016/j.pneurobio.2024.102656">those in humans, monkeys, cats and ferrets</a> – require <a href="https://doi.org/10.1523/JNEUROSCI.0623-08.2008">specific visual features to trigger activity</a> and are particularly <a href="https://doi.org/10.1038/nature11665">selective in alert and awake conditions</a>.</p><p>My colleagues and I, among others, have found that <a href="https://doi.org/10.1038/s41467-021-24311-5">mice are especially sensitive to visual stimuli directly in front of them</a>. This is surprising, because mouse eyes face outward rather than forward. Forward-facing eyes, like those of cats and primates, naturally have a larger area of focus straight ahead compared to outward-facing eyes.</p><figure class="align-center zoomable"><p><a href="https://images.theconversation.com/files/708514/original/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0&amp;rect=0%2C0%2C2048%2C1787&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="Microscopy image of stacks of neurons" src="https://images.theconversation.com/files/708514/original/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0&amp;rect=0%2C0%2C2048%2C1787&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/708514/original/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=524&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/708514/original/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=524&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/708514/original/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=524&amp;fit=crop&amp;dpr=3 1800w, 
https://images.theconversation.com/files/708514/original/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=658&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/708514/original/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=658&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/708514/original/file-20251212-56-z8h8ny.jpg?ixlib=rb-4.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=658&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a></p><figcaption><span class="caption">This image shows neurons in the mouse retina: cone photoreceptors (red), bipolar neurons (magenta), and a subtype of bipolar neuron (green).</span> <a class="source" href="https://www.flickr.com/photos/nihgov/35882593476/"><span class="attribution">Brian Liu and Melanie Samuel/Baylor College of Medicine/NIH via Flickr</span></a></figcaption></figure><p>This finding suggests that the specialization of the visual system to highlight the frontal visual field appears to be <a href="https://doi.org/10.1038/361719a0">shared between mice and humans</a>. For mice, a visual focus on what’s straight ahead may help them be more <a href="https://doi.org/10.1016/j.cub.2021.06.094">responsive to shadows or edges</a> in front of them, helping them avoid looming predators or better <a href="https://doi.org/10.1016/j.neuron.2021.03.010">hunt and capture insects for food</a>.</p><p>Importantly, the center of view is <a href="https://doi.org/10.3390/jcm14155266">most affected in aging and many visual diseases</a> in people. Since mice also rely heavily on this part of the visual field, they may be particularly useful models to study and treat visual impairment.</p><h2>A Thousand Voices Drive Complicated Choices</h2><p>Advances in technology have greatly accelerated scientific understanding of vision and the brain. 
Researchers can now routinely record the activity of thousands of neurons at the same time and pair this data with real-time video of a mouse’s face, pupil and body movements. This method can <a href="https://doi.org/10.1126/science.aav7893">show how behavior interacts with brain activity</a>.</p><p>It’s like spending years listening to a grainy recording of a symphony with one featured soloist, but now you have a pristine recording where you can hear every single musician with a note-by-note readout of every single finger movement.</p><p>Using these improved methods, researchers like me are studying how specific types of neurons work together during complex visual behaviors. This involves analyzing how factors such as movement, alertness and the environment influence visual activity in the brain.</p><p>For example, my lab and I found that the speed of visual signaling is <a href="https://doi.org/10.1016/j.cub.2025.02.009">highly sensitive to what actions are possible</a> in the physical environment. If a mouse rests on a disc that permits running, visual signals travel to the cortex faster than if the mouse views the same images while resting in a stationary tube – even when the mouse is totally still in both conditions.</p><p>In order to connect electrical activity to visual perception, researchers also have to ask a mouse what it thinks it sees. How have we done this?</p><p>The last decade has seen researchers debunking long-standing <a href="https://doi.org/10.3389/fnsys.2014.00173">myths about mouse learning and behavior</a>. 
Like other rodents, mice are also <a href="https://theconversation.com/im-a-neuroscientist-who-taught-rats-to-drive-their-joy-suggests-how-anticipating-fun-can-enrich-human-life-239029">surprisingly clever</a> and can learn how to “tell” researchers about the visual events they perceive through their behavior.</p><p>For example, mice can <a href="https://doi.org/10.1523/jneurosci.3560-13.2013">learn to release a lever</a> to indicate they have detected that a pattern has brightened or tilted. They can <a href="https://doi.org/10.1016/j.celrep.2017.08.047">rotate a Lego wheel left or right</a> to move a visual stimulus to the center of a screen like a video game, and they can <a href="https://doi.org/10.7554/eLife.50340">stop running on a wheel</a> <a href="https://doi.org/10.3389/fnbeh.2020.00104">and lick a water spout</a> when they detect the visual scene has suddenly changed.</p><figure class="align-center zoomable"><p><a href="https://images.theconversation.com/files/708526/original/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip"><img alt="Mouse drinking from a metal water spout" src="https://images.theconversation.com/files/708526/original/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" srcset="https://images.theconversation.com/files/708526/original/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/708526/original/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/708526/original/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/708526/original/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=1 
754w, https://images.theconversation.com/files/708526/original/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/708526/original/file-20251212-56-ccqnav.jpg?ixlib=rb-4.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a></p><figcaption><span class="caption">Mice can be trained to drink water as a way to ‘tell’ researchers they see something.</span> <a class="source" href="https://www.gettyimages.com/detail/photo/mouse-drinking-from-a-spout-royalty-free-image/178825439"><span class="attribution">felixmizioznikov/iStock via Getty Images Plus</span></a></figcaption></figure><p>Mice can also use visual cues to <a href="https://doi.org/10.1016/j.cub.2018.01.038">focus their visual processing</a> to specific parts of the visual field. As a result, they can more quickly and accurately respond to visual stimuli that appear in those regions. For example, my team and I found that a faint visual image in the peripheral visual field is difficult for mice to detect. But once they do notice it – and tell us by licking a water spout – their subsequent responses are <a href="https://doi.org/10.1038/s41467-020-14355-4">faster and more accurate</a>.</p><p>These improvements come at a cost: If the image unexpectedly appears in a different location, the mice are slower and less likely to respond to it. These findings resemble those found in studies on <a href="https://doi.org/10.1080/00335558008248231">spatial attention in people</a>.</p><p>My lab has also found that <a href="https://doi.org/10.1038/s41593-025-01888-4">particular types of inhibitory neurons</a> – brain cells that prevent activity from spreading – strongly control the strength of visual signals. 
When we activated certain inhibitory neurons in the visual cortex of mice, we could effectively “erase” their perception of an image.</p><p>These kinds of experiments are also revealing that the boundaries between perception and action in the brain are <a href="https://doi.org/10.1038/s41593-025-02114-x">much less separate than once thought</a>. This means that visual neurons will respond differently to the same image in ways that depend on behavioral circumstances – for example, visual responses differ if the image will be <a href="https://doi.org/10.1038/s41586-019-1787-x">successfully detected</a>, if it appears <a href="https://doi.org/10.1016/j.neuron.2025.06.001">while the mouse is moving</a>, or if it appears <a href="https://doi.org/10.1126/science.aav3932">when the mouse is thirsty or hydrated</a>.</p><p>Understanding how these factors shape the rapid responses of cortical neurons to visual images will require advances in computational tools that can separate the contribution of these behavioral signals from the visual ones. Researchers also need technologies that can isolate how specific types of brain cells carry and communicate these signals.</p><h2>Data Clouds Encircling the Globe</h2><p>This surge of research on the mouse visual system has led to a significant increase in the amount of data that scientists can not only gather in a single experiment but also share publicly with one another.</p><p>Major national and international research centers focused on <a href="https://brain-map.org/">unraveling the circuitry of the mouse visual system</a> have led the way in introducing new optical, electrical and biological <a href="https://www.internationalbrainlab.com/">tools to measure large numbers of visual neurons</a> in action. Moreover, they make <a href="https://brain-map.org/atlases#mouse">all the data publicly available</a>, inspiring <a href="https://mouse.digital-brain.cn/projectome/pfc">similar efforts around the globe</a>. 
This collaboration accelerates the ability of researchers to analyze data, replicate findings and make new discoveries.</p><p>Technological advances in data collection and sharing can make the culture of scientific discovery more efficient and transparent – a major <a href="https://doi.org/10.3389/fninf.2023.1276407">data informatics goal</a> of neuroscience in the years ahead.</p><p>If the past 10 years are anything to go by, I believe such discoveries are just the tip of the iceberg, and the mighty and not-so-blind mouse will play a leading role in the continuing quest to understand the mysteries of the human brain.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img style="border-color:!important;border-style:none;box-shadow:none !important;margin:0 !important;max-height:1px !important;max-width:1px !important;min-height:1px !important;min-width:1px !important;opacity:0 !important;outline:none !important;padding:0 !important;" src="https://counter.theconversation.com/content/268334/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1" referrerpolicy="no-referrer-when-downgrade"><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p><p>&nbsp;</p><p><em>This article is republished from </em><a href="https://theconversation.com"><em>The Conversation</em></a><em> under a Creative Commons license. Read the </em><a href="https://theconversation.com/gazing-into-the-minds-eye-with-mice-how-neuroscientists-are-seeing-human-vision-more-clearly-268334"><em>original article</em></a><em>.</em></p></div>]]></body>
  <field_subtitle>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_subtitle>
  <field_dateline>
    <item>
      <value>2025-12-16T00:00:00-05:00</value>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_dateline>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Studying how mice see has helped researchers discover unprecedented details about how individual brain cells communicate and work together to create a mental picture of the visual world.]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>Studying how mice see has helped researchers discover unprecedented details about how individual brain cells communicate and work together to create a mental picture of the visual world.</p>]]></value>
    </item>
  </field_summary>
  <field_media>
          <item>
        <nid>
          <node id="678887">
            <nid>678887</nid>
            <type>image</type>
            <title><![CDATA[Mice have complex visual systems that can clarify how vision works in people. Westend61/Getty Images]]></title>
            <body><![CDATA[<div><p>Mice have complex visual systems that can clarify how vision works in people. <a href="https://www.gettyimages.com/detail/photo/germany-research-laboratory-mouse-climbing-out-of-royalty-free-image/544546223">Westend61/Getty Images</a></p></div>]]></body>
                          <field_image>
                <item>
                  <fid>262977</fid>
                  <filename><![CDATA[file-20251213-56-fdaib6.jpg]]></filename>
                  <filepath><![CDATA[/sites/default/files/2025/12/18/file-20251213-56-fdaib6.jpg]]></filepath>
                  <file_full_path><![CDATA[http://hg.gatech.edu/sites/default/files/2025/12/18/file-20251213-56-fdaib6.jpg]]></file_full_path>
                  <filemime>image/jpeg</filemime>
                  <image_740><![CDATA[]]></image_740>
                  <image_alt><![CDATA[Mice have complex visual systems that can clarify how vision works in people. Westend61/Getty Images]]></image_alt>
                </item>
              </field_image>
            
                      </node>
        </nid>
      </item>
      </field_media>
  <field_contact_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_contact_email>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_contact>
    <item>
      <value><![CDATA[<h5>Author:</h5><p><a href="https://theconversation.com/profiles/bilal-haider-2512267">Bilal Haider</a>, Associate Professor of Biomedical Engineering, <a href="https://theconversation.com/institutions/georgia-institute-of-technology-1310"><em>Georgia Institute of Technology</em></a></p><h5>Media Contact:</h5><p>Shelley Wunder-Smith<br><a href="mailto:shelley.wunder-smith@research.gatech.edu">shelley.wunder-smith@research.gatech.edu</a></p>]]></value>
    </item>
  </field_contact>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <!--  TO DO: correct to not conflate categories and news room topics  -->
  <og_groups>
          <item>66220</item>
          <item>1292</item>
          <item>1188</item>
      </og_groups>
  <og_groups_both>
          <item>
        <![CDATA[Biotechnology, Health, Bioengineering, Genetics]]>
      </item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>138</tid>
        <value><![CDATA[Biotechnology, Health, Bioengineering, Genetics]]></value>
      </item>
      </field_categories>
  <core_research_areas>
          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>
      </core_research_areas>
  <field_news_room_topics>
          <item>
        <tid>71881</tid>
        <value><![CDATA[Science and Technology]]></value>
      </item>
      </field_news_room_topics>
  <links_related>
          <link>
      <url>https://theconversation.com/gazing-into-the-minds-eye-with-mice-how-neuroscientists-are-seeing-human-vision-more-clearly-268334</url>
      <title></title>
      </link>
      </links_related>
  <files>
      </files>
  <og_groups_both>
          <item><![CDATA[Neuro]]></item>
          <item><![CDATA[Parker H. Petit Institute for Bioengineering and Bioscience (IBB)]]></item>
          <item><![CDATA[Research Horizons]]></item>
      </og_groups_both>
  <field_keywords>
          <item>
        <tid>187915</tid>
        <value><![CDATA[go-researchnews]]></value>
      </item>
          <item>
        <tid>187423</tid>
        <value><![CDATA[go-bio]]></value>
      </item>
          <item>
        <tid>172970</tid>
        <value><![CDATA[go-neuro]]></value>
      </item>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
