Ph.D. Defense by Sean Kelly

Sean Kelly
PhD Defense Presentation
Date: Monday September 22nd, 2014
Time: 9:00 - 10:00am
Location: Sustainable Education Building, Room 316 (GT Campus)

Committee Members:
Advisor: Garrett Stanley, PhD (Georgia Institute of Technology)
Jose-Manuel Alonso, MD/PhD (State University of New York)
Robert Butera, PhD (Georgia Institute of Technology)
Robert Liu, PhD (Emory University)
Chris Rozell, PhD (Georgia Institute of Technology)

Title: "Neural Population Coding of Visual Motion"


Detecting motion in the outside world is one of the primary uses of visual information for many animals. The ability to interpret motion quickly and accurately permits interaction with, and response to, events in the environment. While much is known about some aspects of motion perception, there is less agreement about how the feature selectivity underlying motion perception is actually formed in the convergent and divergent pathways of the visual system. It is even less clear how these classical accounts of motion processing, often derived from artificial stimuli bearing little resemblance to the outside world, correspond to neural responses under more natural stimulation. In this thesis, we probe these gaps, first by demonstrating that synchronization within the visual thalamus leads to efficient representations of motion (through tuning properties) in primary visual cortex, exploiting precise timing across populations in a manner distinct from traditional models. We then create a novel "minimally natural" stimulus with the appearance of an infinite hallway wallpapered with sinusoidal gratings, to probe how such minimally natural features modulate predictions of neural responses based on feature tuning properties. Through encoding and decoding models, we find that measuring a restricted tuning parameter space limits our ability to capture all response properties but preserves the information relevant for decoding. We finish with an exploration of ethologically relevant natural features, perspective and complex motion, and show that even moderate amounts of each feature within or near the classical V1 receptive field change the neural response from what classical feature tuning would predict and dramatically improve stimulus classification.
Together, these results indicate that capturing information about motion in the outside world from visual stimuli requires a more advanced model of feature selectivity, one that incorporates parameters based on more complex spatial relationships.
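To make the encoding/decoding framing concrete, the sketch below simulates a toy population of direction-tuned neurons and recovers the stimulus direction with a population-vector decoder. This is purely illustrative and not the thesis's actual model: the von Mises tuning curves, Poisson spiking, and all parameter values (`n_neurons`, `kappa`, `peak_rate`) are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population of direction-tuned neurons (von Mises tuning curves).
n_neurons = 32
preferred = np.linspace(0.0, 2.0 * np.pi, n_neurons, endpoint=False)
kappa = 2.0        # assumed tuning concentration (width)
peak_rate = 30.0   # assumed peak firing rate, spikes/s

def tuning(theta):
    """Mean firing rate of each neuron for motion direction theta (radians)."""
    return peak_rate * np.exp(kappa * (np.cos(theta - preferred) - 1.0))

def encode(theta, duration=1.0):
    """Encoding model: Poisson spike counts for a stimulus in direction theta."""
    return rng.poisson(tuning(theta) * duration)

def decode(counts):
    """Decoding model: population vector, preferred directions weighted by counts."""
    z = np.sum(counts * np.exp(1j * preferred))
    return np.angle(z) % (2.0 * np.pi)

true_dir = np.deg2rad(135.0)
counts = encode(true_dir)
est = decode(counts)
# Circular error between decoded and true direction, in degrees.
err = np.rad2deg(np.abs(np.angle(np.exp(1j * (est - true_dir)))))
print(f"decoded {np.rad2deg(est):.1f} deg (error {err:.1f} deg)")
```

With a reasonably dense population, the decoded direction lands close to the true one even from noisy single-trial counts, which is the sense in which a restricted tuning parameterization can still preserve decodable information.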

