<node id="689468">
  <nid>689468</nid>
  <type>event</type>
  <uid>
    <user id="27707"><![CDATA[27707]]></user>
  </uid>
  <created>1775478945</created>
  <changed>1775478979</changed>
  <title><![CDATA[PhD Defense by TUCKER LANCASTER]]></title>
  <body><![CDATA[<p>In partial fulfillment of the requirements for the degree of</p><h2>Doctor of Philosophy in Quantitative Biosciences</h2><p>In the</p><p><strong>School of Biological Sciences</strong></p><p>&nbsp;</p><p>TUCKER LANCASTER</p><p>&nbsp;</p><p>Will defend his dissertation</p><p>&nbsp;</p><p><strong>SYNTHETIC EYES ON NATURAL LIVES: MACHINE LEARNING APPROACHES TO BEHAVIORAL ANALYSIS IN ECOLOGICALLY VALID CICHLID ENVIRONMENTS</strong></p><p>&nbsp;</p><h2>WEDNESDAY, APRIL 15, 2026, 3:00 PM EDT<br>In-person location: CHOA Seminar Room in EBB Krone 1005<br>Zoom link:</h2><p><a href="https://gatech.zoom.us/j/96134265799?pwd=2HVbs8fx9mlcPzuQYt774PPH9hIuwT.1">https://gatech.zoom.us/j/96134265799?pwd=2HVbs8fx9mlcPzuQYt774PPH9hIuwT.1</a></p><p>&nbsp;</p><h2>Thesis Advisor:</h2><p>Patrick McGrath, Ph.D.</p><p>School of Biological Sciences</p><p>Georgia Institute of Technology</p><h2>&nbsp;</h2><h2>Committee Members:</h2><p>Jeffrey Todd Streelman, Ph.D.</p><p>School of Biological Sciences</p><p>Georgia Institute of Technology</p><p>&nbsp;</p><p>Gordon Berman, Ph.D.</p><p>Department of Biology</p><p>Emory University</p><p>&nbsp;</p><p>Eva Dyer, Ph.D.</p><p>Department of Bioengineering</p><p>University of Pennsylvania</p><p>&nbsp;</p><p>Anqi Wu, Ph.D.</p><p>College of Computing<br>Georgia Institute of Technology</p><p>&nbsp;</p><p>&nbsp;</p>
<p><strong>ABSTRACT:</strong> A central tension runs through computational ethology: constraining the experimental environment simplifies automated measurement but can degrade or remove the very behaviors that make measurement worthwhile. This dissertation develops six computational tools that navigate this tension for cichlid fishes of Lake Malawi, enabling automated behavioral measurement at the level of ecological validity each biological question demands. The tools are organized across three aims, progressing from constrained single-camera experiments to multi-camera 3D monitoring in large naturalistic aquaria.</p>
<p>The first two aims develop small-tank tools that enabled neurogenomic studies linking behavioral phenotypes from automated measurement to cell-type-specific gene expression. For sand-dwelling bower-builders, whose primary behaviorally relevant outcome is the physical structure the animal builds rather than its body configuration alone, the CichlidBowerTracking system combines depth sensing of sand surface topography with video-based action recognition at human-level classification accuracy, scaling to 33 simultaneous experiments on low-cost single-board computers. For rock-dwelling mbuna, whose courtship leaves no environmental trace, two complementary tools—SARTAB for edge-deployed real-time courtship detection and DemBA for identity-resolved post-hoc behavioral characterization—extended this neurogenomic paradigm to a second behavioral system. These tools are scalable and cost-effective but operate within simplifying assumptions: a single camera, a small arena, and at most two focal animals.</p>
<p>The third aim moves to 650-gallon community tanks housing mixed-sex groups monitored by multi-camera arrays, confronting the simplifications the earlier aims relied on: single-camera coverage, tractable identity among few animals, well-defined spatial regions of interest, and 2D treatment of an inherently 3D volume through a refractive medium. Three integrated tools share an explicit refractive projection model throughout, ensuring geometric consistency from calibration through to behavioral metrics. AquaCal provides refraction-aware multi-camera calibration with sub-millimeter accuracy. AquaMVS embeds the same refractive model into dense multi-view stereo for 3D environmental reconstruction of substrate topography. AquaPose combines detection, pose estimation, tracking, cross-view identity association, and refraction-aware triangulation to produce per-fish 3D midline trajectories, maintaining identity at a rate of 0.012 swaps per fish per 1,000 frames across 12 synchronized cameras. Because the refractive model reduces to standard pinhole geometry when refraction is absent, the algorithmic contributions that underpin AquaPose—stateless cross-view identity association via community detection, keypoint-based tracking, and 3D-first identity resolution—are in principle applicable to terrestrial multi-camera systems, where they address open challenges independent of refraction.</p>
<p>Together, these tools demonstrate that automated behavioral measurement can achieve the precision and scale that neurogenomic studies require without sacrificing the ecological validity the biology demands. With this measurement infrastructure in place, the constraint on the next generation of cichlid neurogenomic studies shifts from whether behavior can be quantified in naturalistic settings to which biological questions warrant the investment of doing so.</p>]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[SYNTHETIC EYES ON NATURAL LIVES: MACHINE LEARNING APPROACHES TO BEHAVIORAL ANALYSIS IN ECOLOGICALLY VALID CICHLID ENVIRONMENTS]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p><strong>SYNTHETIC EYES ON NATURAL LIVES: MACHINE LEARNING APPROACHES TO BEHAVIORAL ANALYSIS IN ECOLOGICALLY VALID CICHLID ENVIRONMENTS</strong></p>]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2026-04-15T15:00:00-04:00]]></value>
      <value2><![CDATA[2026-04-15T17:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[CHOA Seminar Room in EBB Krone 1005]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>221981</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[Graduate Studies]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1788</tid>
        <value><![CDATA[Other/Miscellaneous]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>100811</tid>
        <value><![CDATA[Phd Defense]]></value>
      </item>
      </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
