<node id="681149">
  <nid>681149</nid>
  <type>event</type>
  <uid>
    <user id="28475"><![CDATA[28475]]></user>
  </uid>
  <created>1741879186</created>
  <changed>1741879265</changed>
  <title><![CDATA[Ph.D. Dissertation Defense - Biswadeep Chakraborty]]></title>
  <body><![CDATA[<p><strong>Title:</strong> <em>Temporal Intelligence in Spiking Neural Networks: A New Framework for Learning and Adaptation</em></p><p><strong>Committee:</strong></p><p>Dr.&nbsp;Saibal Mukhopadhyay, ECE, Chair, Advisor</p><p>Dr.&nbsp;Suman Datta, ECE</p><p>Dr.&nbsp;Justin Romberg, ECE</p><p>Dr.&nbsp;Callie Hao, ECE</p><p>Dr.&nbsp;Celine Lin, CoC</p>]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Temporal Intelligence in Spiking Neural Networks: A New Framework for Learning and Adaptation]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>Spiking Neural Networks (SNNs) have been studied primarily as an energy-efficient alternative to deep learning due to their sparse, event-driven computation. However, this dissertation takes a fundamentally different perspective—reframing SNNs as a powerful computational paradigm capable of real-time adaptation, structured memory, and robust processing of dynamic, non-stationary inputs. A key limitation of conventional SNN models is their reliance on homogeneous neuron and synapse dynamics, which restricts their ability to capture multi-scale temporal dependencies and limits their computational expressivity. Drawing from dynamical systems theory, this work demonstrates that heterogeneity in neuronal and synaptic timescales significantly enhances an SNN’s ability to encode, retain, and process temporal information. To formalize this insight, I introduce Heterogeneous Recurrent Spiking Neural Networks (HRSNNs), which leverage diverse timescales to improve learning efficiency, stability, and generalization.</p><p>Beyond architectural innovations, this dissertation establishes a rigorous mathematical framework that bridges spiking computation with state-space models (SSMs), Lyapunov stability analysis, and topological learning representations, providing formal guarantees on stability, convergence, and adaptability. Furthermore, I propose task-agnostic pruning, a novel approach that optimizes SNN sparsity not through heuristics, but by preserving key dynamical properties, ensuring computational efficiency without sacrificing expressivity. Additional contributions, including Spiking Graph Neural Networks (SGNNs) and Spiking State-Space Models (S-SSMs), extend SNNs beyond temporal processing into structured and continuous domains. Through applications in unsupervised learning, time-series prediction, multi-agent interactions, and event-based perception, this work positions SNNs as more than just efficient models—it establishes them as a new frontier in adaptive, real-time intelligence.</p>]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2025-03-17T12:00:00-04:00]]></value>
      <value2><![CDATA[2025-03-17T14:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
  </field_extras>
  <field_audience>
    <item>
      <value><![CDATA[Public]]></value>
    </item>
  </field_audience>
  <field_media>
  </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[Room 3126, Klaus]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
      <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
  </links_related>
  <files>
  </files>
  <og_groups>
    <item>434381</item>
  </og_groups>
  <og_groups_both>
    <item><![CDATA[ECE Ph.D. Dissertation Defenses]]></item>
  </og_groups_both>
  <field_categories>
    <item>
      <tid>1788</tid>
      <value><![CDATA[Other/Miscellaneous]]></value>
    </item>
  </field_categories>
  <field_keywords>
    <item>
      <tid>100811</tid>
      <value><![CDATA[Ph.D. Defense]]></value>
    </item>
    <item>
      <tid>1808</tid>
      <value><![CDATA[graduate students]]></value>
    </item>
  </field_keywords>
  <field_userdata><![CDATA[]]></field_userdata>
</node>
