ISyE Seminar - Francois Baccelli

Event Details
  • Date/Time:
    • Tuesday, March 16, 2021
      11:00 am - 12:00 pm
  • Location: Virtual
  • URL: Virtual Link
  • Fee(s): N/A
Contact
No contact information submitted.
Summaries

Summary Sentence: Replica-mean-field limits for intensity-based neural networks

Full Summary: Abstract: Due to the inherent complexity of neural models, relating the spiking activity of a network to its structure requires simplifying assumptions, such as considering models in the thermodynamic mean-field limit. In this limit, an infinite number of neurons interact via vanishingly small interactions, thereby erasing the finite-size geometry of interactions. To better capture this geometry, we analyze the activity of neural networks in the replica-mean-field limit. Such models are made of infinitely many replicas which interact according to the same basic structure as that of the finite network of interest. Our main contribution is an analytical characterization of the stationary dynamics of intensity-based neural networks with spiking reset and heterogeneous excitatory synapses in this replica-mean-field limit. Specifically, we functionally characterize the stationary dynamics of these limit networks via ordinary or partial differential equations derived from the Poisson Hypothesis of queuing theory. We then reduce this functional characterization to a system of self-consistency equations specifying the stationary neuronal firing rates. Of general applicability, our approach combines the rate-conservation principle from point-process theory and analytical considerations from generating-function methods. Such limits can be used for first-order models, in which the elementary replica constituents are single neurons with independent Poisson inputs, and for second-order models, in which these constituents are pairs of neurons with exact pairwise interactions. In both cases, these replica-mean-field networks provide tractable versions that retain important features of the finite structure of interest. We validate our approach by demonstrating numerically that replica-mean-field models better capture the dynamics of feed-forward neural networks with large, sparse connections than their thermodynamic counterparts. 
We also illustrate the practical interest of this approach by analyzing some neuronal rate-transfer functions and by computing the correlation structure of certain pair-dominated network dynamics. Joint work with T. Taillefumier.
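To give a flavor of the "self-consistency equations for stationary firing rates" mentioned in the abstract, here is a minimal illustrative sketch, not the paper's actual replica-mean-field equations: for a toy intensity-based network with a rectified-linear rate-transfer function, the rate-balance system lambda = f(b + W lambda) can be solved by fixed-point iteration. The weights `W`, baseline rates `b`, and transfer function `f` below are all hypothetical choices made for illustration.

```python
import numpy as np

def stationary_rates(W, b, f, n_iter=1000, tol=1e-12):
    """Solve the toy self-consistency system lambda = f(b + W @ lambda)
    by fixed-point iteration. Converges when the excitatory gain is
    subcritical (spectral radius of the effective coupling below 1)."""
    lam = np.zeros_like(b, dtype=float)
    new = lam
    for _ in range(n_iter):
        new = f(b + W @ lam)           # one rate-balance update
        if np.max(np.abs(new - lam)) < tol:
            break
        lam = new
    return new

# Toy 3-neuron feed-forward chain: neuron 0 drives 1, which drives 2.
W = np.array([[0.0, 0.0, 0.0],
              [0.5, 0.0, 0.0],
              [0.0, 0.5, 0.0]])        # excitatory synaptic weights
b = np.array([1.0, 0.2, 0.1])          # baseline (external) drives
rates = stationary_rates(W, b, f=lambda x: np.maximum(x, 0.0))
# rates -> [1.0, 0.7, 0.45]: each rate is its drive plus half the
# upstream rate, consistent with lambda = b + W @ lambda here.
```

In the feed-forward case the iteration settles in a few steps, since rate information only propagates down the chain; recurrent networks would require the subcriticality condition noted in the docstring.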


Bio: F. Baccelli is Simons Math+ECE Chair at UT Austin and a part-time researcher at INRIA. His research directions are at the interface between applied mathematics and communications. He is co-author of research monographs on point processes and queues; max-plus algebras and network dynamics; stationary queuing networks; and stochastic geometry and wireless networks. He received the France Télécom Prize of the French Academy of Sciences in 2002, the ACM Sigmetrics Achievement Award in 2014, the 2014 Stephen O. Rice Prize, and the 2014 Leonard G. Abraham Prize of the IEEE Communications Theory Society. He is a member of the French Academy of Sciences.

Additional Information

In Campus Calendar
Yes
Groups

School of Industrial and Systems Engineering (ISYE)

Invited Audience
Faculty/Staff, Postdoc, Public, Graduate students, Undergraduate students
Categories
Seminar/Lecture/Colloquium
Keywords
No keywords were submitted.
Status
  • Created By: sbryantturner3
  • Workflow Status: Published
  • Created On: Mar 9, 2021 - 8:26am
  • Last Updated: Mar 9, 2021 - 8:28am