Study of finite-size exponential firing neural networks in the replica-mean-field limit

dc.contributor.advisor: Taillefumier, Thibaud
dc.contributor.advisor: Demkov, Alexander A.
dc.contributor.committeeMember: Morrison, Philip J.
dc.contributor.committeeMember: Marder, Michael P.
dc.contributor.committeeMember: Hazeltine, Richard D.
dc.creator: Yu, Luyan
dc.creator.orcid: 0000-0003-0953-6939
dc.date.accessioned: 2022-08-01T22:50:28Z
dc.date.available: 2022-08-01T22:50:28Z
dc.date.created: 2022-05
dc.date.issued: 2022-05-03
dc.date.submitted: May 2022
dc.date.updated: 2022-08-01T22:50:29Z
dc.description.abstract: Finite-size effects, which induce neural variability, metastability, and dynamical phase transitions, play significant roles in neuronal dynamics. However, because of complicated activity correlations, quantitative analysis of spiking neural networks typically requires simplifying mean-field approximations, which effectively wash out finite-size effects. In this dissertation, we address this challenge by studying spiking neural networks in the recently proposed replica-mean-field (RMF) limit. In this limit, networks consist of infinitely many replicas of the original finite-size network, with interactions randomized across replicas. This limit has rendered certain excitatory networks fully tractable, with explicit dependence on the finite size of the network constituents. We extend the RMF computational framework to a class of spiking neural networks with exponential stochastic intensities, which naturally accommodate mixed excitation and inhibition. Within this setting, we show that metastable finite-size networks admit multistable solutions in the RMF limit, and that these solutions are fully characterized by stationary firing rates. Technically, these firing rates are determined as the solutions of a set of delay differential equations, subject to regularity conditions that any physical solution must satisfy. We solve this problem by combining the resolvent formalism with singular-perturbation theory. Importantly, we find that these rates specify probabilistic pseudo-equilibria, which accurately capture the neural variability observed in the original finite-size network. We also discuss the emergence of metastability as a stochastic bifurcation, which can be interpreted as a static phase transition in the RMF limit. Furthermore, we show that the static picture of the RMF limit provides adequate information to infer dynamical features of metastable finite-size networks, such as the transition rates between pseudo-equilibria. (An illustrative simulation sketch of this class of networks appears after the metadata fields.)
dc.description.department: Physics
dc.format.mimetype: application/pdf
dc.identifier.uri: https://hdl.handle.net/2152/115041
dc.identifier.uri: http://dx.doi.org/10.26153/tsw/41944
dc.language.iso: en
dc.subject: Spiking neural network
dc.subject: Replica-mean-field limit
dc.subject: Stochastic neuronal model
dc.subject: Stochastic intensity
dc.subject: Point process
dc.subject: Finite-size effect
dc.subject: Neural variability
dc.subject: Metastability
dc.subject: Phase transition
dc.title: Study of finite-size exponential firing neural networks in the replica-mean-field limit
dc.type: Thesis
dc.type.material: text
thesis.degree.department: Physics
thesis.degree.discipline: Physics
thesis.degree.grantor: The University of Texas at Austin
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Philosophy
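
To make the abstract's construction concrete, here is a minimal Monte Carlo sketch in Python of the kind of network it describes: a small network whose neurons spike with exponential stochastic intensities, alongside a replica-mean-field (RMF) surrogate in which each interaction is delivered to an independently chosen replica. The specific dynamics (additive synaptic jumps, a reset to baseline after each spike), the parameter values, and the discrete-time scheme are illustrative assumptions, not the dissertation's equations; the sketch only shows how finite-size firing rates can be compared with their replica-averaged counterparts.

# Illustrative Monte Carlo sketch (hypothetical parameters): a small spiking
# network whose neurons fire with exponential stochastic intensity
# lambda_i(t) = exp(u_i(t)), and an RMF surrogate in which every interaction
# is routed to an independently drawn other replica. The post-spike reset,
# additive synaptic jumps, and all numbers below are assumptions made for
# illustration only.
import numpy as np

rng = np.random.default_rng(0)

K = 3                                     # neurons in the finite-size network
b = np.array([-1.0, -1.2, -0.8])          # baseline log-intensities (assumed)
W = np.array([[ 0.0,  0.6, -0.4],         # synaptic weights W[i, j] for j -> i,
              [ 0.5,  0.0,  0.7],         # mixing excitation and inhibition
              [-0.3,  0.4,  0.0]])
dt, T = 1e-3, 50.0                        # time step and horizon (seconds)
steps = int(T / dt)

def rates_finite():
    """Finite-size network: neuron i spikes with probability exp(u_i)*dt per step."""
    u = b.copy()
    counts = np.zeros(K)
    for _ in range(steps):
        spikes = rng.random(K) < np.exp(u) * dt
        counts += spikes
        u += W @ spikes                   # synaptic jumps from this step's spikes
        u[spikes] = b[spikes]             # assumed reset of the neurons that fired
    return counts / T

def rates_rmf(M=100):
    """RMF surrogate: M replicas; each j -> i interaction goes to a random replica."""
    u = np.tile(b, (M, 1))
    counts = np.zeros((M, K))
    for _ in range(steps):
        spikes = rng.random((M, K)) < np.exp(u) * dt
        counts += spikes
        for m, j in zip(*np.nonzero(spikes)):
            for i in range(K):            # deliver the j -> i interaction to an
                if W[i, j] == 0.0:        # independently drawn other replica
                    continue
                r = rng.integers(M - 1)
                r += r >= m               # uniform over the replicas other than m
                u[r, i] += W[i, j]
        u[spikes] = np.tile(b, (M, 1))[spikes]
    return counts.mean(axis=0) / T

print("finite-size stationary rates:", rates_finite())
print("RMF stationary rates        :", rates_rmf())

With many replicas, the empirical replica-averaged rates printed above play the role of the stationary firing rates that, according to the abstract, fully characterize the RMF solutions and can then be compared against the variability of the original finite-size network.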

Access full-text files

Original bundle

Name: YU-DISSERTATION-2022.pdf
Size: 15.38 MB
Format: Adobe Portable Document Format

License bundle

Name: PROQUEST_LICENSE.txt
Size: 4.45 KB
Format: Plain Text

Name: LICENSE.txt
Size: 1.84 KB
Format: Plain Text