Study of finite-size exponential firing neural networks in the replica-mean-field limit
Finite-size effects, which induce neural variability, metastability, and dynamical phase transitions, play significant roles in neuronal dynamics. However, due to complicated activity correlations, quantitative analysis of spiking neural networks typically requires simplifying mean-field approximations, which effectively erase finite-size structure. In this dissertation, we address this challenge by studying spiking neural networks in the recently proposed replica-mean-field (RMF) limit. In this limit, networks are made of infinitely many replicas of the original finite-size network, with randomized interactions across replicas. This limit has rendered certain excitatory networks fully tractable, with explicit dependence on the finite size of the network constituents. We extend the RMF computational framework to a class of spiking neural networks with exponential stochastic intensities, which naturally admit mixed excitation and inhibition. Within this setting, we show that metastable finite-size networks admit multistable solutions in the RMF limit, which are fully characterized by stationary firing rates. Technically, these firing rates are determined as the solutions of a set of delay differential equations under certain regularity conditions that any physical solution must satisfy. We solve this problem by combining the resolvent formalism and singular-perturbation theory. Importantly, we find that these rates specify probabilistic pseudo-equilibria, which accurately capture the neural variability observed in the original finite-size network. We also discuss the emergence of metastability as a stochastic bifurcation, which can be interpreted as a static phase transition in the RMF limit. Furthermore, we show that the static picture of the RMF limit provides adequate information to infer dynamical features of metastable finite-size networks, such as the transition rates between pseudo-equilibria.
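To make the setting concrete, the following is a minimal, purely illustrative sketch of a finite-size spiking network with exponential stochastic intensities, simulated by Euler discretization. All parameter values, the leaky-trace synaptic model, and variable names are assumptions for illustration only; they are not taken from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative finite-size network: neuron i fires with stochastic
# intensity lambda_i(t) = exp(b_i + sum_j W[i, j] * x_j(t)), where x_j
# is a leaky synaptic trace incremented by the spikes of neuron j.
K = 3                      # network size (finite)
T, dt = 50.0, 1e-3         # simulation horizon (s) and Euler time step (s)
tau = 0.02                 # synaptic decay time constant (s)
b = np.array([0.5, 0.3, 0.4])             # baseline log-intensities
W = np.array([[0.0,  0.8, -0.5],          # mixed excitatory (+) and
              [0.6,  0.0,  0.7],          # inhibitory (-) couplings
              [-0.4, 0.9,  0.0]])

x = np.zeros(K)            # synaptic traces
spikes = np.zeros(K)       # spike counters
for _ in range(int(T / dt)):
    lam = np.exp(b + W @ x)             # exponential stochastic intensities
    fired = rng.random(K) < lam * dt    # Bernoulli step approximating Poisson firing
    x += -x / tau * dt + fired          # leaky integration of incoming spikes
    spikes += fired

rates = spikes / T                      # empirical stationary firing rates
print("empirical firing rates (Hz):", rates)
```

In the RMF limit, each neuron would instead receive spikes from independently resampled replicas of its presynaptic partners, so that the stationary rates become the self-consistent unknowns described in the abstract.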