Abstract
The brain’s activity is characterized by the interaction of a very large number of neurons that are strongly affected by noise. However, signals often arise at macroscopic scales, integrating the effect of many neurons into a reliable pattern of activity. In order to study such large neuronal assemblies, one is often led to derive mean-field limits summarizing the effect of the interaction of a large number of neurons into an effective signal. Classical mean-field approaches consider the evolution of a deterministic variable, the mean activity, thus neglecting the stochastic nature of neural behavior. In this article, we build upon two recent approaches that include correlations and higher-order moments in mean-field equations, and study how these stochastic effects influence the solutions of the mean-field equations, both in the limit of an infinite number of neurons and for large yet finite networks. We introduce a new model, the infinite model, which arises from both equations by a rescaling of the variables; this rescaling is invertible for finite-size networks and hence provides equations equivalent to the previously derived models. The study of this model allows us to understand the qualitative behavior of such large-scale networks. We show that, although the solutions of the deterministic mean-field equation constitute uncorrelated solutions of the new mean-field equations, the stability properties of limit cycles are modified by the presence of correlations, and additional non-trivial behaviors, including periodic orbits, appear when there were none in the mean field. The origin of all these behaviors is then explored in finite-size networks, where interesting mesoscopic-scale effects appear. This study leads us to show that the infinite-size system appears as a singular limit of the network equations, and for any finite network, the system differs from the infinite system.











Notes
Note that the activation functions are different in the BCC and Bressloff models.
References
Abbott, L. F., & Van Vreeswijk, C. A. (1993). Asynchronous states in networks of pulse-coupled oscillators. Physical Review E, 48, 1483–1490.
Amari, S. (1972). Characteristics of random nets of analog neuron-like elements. IEEE Transactions on Systems, Man and Cybernetics, SMC-2, 643–657.
Amari, S.-I. (1977). Dynamics of pattern formation in lateral-inhibition type neural fields. Biological Cybernetics, 27, 77–87.
Amit, D. J., & Brunel, N. (1997). Model of global spontaneous activity and local structured delay activity during delay periods in the cerebral cortex. Cerebral Cortex, 7, 237–252.
Arnold, V.I. (1981). Ordinary differential equations (Chap. 5). MIT Press.
Arnold, L. (1995). Random dynamical systems. Springer.
Beggs, J. M., & Plenz, D. (2004). Neuronal avalanches are diverse and precise activity patterns that are stable for many hours in cortical slice cultures. Journal of Neuroscience, 24, 5216–5229.
Benayoun, M., Cowan, J. D., van Drongelen, W., & Wallace, E. (2010). Avalanches in a stochastic model of spiking neurons. PLoS Computational Biology, 6(7), e1000846. doi:10.1371/journal.pcbi.1000846.
Boland, R. P., Galla, T., & McKane, A. J. (2008). How limit cycles and quasi-cycles are related in systems with intrinsic noise. Journal of Statistical Mechanics: Theory and Experiment, 9, P09001.
Bressloff, P. C. (2010). Stochastic neural field theory and the system-size expansion. SIAM Journal on Applied Mathematics, 70, 1488–1521.
Bressloff, P. C. (2010). Metastable states and quasicycles in a stochastic Wilson–Cowan model. Physical Review E, 82, 051903.
Brewer, J. (1978). Kronecker products and matrix calculus in system theory. IEEE Transactions on Circuits and Systems, CAS-25.
Brunel, N. (2000). Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons. Journal of Computational Neuroscience, 8, 183–208.
Brunel, N., & Hakim, V. (1999). Fast global oscillations in networks of integrate-and-fire neurons with low firing rates. Neural Computation, 11, 1621–1671.
Brunel, N., & Latham, P. (2003). Firing rate of noisy quadratic integrate-and-fire neurons. Neural Computation, 15, 2281–2306.
Buice, M. A., & Cowan, J. D. (2007). Field theoretic approach to fluctuation effects for neural networks. Physical Review E, 75, 051919.
Buice, M. A., Cowan, J. D., & Chow, C. C. (2010). Systematic fluctuation expansion for neural network activity equations. Neural Computation, 22, 377–426.
Cai, D., Tao, L., Shelley, M., & McLaughlin, D. W. (2004). An effective kinetic representation of fluctuation-driven neuronal networks with application to simple and complex cells in visual cortex. Proceedings of the National Academy of Sciences, 101, 7757–7762.
Coombes, S., & Owen, M. R. (2005). Bumps, breathers, and waves in a neural network with spike frequency adaptation. Physical Review Letters, 94, 148102-1–148102-4.
Doob, J. L. (1945). Markoff chains–denumerable case. Transactions of the American Mathematical Society, 58, 455–473.
Dykman, M. I., Mori, E., Ross, J., & Hunt, P. M. (1994). Large fluctuations and optimal paths in chemical kinetics. Journal of Chemical Physics, 100, 5735–5750.
El Boustani, S., & Destexhe, A. (2009). A master equation formalism for macroscopic modeling of asynchronous irregular activity states. Neural Computation, 21, 46–100.
El Boustani, S., & Destexhe, A. (2010). Brain dynamics at multiple scales: can one reconcile the apparent low-dimensional chaos of macroscopic variables with the seemingly stochastic behavior of single neurons? International Journal of Bifurcation and Chaos, 20, 1–16.
Ermentrout, B. (1998). Neural networks as spatio-temporal pattern-forming systems. Reports on Progress in Physics, 61, 353–430.
Ermentrout, B. (2002). Simulating, analyzing, and animating dynamical systems: A guide to XPPAUT for researchers and students. Society for Industrial Mathematics.
Ermentrout, G. B., & Cowan, J.D. (1979). A mathematical theory of visual hallucination patterns. Biological Cybernetics, 34, 137–150.
Faugeras, O., Touboul, J., & Cessac, B. (2009). A constructive mean field analysis of multi population neural networks with random synaptic weights and stochastic inputs. Frontiers in Neuroscience, 3, 1. doi:10.3389/neuro.10.001.2009.
Freidlin, M. I., & Wentzell, A. D. (1998). Random perturbations of dynamical systems. Springer Verlag.
Gaspard, P. (2002). Correlation time of mesoscopic chemical clocks. Journal of Chemical Physics, 117, 8905–8916.
Gillespie, D. T. (1976). A general method for numerically simulating the stochastic time evolution of coupled chemical reactions. Journal of Computational Physics, 22, 403–434.
Gillespie, D. T. (1977). Exact stochastic simulation of coupled chemical reactions. The Journal of Physical Chemistry, 81, 2340–2361.
Guckenheimer, J., & Holmes, P. J. (1983). Nonlinear oscillations, dynamical systems and bifurcations of vector fields. Applied mathematical sciences (Vol. 42). Springer.
Horsthemke, W., & Lefever, R. (2006). Noise-induced transitions. Springer.
Kurtz, T. G. (1976). Limit theorems and diffusion approximations for density dependent Markov chains. Mathematical Programming Studies, 5, 67.
Kuznetsov, Y. A. (1998). Elements of applied bifurcation theory. Applied Mathematical Sciences (2nd ed.). Springer.
Laing, C. R., Troy, W. C., Gutkin, B., & Ermentrout, G. B. (2002). Multiple bumps in a neuronal model of working memory. SIAM Journal on Applied Mathematics, 63, 62–97.
Levina, A., Herrmann, J. M., & Geisel, T. (2009). Phase transitions towards criticality in a neural system with adaptive interactions. Physical Review Letters, 102, 118110-1–118110-4.
Ly, C., & Tranchina, D. (2007). Critical analysis of dimension reduction by a moment closure method in a population density approach to neural network modeling. Neural Computation, 19, 2032–2092.
McKane, A. J., Nagy, J. D., Newman, T. J., & Stefanini, M. O. (2007). Amplified biochemical oscillations in cellular systems. Journal of Statistical Physics, 71, 165.
Mattia, M., & Del Giudice, P. (2002). Population dynamics of interacting spiking neurons. Physical Review E, 66, 51917.
Neudecker, H. (1969). Some theorems on matrix differentiation with special reference to Kronecker matrix products. Journal of the American Statistical Association, 64, 953–963.
Ohira, T., & Cowan, J. D. (1993). Master-equation approach to stochastic neurodynamics. Physical Review E, 48(3), 2259–2266.
Plesser, H. E. (1999). Aspects of signal processing in noisy neurons. PhD thesis, Georg-August-Universität.
Rodriguez, R., & Tuckwell, H. C. (1996). A dynamical system for the approximate moments of nonlinear stochastic models of spiking neurons and networks. Mathematical and Computer Modelling, 31, 175–180.
Rodriguez, R., & Tuckwell, H. C. (1998). Noisy spiking neurons and networks: Useful approximations for firing probabilities and global behavior. Biosystems, 48, 187–194.
Rolls, E. T., & Deco, G. (2010). The noisy brain: Stochastic dynamics as a principle of brain function. Oxford University Press.
Softky, W. R., & Koch, C. (1993). The highly irregular firing of cortical cells is inconsistent with temporal integration of random EPSPs. Journal of Neuroscience, 13, 334–350.
Teramae, J., Nakao, H., & Ermentrout, G. B. (2009). Stochastic phase reduction for a general class of noisy limit cycle oscillators. Physical Review Letters, 102, 194102.
Touboul, J., & Destexhe, A. (2010). Can power-law scaling and neuronal avalanches arise from stochastic dynamics? PLoS ONE, 5, e8982.
Touboul, J., & Faugeras, O. (2007). The spikes trains probability distributions: a stochastic calculus approach. Journal of Physiology-Paris, 101(1–3), 78–98.
Touboul, J., & Faugeras, O. (2008). First hitting time of double integral processes to curved boundaries. Advances in Applied Probability, 40, 501–528.
Wainrib, G. (2010). Randomness in neurons: A multiscale probabilistic analysis. PhD thesis, Ecole Polytechnique.
Wilson, H. R., & Cowan, J. D. (1972). Excitatory and inhibitory interactions in localized populations of model neurons. Biophysical Journal, 12, 1–24.
Wilson, H. R., & Cowan, J. D. (1973). A mathematical theory of the functional dynamics of cortical and thalamic nervous tissue. Biological Cybernetics, 13, 55–80.
Acknowledgement
This work was supported by NSF DMS 0817131.
Additional information
Action Editor: Nicolas Brunel
Appendices
A An alternative derivation of the moment equations
In this appendix we show that the moment equations corresponding to the Markov chain governed by Eq. (2) can be derived from a Rodriguez–Tuckwell moment expansion of the Langevin approximation of the Markov chain. Following the approach of Kurtz (1976), we know that the dynamics of our rescaled variables \(p_i = n_i/N_i\), which form a Markov chain in the initial setting, approach, as N goes to infinity, a continuous diffusion process \((X_t)_{t\geq 0}\) satisfying the equation:
where \(S_i(t,X_t)=\sum_{j=1}^N w_{ij}X^j_t + I_i(t)\).
Following the works of Rodriguez and Tuckwell (1996, 1998), we derive from this equation the dynamical system governing the approximate moments of X. Denoting by \(m_j\) the mean value of \(X^j\) and by \(C_{ij}\) the correlation between \(X^i\) and \(X^j\), a direct application of the Rodriguez–Tuckwell formula to our particular form of dynamical system yields:
where \(s_i=\sum_{j=1}^N w_{ij}m^j_t + I_i(t)\). These equations therefore appear as a perturbation of the BCC equations with an additional nonlinear term of order two in the dynamics. Truncation at order 1 in the small parameter 1/N yields exactly the BCC equations. Moreover, it is interesting to note that these equations again correspond, in the limit N→∞, to the infinite model described in Section 3.
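As an illustration, the sketch below integrates moment equations of this type numerically. It is only a schematic example under stated assumptions: the drift is taken to be \(F_i(x) = -\alpha x_i + f(s_i)\) with a sigmoidal f, and the diffusion is assumed diagonal with entries \((\alpha m_i + f(s_i))/N\), a plausible form for a birth–death process of this kind; the exact coefficients are those determined by Eq. (2). The network parameters and helper names (rhs, W, I_ext) are purely illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical network parameters (for illustration only).
M, N = 2, 1000                      # populations, neurons per population
alpha = 1.0                         # decay rate
W = np.array([[2.0, -1.5],
              [1.0, -0.5]])         # assumed synaptic weights w_ij
I_ext = np.array([0.5, 0.3])        # assumed external inputs

f   = lambda s: 1.0 / (1.0 + np.exp(-s))      # sigmoidal rate function
fp  = lambda s: f(s) * (1.0 - f(s))           # f'
fpp = lambda s: fp(s) * (1.0 - 2.0 * f(s))    # f''

def rhs(t, y):
    """Rodriguez-Tuckwell truncated moment equations for the assumed
    drift F_i(x) = -alpha*x_i + f(sum_j w_ij x_j + I_i)."""
    m = y[:M]
    C = y[M:].reshape(M, M)
    s = W @ m + I_ext
    # mean equation with second-order correction (1/2) f''(s_i) (W C W^T)_ii
    corr = 0.5 * fpp(s) * np.einsum('ij,jk,ik->i', W, C, W)
    dm = -alpha * m + f(s) + corr
    # covariance equation dC = J C + C J^T + D, with an assumed diagonal
    # diffusion D_ii = (alpha*m_i + f(s_i)) / N (exact form set by Eq. (2))
    J = -alpha * np.eye(M) + fp(s)[:, None] * W
    D = np.diag((alpha * m + f(s)) / N)
    dC = J @ C + C @ J.T + D
    return np.concatenate([dm, dC.ravel()])

y0 = np.concatenate([np.full(M, 0.1), np.zeros(M * M)])
sol = solve_ivp(rhs, (0.0, 50.0), y0, max_step=0.1)
print("final means:", sol.y[:M, -1])
print("final covariances:\n", sol.y[M:, -1].reshape(M, M))
```

Dropping the second-order correction term corr recovers the BCC-type equations, which is the truncation at order 1 in 1/N mentioned above.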
B Kronecker algebra: some useful properties
In this appendix, we review and prove some useful properties of Kronecker products of matrices. We recall the definition of the function Vect transforming an M × N matrix into an MN-dimensional column vector, as defined in Neudecker (1969):
Let us now denote by ⊗ the Kronecker product, defined for \(A \in \mathbb{R}^{m\times n}\) and \(B \in \mathbb{R}^{r\times s}\) as the \((mr)\times(ns)\) matrix:
For standard definitions and identities related to Kronecker products, the reader is referred to Brewer (1978). We recalled in the main text the following identities, where A, B, D, G, X ∈ \(\mathbb{R}^{M\times M}\), \(I_M\) is the M × M identity matrix, and A · B or AB denotes the standard matrix product:
The operation ⊕ is called the Kronecker sum.
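The identities themselves are recalled in the main text; as a quick numerical sanity check, the sketch below verifies the standard mixed-product, vectorization and Kronecker-sum identities from Brewer (1978), assuming the column-stacking convention for Vect (Neudecker 1969):

```python
import numpy as np

rng = np.random.default_rng(0)
M = 3
A, B, D, G, X = (rng.standard_normal((M, M)) for _ in range(5))
I_M = np.eye(M)

# column-stacking Vect, as in Neudecker (1969)
vect = lambda Y: Y.reshape(-1, 1, order='F')

# mixed-product property: (A ⊗ B)·(D ⊗ G) = (A·D) ⊗ (B·G)
assert np.allclose(np.kron(A, B) @ np.kron(D, G), np.kron(A @ D, B @ G))

# vectorization identity: Vect(A·X·B) = (B^T ⊗ A)·Vect(X)
assert np.allclose(vect(A @ X @ B), np.kron(B.T, A) @ vect(X))

# Kronecker sum: A ⊕ B = A ⊗ I_M + I_M ⊗ B
A_oplus_B = np.kron(A, I_M) + np.kron(I_M, B)
print("identities verified; A ⊕ B has shape", A_oplus_B.shape)
```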
Proposition 2
Let A and B be in \(\mathbb{R}^{M\times M}\), and assume that A has the eigenvalues \(\{\lambda_i;\ i = 1,\ldots, M\}\) and B the eigenvalues \(\{\mu_i;\ i = 1,\ldots, M\}\). Then we have:
-
A ⊗ B has the eigenvalues \(\{\lambda_i\mu_j;\;(i,j)\in \{1,\ldots,M\}^2\}\) .
-
A ⊕ B has the eigenvalues \(\{\lambda_i+\mu_j;\;(i,j)\in \{1,\ldots,M\}^2\}\) .
Proof
Let \(\lambda_i\) (resp. \(\mu_j\)) be an eigenvalue of A (resp. B) with eigenvector u (resp. v), and define the matrix \(z = u\cdot v^T\) (i.e. \(z_{ij} = u_i v_j\)). We have:
which entails that Vect(z) is an eigenvector of A ⊕ B associated with the eigenvalue \(\lambda_i + \mu_j\). Similarly, we have for \(z = v \cdot u^T\):
The dimension of A ⊕ B and A ⊗ B is \(M^2\), and there are exactly \(M^2\) linearly independent matrices z that can be built in the proposed fashion; therefore we have identified all the eigenvalues. □
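A short numerical illustration of Proposition 2 (a sketch only; the two spectra are compared as sorted complex sequences up to numerical tolerance):

```python
import numpy as np

rng = np.random.default_rng(1)
M = 3
A = rng.standard_normal((M, M))
B = rng.standard_normal((M, M))
I_M = np.eye(M)

lam = np.linalg.eigvals(A)   # eigenvalues λ_i of A
mu = np.linalg.eigvals(B)    # eigenvalues μ_j of B

# expected spectra {λ_i μ_j} and {λ_i + μ_j}
prod_expected = np.array([l * m for l in lam for m in mu])
sum_expected = np.array([l + m for l in lam for m in mu])

kron_prod = np.kron(A, B)                        # A ⊗ B
kron_sum = np.kron(A, I_M) + np.kron(I_M, B)     # A ⊕ B

assert np.allclose(np.sort_complex(np.linalg.eigvals(kron_prod)),
                   np.sort_complex(prod_expected), atol=1e-7)
assert np.allclose(np.sort_complex(np.linalg.eigvals(kron_sum)),
                   np.sort_complex(sum_expected), atol=1e-7)
print("spectra of A ⊗ B and A ⊕ B match {λ_i μ_j} and {λ_i + μ_j}")
```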
Theorem 3
Let Φ(t) be the solution of the matrix differential equation:
and Ψ(t) the solution of:
We have Ψ(t) = Φ(t) ⊗ Φ(t).
Proof
Indeed, using the basic properties of the Kronecker product recalled in Eq. (8), we have:
Φ(t) ⊗ Φ(t) therefore satisfies the same differential equation as Ψ(t); moreover, \(\Phi(0)\otimes \Phi(0)=I_M\otimes I_M=I_{M^2}\), and hence, by existence and uniqueness of the resolvent, Ψ(t) = Φ(t) ⊗ Φ(t). □
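A direct numerical check of Theorem 3 in the simplest setting, a constant coefficient matrix A for which the resolvents are matrix exponentials, is sketched below (the two equations referred to above are assumed to be \(\dot\Phi = A\Phi\), \(\Phi(0)=I_M\), and \(\dot\Psi = (A\oplus A)\Psi\), \(\Psi(0)=I_{M^2}\)):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
M, t = 3, 0.7
A = rng.standard_normal((M, M))
I_M = np.eye(M)

# resolvent of dPhi/dt = A·Phi with Phi(0) = I_M (constant A assumed)
Phi = expm(A * t)
# resolvent of dPsi/dt = (A ⊕ A)·Psi with Psi(0) = I_{M^2}
Psi = expm((np.kron(A, I_M) + np.kron(I_M, A)) * t)

assert np.allclose(Psi, np.kron(Phi, Phi), atol=1e-9)
print("Psi(t) = Phi(t) ⊗ Phi(t) verified for constant A")
```

For constant A the identity follows because A ⊗ I_M and I_M ⊗ A commute, so the exponential of their sum factorizes; the theorem covers the general time-dependent case.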
C Genericity of the Hopf bifurcation found
In this appendix we derive the expression of the first Lyapunov exponent of the bifurcation, which proves the genericity of the Hopf bifurcation exhibited in Section 4.1.2. In that section, we derived the expression of the Jacobian matrix at the considered fixed point:
At the bifurcation point, we have \(-\alpha^* + w f'(s) = 0\), and therefore at this point the Jacobian matrix reads:
The eigenvalues of this matrix under the assumptions of Section 4.1.2 are \(\pm i\omega_0\), where \(\omega_0=\sqrt{-f^{\prime}(I)f^{\prime\prime}(I)w^3/N}\). We define q as the right eigenvector of \(J_0\) associated with \(i\omega_0\):
and p as the right eigenvector of \(J_0^T\) associated with the eigenvalue \(i\omega_0\):
For the sake of simplicity, we also name the components of the vector field of the system:
Following Kuznetsov (1998), we define \(B(\binom{x_1}{y_1}, \binom{x_2}{y_2})\) and \(C(\binom{x_1}{y_1}, \binom{x_2}{y_2},\binom{x_3}{y_3})\) as the second and third derivatives of the vector field, which are bi- and tri-linear forms, respectively. We have the following expressions for these multilinear functions:
We are now in a position to compute the first Lyapunov exponent \(l_1(0)\) using the formula:
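The invariant expression from Kuznetsov (1998) for an n-dimensional system reads as follows (reproduced here as a reference sketch, writing A for the Jacobian \(J_0\), with \(Aq = i\omega_0 q\), \(A^T p = -i\omega_0 p\) in Kuznetsov's convention, and assuming the normalization \(\langle p, q\rangle = 1\)):

\[
l_1(0) = \frac{1}{2\omega_0}\,\mathrm{Re}\Big[\langle p, C(q,q,\bar q)\rangle
 - 2\,\langle p, B\big(q, A^{-1}B(q,\bar q)\big)\rangle
 + \langle p, B\big(\bar q, (2i\omega_0 I_n - A)^{-1}B(q,q)\big)\rangle\Big],
\]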
where \(\langle x, y \rangle\) denotes the complex inner product \(\overline{x}^T\cdot y\), and the three terms denoted \(\mathcal{A}\), \(\mathcal{B}\) and \(\mathcal{C}\) are the real parts of the terms involved in the expression of the Lyapunov exponent. After straightforward but tedious calculations (which can be conveniently performed using a symbolic computation tool such as Maple), we obtain:
which yields the expression for the Lyapunov exponent:
D Finite-size effects in the BCC two-population model I
In this appendix, we study the two-population BCC system corresponding to Model I and show that the finite-size effects are closely related to what is observed in the Bressloff model as studied in Section 4.2. The BCC finite-size equations read:
where we denoted:
Similarly to the Bressloff case, the BCC model features two families of limit cycles (see Fig. 12). One of these branches corresponds exactly to the branch of limit cycles of the WC system, starting from a Hopf bifurcation and disappearing through a homoclinic bifurcation. Two additional Hopf bifurcations appear, related to the family of periodic orbits corresponding to the correlation-induced cycle evidenced in the analysis of the infinite-size system. This branch of limit cycles exists whatever n, and loses stability through a Neimark–Sacker bifurcation as the number of neurons increases. This bifurcation generates chaos for large enough networks, which clearly does not exist in the WC system.
Bifurcation diagram for the BCC system. Blue lines represent the equilibria, pink lines the extremal values of the cycles in the system. Bifurcations of equilibria are denoted with a red star; LP represents a saddle-node bifurcation (limit point), H a Hopf bifurcation. The four Hopf bifurcations share two families of limit cycles. The branch corresponding to the smaller values of \(i_1\) undergoes two folds of limit cycles, and the other branch of limit cycles a Neimark–Sacker (torus) bifurcation
This family of limit cycles is stable for networks containing between 100 and 17,000 neurons, corresponding to a finite-size effect that appears only in the range of sizes relevant to cortical columns. As the number of neurons increases, the system keeps the same number and type of bifurcations, and the infinite model appears to be a singular case where different bifurcations meet (see Fig. 13); very large networks lose the stable cycle and present chaotic behavior.
Cite this article
Touboul, J.D., Ermentrout, G.B. Finite-size and correlation-induced effects in mean-field dynamics. J Comput Neurosci 31, 453–484 (2011). https://doi.org/10.1007/s10827-011-0320-5