{"title": "The Doubly Balanced Network of Spiking Neurons: A Memory Model with High Capacity", "book": "Advances in Neural Information Processing Systems", "page_first": 1247, "page_last": 1254, "abstract": "", "full_text": "The doubly balanced network of spiking neurons: a memory model with high capacity \n\nYuval Aviel* \nInterdisciplinary Center for Neural Computation \nHebrew University \nJerusalem, Israel 91904 \naviel@cc.huji.ac.il \n\nDavid Horn \nSchool of Physics \nTel Aviv University \nTel Aviv, Israel 69978 \nhorn@post.tau.ac.il \n\nMoshe Abeles \nInterdisciplinary Center for Neural Computation \nHebrew University \nJerusalem, Israel 91904 \nabeles@vms.huji.ac.il \n\nAbstract \n\nA balanced network leads to contradictory constraints on memory models, as exemplified in previous work on accommodation of synfire chains. Here we show that these constraints can be overcome by introducing a 'shadow' inhibitory pattern for each excitatory pattern of the model. This is interpreted as a double-balance principle, whereby there exists both global balance between average excitatory and inhibitory currents and local balance between the currents carrying coherent activity at any given time frame. This principle can be applied to networks with Hebbian cell assemblies, leading to a high capacity of the associative memory. The number of possible patterns is limited by a combinatorial constraint that turns out to be P=0.06N within the specific model that we employ. This limit is reached by the Hebbian cell assembly network. To the best of our knowledge this is the first time that such high memory capacities are demonstrated in the asynchronous state of models of spiking neurons. \n\n1 Introduction \n\nNumerous studies analyze the different phases of unstructured networks of spiking neurons [1, 2]. 
These networks with random connectivity possess a phase of asynchronous activity, the asynchronous state (AS), which is the most interesting one from the biological perspective, since it is similar to physiological data. Unstructured networks, however, do not hold information in their connectivity matrix, and therefore do not store memories. \n\nBinary networks with ordered connectivity matrices, or structured networks, and their ability to store and retrieve memories, have been extensively studied in the past [3-8]. The applicability of these results to biologically plausible neuronal models is questionable. In particular, models of spiking neurons are known to have modes of synchronous global oscillations. Avoiding such modes, and staying in an AS, is a major constraint on networks of spiking neurons that is absent in most binary neural networks. As we will show below, it is this constraint that imposes a limit on capacity in our model. Existing associative memory models of spiking neurons have not strived for maximal pattern capacity [3, 4, 8]. \n\nHere, using an integrate-and-fire model, we embed structured synaptic connections in an otherwise unstructured network and study the capacity limit of the system. The system is therefore macroscopically unstructured, but microscopically structured. The unstructured network model is based on Brunel's [1] balanced network of integrate-and-fire neurons. In his model the network possesses different phases, one of which is the AS. We replace his unstructured excitatory connectivity by a semi-structured one, including a superposition of either synfire chains or Hebbian cell assemblies. \n\nThe existence of a stable AS is a fundamental prerequisite of the system. There are two reasons for that. First, physiological measurements of cortical tissues reveal irregular neuronal activity and asynchronous population activity. These findings match the properties of the AS. 
Second, in terms of information content, the entropy of the system is highest when the firing probability is uniformly distributed, as in an AS. In general, embedding one or two patterns will not destabilize the AS. Increasing the number of embedded patterns, however, will eventually destabilize the AS, leading to global oscillations. \n\nIn previous work [9] we have demonstrated that the cause of AS instability is correlations between neurons that result from the presence of structure in the network. The patterns, be they Hebbian cell assemblies (HCA) or pools occurring in synfire chains (SFC), have an important characteristic: neurons that are members of the same pattern (or pool) share a large portion of their inputs. This common input correlates neuronal activities both when a pattern is activated and when both neurons are influenced by random activity. If too many patterns are embedded in the network, too many neurons become correlated due to common inputs, leading to globally synchronized deviations from mean activity. \n\nA qualitative understanding of this state of affairs is provided by a simple model of a threshold-linear pair of neurons that receive n excitatory common, and correlated, inputs, and K-n excitatory, as well as K inhibitory, non-common uncorrelated inputs. Thinking of these neurons as belonging to a pattern or a pool within a network, we can obtain an interesting self-consistent result by assuming the correlation of the pair of neurons to be also the correlation in their common correlated input (as is likely to be the case in a network loaded with HCA or SFC). We find then [9] that there exists a critical pattern size, n_c, below which correlations decay but above which correlations are amplified. Furthermore, the following scaling was found to hold: \n\n(1)  n_c = r_c √K. 
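The square-root scaling of Eq. (1) can be illustrated with a minimal numerical sketch. Note that the critical ratio r_c is model dependent; the value r_c = 0.3 below is an arbitrary illustrative stand-in, not a value from this paper:

```python
import math

# Eq. (1): n_c = r_c * sqrt(K). The critical ratio r_c is model dependent;
# r_c = 0.3 here is an illustrative assumption, not a value from the text.
R_C = 0.3

def critical_pattern_size(K, r_c=R_C):
    """Patterns larger than n_c amplify pairwise correlations; smaller ones let them decay."""
    return r_c * math.sqrt(K)

# Quadrupling the number of inputs K only doubles the critical size n_c.
for K in (1_000, 4_000, 16_000):
    print(f"K={K:>6}  n_c={critical_pattern_size(K):.1f}")
```

The slow sqrt(K) growth of n_c, combined with the requirement n > n_min discussed next, is what forces unaided balanced networks to be very large.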
\n\nImplications of this model for the whole network are that: (i) r_c is independent of N, the size of the network; (ii) below n_c the AS is stable; and (iii) above n_c the AS is unstable. \n\nUsing extensive computer simulations we were able [9] to validate all these predictions. In addition, n must be kept above n_min, defined by the requirement that n excitatory post-synaptic potentials (PSPs), on average, drive a neuron across its threshold. Since N>K, and typically N>>K, together with Eq. (1) it follows that N >> (n_min/r_c)^2. Hence r_c and n_min set the lower bound on the network's size, above which it is possible to embed a reasonable number of patterns in the network without losing the AS. In this paper we propose a solution that enables small n_min and large r_c values, which in turn enables embedding a large number of patterns in much smaller networks. This is made possible by the doubly-balanced construction to be outlined below. \n\n2 The double-balance principle \n\nCounteracting the excitatory correlations with inhibitory ones is the principle that will allow us to solve the problem. Since we deal with balanced networks, in which the mean excitatory input is balanced by an inhibitory one, we note that this principle imposes a second type of balancing condition; hence we refer to it as the double-balance principle. \n\nIn the following we apply this principle by introducing synaptic connections between any excitatory pattern and its randomly chosen inhibitory pattern. These inhibitory patterns, which we call shadow patterns, are activated after the excitatory patterns fire, but have no special in-pattern connectivity or structured projections onto other patterns. The premise is that correlations evolved in the excitatory patterns will elicit correlated inhibitory activity, thus balancing the network's average correlation level. 
The size of the shadow pattern has to be small enough, so that the global network activity will not be quenched, yet large enough, so that the excitatory correlations will be counteracted. A balanced network that is embedded with patterns and their shadow patterns will be referred to as a doubly balanced network (DBN), to be contrasted with the singly balanced network (SBN), where shadow patterns are absent. \n\n3 Application of the double-balance principle \n\n3.1 The Network \n\nWe model neuronal activity with the integrate-and-fire model [10]. All neurons have the same parameters: τ=10ms, τ_ref=2.5ms, C=250pF. PSPs are modeled by a delta function with fixed delay. The number of synapses on a neuron is fixed and set to K_E excitatory synapses from the local network, K_E excitatory synapses from external sources and K_I inhibitory synapses from the local network; see Aviel et al. [9] for details. All synapses of each group are given fixed values. One pre-synaptic neuron is allowed to make more than one connection onto the same post-synaptic neuron. The network possesses N_E excitatory neurons and N_I ≡ γN_E inhibitory neurons. Connectivity is sparse: K_E = εN_E and K_I = εN_I (we use ε=0.1). A Poisson process with rate v_ext=10Hz models the external source. If a neuron of population y innervates a neuron of population x, its synaptic strength J_xy is defined as \n\nJ_xE ≡ J_0/√K_E,  J_xI ≡ -gJ_0/√K_I, \n\nwith J_0=10 and g=5. Note that J_xI/J_xE = -g/√γ, hence g controls the balance between the two populations. \n\nWithin an HCA pattern the neurons have a high connection probability with one another. Here this is achieved by requiring L of the synapses of a neuron in the excitatory pattern to originate from within the pattern. 
Similarly, a neuron in the inhibitory shadow pattern dedicates L of its synapses to the associated excitatory pattern. In a SFC, each neuron in an excitatory pool is fed by L neurons from the previous pool. This forms a feed-forward connectivity. In addition, when shadow pools are present, each neuron in a shadow pool is fed by L neurons from its associated excitatory pool. \n\nIn both cases L = C_L √K_E, with C_L=2.5. The size of the excitatory patterns (i.e. the number of neurons participating in a pattern) or pools, n_E, is also chosen to be proportional to √K_E: n_E ≡ C_n √K_E, where C_n varies (see Aviel et al. 2003 [9]). This is a suitable choice because of the behavior of the critical n_c of Eq. (1), and it is needed for the meaningful memory activity (of the HCA or SFC) to overcome synaptic noise. \n\nThe size of a shadow pattern is defined as n_I ≡ d̃ n_E. This leads to the factor d, representing the relative strength of inhibitory and excitatory currents, due to a pattern or pool, affecting a neuron that is connected to both: \n\n(2)  d ≡ -J_xI n_I / (J_xE n_E) = g √(K_E/K_I) d̃ = g d̃/√γ. \n\nThus it fixes n_I = (√γ d/g) n_E. In the simulations reported below d varied between 1 and 3. \n\nWiring the network is done in two stages: first all excitatory patterns are wired, and then random connections are added, complying with the fixed number of synapses. A volley of w spikes, normally distributed over time with a width of 1ms, is used to ignite a memory pattern. In the case of SFC the first pool is ignited, and under the right conditions the volley propagates along the chain without fading away and without destabilizing the AS. 
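The bookkeeping of Sec. 3.1 and Eq. (2) can be made concrete with a short sketch. The ratio γ = N_I/N_E is not stated explicitly in this excerpt, so γ = 0.25 (Brunel's standard ratio) is assumed here; the sketch checks that a shadow pattern of size n_I = (√γ d/g) n_E indeed yields the pattern-wise balance factor d:

```python
import math

# Parameters of Sec. 3.1; gamma = N_I/N_E = 0.25 is an ASSUMED value
# (it is not given in this text excerpt).
J0, g, eps, gamma = 10.0, 5.0, 0.1, 0.25
N_E = 15_000
K_E = eps * N_E                    # local excitatory synapses per neuron
K_I = eps * gamma * N_E            # inhibitory synapses per neuron
J_xE = J0 / math.sqrt(K_E)         # excitatory efficacy
J_xI = -g * J0 / math.sqrt(K_I)    # inhibitory efficacy; J_xI/J_xE = -g/sqrt(gamma)

def balance_factor(n_I, n_E):
    """Eq. (2): d = -J_xI*n_I / (J_xE*n_E), the pattern-wise inhibition/excitation ratio."""
    return -J_xI * n_I / (J_xE * n_E)

n_E_pat = 194                                      # pattern size used in Figure 1
n_I_pat = (math.sqrt(gamma) * 1.0 / g) * n_E_pat   # shadow size chosen to give d = 1
print(round(balance_factor(n_I_pat, n_E_pat), 6))
```

With these numbers the shadow pattern is an order of magnitude smaller than its excitatory pattern (since each inhibitory synapse is a factor g/√γ stronger), which is what keeps the global activity from being quenched.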
\n\n3.2 Results \n\nFirst we show that the AS remains stable when embedding HCAs in a small DBN, whereas global oscillations take place if the embedding is done without shadow pools. Figure 1 clearly displays the sustained activity of an HCA in the DBN. The same principle also enables embedding of SFCs in a small network. This is to be contrasted with the conclusions drawn in Aviel et al. [9], where it was shown that otherwise very large networks are necessary to reach this goal. \n\nFigure 1: HCAs are embedded in a balanced network without (left) and with (right) shadow patterns. P=300 HCAs of size n_E=194 excitatory neurons were embedded in a network of N_E=15,000 excitatory neurons. The eleventh pattern is externally ignited at time t=100ms. A raster plot of 200ms is displayed. Without shadow patterns the network exhibits global oscillations, but with shadow patterns the network exhibits only minute oscillations, enabling the activity of the ignited pattern to be sustained. The size of the shadow patterns is set according to Eq. (2) with d=1. Neurons that participate in more than one HCA may appear more than once on the raster plot, whose y-axis is ordered according to HCAs, and represents every second neuron in each pattern. \n\nFigure 2: SFCs embedded in a balanced network without (left) and with (right) shadow pools. The first pool is externally ignited at time t=100ms; d=0.5. The rest of the parameters are as in Figure 1. Here again, without shadow pools the network exhibits global oscillations, but with shadow pools it exhibits only minute oscillations, enabling a stable propagation of the synfire wave. \n\n3.3 Maximum Capacity \n\nIn this section we show that, within our DBN, it is the fixed number of synapses (rather than dynamical constraints) that dictates the maximal number of patterns or pools P that may be loaded onto the network. 
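The synaptic counting that follows can be sketched numerically. The snippet below evaluates the bound P ≤ m N_x/n_x for both populations with the parameters used in Figure 3 (C_n=3.5, C_L=2.4, g=3, d=3); γ = N_I/N_E = 0.25 is an assumed value, since it is not stated in this excerpt:

```python
import math

def capacity_bound(N_E, eps=0.1, C_n=3.5, C_L=2.4, g=3.0, d=3.0, gamma=0.25):
    """Combinatorial bound on the number of patterns P (gamma = 0.25 assumed)."""
    K_E = eps * N_E
    L = C_L * math.sqrt(K_E)                 # synapses a neuron dedicates to one pattern
    m = math.floor(K_E / L)                  # max patterns a single neuron can join
    n_E = C_n * math.sqrt(K_E)               # excitatory pattern size
    n_I = (math.sqrt(gamma) * d / g) * n_E   # shadow-pattern size, from Eq. (2)
    N_I = gamma * N_E
    P_E = m * N_E / n_E                      # bound set by excitatory synapses
    P_I = m * N_I / n_I                      # bound set by inhibitory synapses
    return min(P_E, P_I)

N_E = 6_000
print(f"P_max ~ {capacity_bound(N_E):.0f}  (~{capacity_bound(N_E)/N_E:.3f} N_E)")
```

For N_E = 6,000 the inhibitory bound is the smaller one and gives roughly 0.06 N_E patterns, in line with the combinatorial limit plotted in Figure 3 (small deviations come from the floor in m).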
Let us start by noting that a neuron of population x (E or I) can participate in at most m ≡ ⌊K_E/L⌋ patterns, hence mN_x sets an upper bound on the number of neurons that participate in all patterns: Pn_x ≤ mN_x. Next, defining α_x ≡ P/N_x, we find that \n\n(3)  α_x ≤ m/n_x = ⌊√K_E/C_L⌋/n_x. \n\nTo leading order in N_E this turns into \n\n(4)  α_x N_x = N_E/(C_n C_L D_x) - O(√N_E), \n\nwhere D_x ≡ d/(√γ g) if x=I, or 1 for x=E. \n\nThus we conclude that synaptic combinatorial considerations lead to a maximal number of patterns P. If D_I<1, including the case D_I=0 of the SBN, the excitatory neurons determine the limit to be P = N_E/(C_n C_L). If, as is the case in our DBN, D_I>1, then the inhibitory bound is the smaller one, and the inhibitory neurons set the maximum value to P = N_E/(C_n C_L D_I). \n\nFor example, setting C_n=3.5, C_L=2.4, g=3 and d=3 in Eq. (4), we get P=0.06N_E. In Figure 3 we use these parameters. The capacity of a DBN is compared to that of an SBN for different network sizes. The maximal load is defined by the presence of global oscillations strong enough to prohibit sustained activity of patterns. The DBN reaches the combinatorial limit, whereas the SBN capacity does not increase with N and obviously does not reach its combinatorial limit. \n\nFigure 3: A balanced network maximally loaded with HCAs. Left: a raster plot of a maximally loaded DBN; P=408, N_E=6,000. 
At time t=450ms, the seventh pattern is ignited for a duration of 10ms, leading to termination of another pattern's activity (upper stripe) and to sustained activity of the ignited pattern (lower stripe). Right: P(N_E) as inferred from simulations of an SBN (\"o\") and of a DBN (\"*\"). The DBN realizes the combinatorial limit (dashed line), whereas the SBN does not realize its limit (solid line). From this comparison it is clear that the DBN is superior to the SBN in terms of network capacity. \n\nThe simulations displayed in Figure 3 show that in the DBN the combinatorial P is indeed realized, and the capacity of this DBN grows like 0.06N_E. In the SBN, dynamic interference prevents reaching the combinatorial limit. We have tried, in many ways, to increase the capacity of the SBN. Recently we have discovered [11] that only if the external rates are appropriately scaled can SBN capacity be linear in N_E, with a pre-factor α almost as high as that of a DBN. Although under these conditions SBNs can have large capacity, we emphasize that DBNs possess a clear advantage: their structure guarantees high capacity under more general conditions. \n\n4 Discussion \n\nIn this paper we study memory patterns embedded in a balanced network of spiking neurons. In particular, we focus on the maximal capacity of Hebbian cell assemblies. Requiring stability of the asynchronous state of the network, which serves as the background for memory activity, and further assuming that the neuronal spiking process is noise-driven, we show that naively applying Hebb's architecture leads to global oscillations. We propose the double-balance principle as the solution to this problem. This double-balance is obtained by introducing shadow patterns, i.e. 
\ninhibitory patterns that are associated with the excitatory ones and fed by them, but do not have specific connectivity other than that. \n\nThe maximal load of our system is determined by the available synaptic resources, and is proportional to the size of the excitatory population, N_E. For the parameters used here it turns out to be P=0.06N_E. This limit was estimated by a combinatorial argument of synaptic availability, and shown to be realized by simulations. \n\nSynfire chains were also studied. DBNs allow for their embedding in relatively small networks, as shown in Figure 2. Previous studies have shown that their embedding in balanced networks without shadow pools requires network sizes larger by an order of magnitude [9]. The capacity P of a SFC is defined, in analogy with the HCA case, as the number of pools embedded in the network. In this case we cannot realize the theoretical limit in simulations. We believe that the feed-forward structure of the SFC, which is absent in the HCA, introduces further dynamical interference. The feed-forward structure can amplify correlations and firing rates more efficiently than the feedback structure within patterns of the HCA. Thus a network embedded with SFCs may be more sensitive to spontaneously evolved correlations than a network embedded with HCAs. \n\nIt is interesting to note that the addition of shadow patterns has an analogy in the Hopfield model [5], where neurons in a pattern have both excitatory and inhibitory couplings with the rest of the network. One may claim that the architecture proposed here recovers the same effect via the shadow patterns. Accommodating the Hopfield model in networks of spiking neurons was tried before [3, 4], without specific emphasis on the question of capacity. In Gerstner and van Hemmen [4] the synaptic matrix is constructed in the same way as in the Hopfield model, i.e. neurons can have excitatory and inhibitory synapses. 
In [3, 8] the synaptic bonds of the Hopfield model were replaced by strong excitatory connections within a pattern, and weak excitatory connections among neurons in a pattern and those outside the pattern. While the different types of connection are of different magnitude, they are all excitatory. In contrast, here, excitation exists within a pattern as well as outside it, but the pattern has a well-defined inhibitory effect on the rest of the network, mediated by the shadow pattern. The resulting correlated inhibitory currents cancel the correlated excitatory input. Since the firing process in a BN is driven by fluctuations, it seems that negating excitatory correlations by inhibitory ones is more akin to Hopfield's construction in a network of two populations. \n\nHertz [12] has argued that a capacity limit obtained in a network of integrate-and-fire neurons should be multiplied by τ/2 to compare it with a network of binary neurons. Hence the α=0.06 obtained here is equivalent to α=0.21 in a binary model. It is not surprising that the last number is higher than 0.14, the limit of the original Hopfield model, since our model is sparse, as is, e.g., the Tsodyks-Feigelman model [7], where larger capacities were achieved. \n\nFinally, let us point out again that whereas only DBNs can reach the combinatorial capacity limit under the conditions specified in this paper, we have recently discovered [11] that SBNs can also reach this limit if additional scaling conditions are imposed on the input. The largest capacities that we obtained under these conditions were of order 0.1. \n\nAcknowledgments \n\nThis work was supported in part by grants from GIF. \n\nReferences \n\n1. Brunel, N., Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons. J Comput Neurosci, 2000. 8(3): p. 183-208. 
\n2. van Vreeswijk, C. and H. Sompolinsky, Chaotic balanced state in a model of cortical circuits. Neural Comput, 1998. 10(6): p. 1321-71. \n3. Amit, D.J. and N. Brunel, Dynamics of a recurrent network of spiking neurons before and following learning. Network, 1997. 8: p. 373. \n4. Gerstner, W. and L. van Hemmen, Associative memory in a network of 'spiking' neurons. Network, 1992. 3: p. 139-164. \n5. Hopfield, J.J., Neural networks and physical systems with emergent collective computational abilities. PNAS, 1982. 79: p. 2554-58. \n6. Willshaw, D.J., O.P. Buneman, and H.C. Longuet-Higgins, Non-holographic associative memory. Nature (London), 1969. 222: p. 960-962. \n7. Tsodyks, M.V. and M.V. Feigelman, The enhanced storage capacity in neural networks with low activity level. Europhys. Lett., 1988. 6(2): p. 101. \n8. Brunel, N. and X.-J. Wang, Effects of neuromodulation in a cortical network model of object working memory dominated by recurrent inhibition. J. of Computational Neuroscience, 2001. 11: p. 63-85. \n9. Aviel, Y., et al., On embedding synfire chains in a balanced network. Neural Computation, 2003. 15(6): p. 1321-1340. \n10. Tuckwell, H.C., Introduction to theoretical neurobiology. 1988, Cambridge: Cambridge University Press. \n11. Aviel, Y., D. Horn, and M. Abeles, Memory Capacity of Balanced Networks. 2003: Submitted. \n12. Hertz, J.A., Modeling synfire networks, in Neuronal Information Processing - From Biological Data to Modelling and Application, G. Burdet, P. Combe, and O. Parodi, Editors. 1999.", "award": [], "sourceid": 2441, "authors": [{"given_name": "Yuval", "family_name": "Aviel", "institution": null}, {"given_name": "David", "family_name": "Horn", "institution": null}, {"given_name": "Moshe", "family_name": "Abeles", "institution": null}]}