{"title": "Information Capacity and Robustness of Stochastic Neuron Models", "book": "Advances in Neural Information Processing Systems", "page_first": 178, "page_last": 184, "abstract": null, "full_text": "Information Capacity and Robustness of \n\nStochastic Neuron Models \n\nElad Schneidman \n\nIdan Segev N aftali Tishby \n\nInstitute of Computer Science, \nDepartment of Neurobiology and \nCenter for Neural Computation, \n\nHebrew University \n\nJerusalem 91904, Israel \n\n{ elads, tishby} @cs.huji.ac.il, idan@lobster.ls.huji.ac.il \n\nAbstract \n\nThe reliability and accuracy of spike trains have been shown to \ndepend on the nature of the stimulus that the neuron encodes. \nAdding ion channel stochasticity to neuronal models results in a \nmacroscopic behavior that replicates the input-dependent reliabili(cid:173)\nty and precision of real neurons. We calculate the amount of infor(cid:173)\nmation that an ion channel based stochastic Hodgkin-Huxley (HH) \nneuron model can encode about a wide set of stimuli. We show that \nboth the information rate and the information per spike of the s(cid:173)\ntochastic model are similar to the values reported experimentally. \nMoreover, the amount of information that the neuron encodes is \ncorrelated with the amplitude of fluctuations in the input, and less \nso with the average firing rate of the neuron. We also show that for \nthe HH ion channel density, the information capacity is robust to \nchanges in the density of ion channels in the membrane, whereas \nchanging the ratio between the Na+ and K+ ion channels has a \nconsiderable effect on the information that the neuron can encode. \nFinally, we suggest that neurons may maximize their information \ncapacity by appropriately balancing the density of the different ion \nchannels that underlie neuronal excitability. \n\n1 \n\nIntroduction \n\nThe capacity of neurons to encode information is directly connected to the nature \nof spike trains as a code. 
Namely, whether the fine temporal structure of the spike train carries information or whether the fine structure of the train is mainly noise (see e.g. [1, 2]). Experimental studies show that neurons in vitro [3, 4] and in vivo [5, 6, 7] respond to fluctuating inputs with repeatable and accurate spike trains, whereas slowly varying inputs result in lower repeatability and 'jitter' in the spike timing. Hence, it seems that the nature of the code utilized by the neuron depends on the input that it encodes [3, 6].

Recently, we suggested that the biophysical origin of this behavior is the stochasticity of single ion channels. Replacing the average conductance dynamics in the Hodgkin-Huxley (HH) model [8] with a stochastic channel population dynamics [9, 10, 11] yields a stochastic neuron model which replicates rather well the spike trains' reliability and precision of real neurons [12]. The stochastic model also shows subthreshold oscillations, spontaneous and missing spikes, all observed experimentally. Direct measurement of membrane noise has also been replicated successfully by such stochastic models [13]. Neurons use many tens of thousands of ion channels to encode the synaptic current that reaches the soma into trains of spikes [14]. The number of ion channels that underlies the spike generation mechanism, and their types, depend on the activity of the neuron [15, 16]. It is yet unclear how such changes may affect the amount and nature of the information that neurons encode. Here we ask what is the information encoding capacity of the stochastic HH model neuron and how this capacity depends on the densities of the different ion channel types in the membrane.
We show that both the information rate and the information per spike of the stochastic HH model are similar to the values reported experimentally and that neurons encode more information about highly fluctuating inputs. The information encoding capacity is rather robust to changes in the channel densities of the HH model. Interestingly, we show that there is an optimal channel population size, around the natural channel density of the HH model. The encoding capacity is rather sensitive to changes in the distribution of channel types, suggesting that changes in the population ratios and adaptation through channel inactivation may change the information content of neurons.

2 The Stochastic HH Model

The stochastic HH (SHH) model expands the classic HH model [8] by incorporating the stochastic nature of single ion channels [9, 17]. Specifically, the membrane voltage dynamics is given by the HH description, namely,

C_m dV/dt = -g_L (V - V_L) - g_K(V, t) (V - V_K) - g_Na(V, t) (V - V_Na) + I     (1)

where V is the membrane potential, V_L, V_K and V_Na are the reversal potentials of the leakage, potassium and sodium currents, respectively, g_L, g_K(V, t) and g_Na(V, t) are the corresponding ion conductances, C_m is the membrane capacitance and I is the injected current. The ion channel stochasticity is introduced by replacing the equations describing the ion channel conductances with explicit voltage-dependent Markovian kinetic models for single ion channels [9, 10]. Based on the activation and inactivation variables of the deterministic HH model, each K+ channel can be in one of five different states, and the rates for transition between these states are given in the following diagram,

[n0] <=> [n1] <=> [n2] <=> [n3] <=> [n4]     (2)

with forward rates 4α_n, 3α_n, 2α_n, α_n and backward rates β_n, 2β_n, 3β_n, 4β_n, respectively, where [n_j] refers to the number of channels which are currently in the state n_j.
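The five-state K+ scheme above lends itself to a direct population simulation. The following is a minimal sketch, not the authors' code: it clamps the voltage, uses the standard HH rate functions α_n and β_n (with the resting potential at 0 mV, as in the original HH convention), and draws multinomial transition counts per small time step dt. The step size and the multinomial update are our assumptions.

```python
import numpy as np

def hh_k_rates(v):
    """Standard HH K+ rate functions (1/msec); V in mV, rest at 0 mV."""
    alpha_n = 0.01 * (10.0 - v) / (np.exp((10.0 - v) / 10.0) - 1.0)
    beta_n = 0.125 * np.exp(-v / 80.0)
    return alpha_n, beta_n

def step_k_population(counts, v, dt, rng):
    """One dt-step of the [n0..n4] occupancies under the scheme of diagram (2).

    Each channel in state j independently moves forward (rate (4-j)*alpha_n),
    backward (rate j*beta_n), or stays; the total channel number is conserved.
    """
    a, b = hh_k_rates(v)
    fwd = np.array([4*a, 3*a, 2*a, 1*a, 0.0]) * dt   # n_j -> n_{j+1}
    bwd = np.array([0.0, 1*b, 2*b, 3*b, 4*b]) * dt   # n_j -> n_{j-1}
    new = np.zeros_like(counts)
    for j in range(5):
        stay = 1.0 - fwd[j] - bwd[j]                 # valid for small dt
        go_f, go_b, st = rng.multinomial(counts[j], [fwd[j], bwd[j], stay])
        if j < 4:
            new[j + 1] += go_f
        if j > 0:
            new[j - 1] += go_b
        new[j] += st
    return new

# A 200 um^2 patch at 18 K+ channels/um^2, clamped at a depolarized voltage:
rng = np.random.default_rng(0)
counts = np.array([3600, 0, 0, 0, 0])
for _ in range(1000):                                # 10 msec at dt = 0.01 msec
    counts = step_k_population(counts, v=30.0, dt=0.01, rng=rng)
gK = 20e-12 * counts[4]                              # 20 pS per open channel
```

The fluctuating [n4] count, scaled by the single-channel conductance, gives a stochastic potassium conductance; the Na+ channel is handled analogously with its 8-state scheme.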
Here [n4] labels the single open state of a potassium channel, and α_n, β_n are the voltage-dependent rate functions in the HH formalism. A similar model is used for the Na+ channel (the Na+ kinetic model has 8 states, with only one open state; see [12] for details). The potassium and sodium membrane conductances are given by,

g_K(V, t) = γ_K [n4],     g_Na(V, t) = γ_Na [m3h1]     (3)

where γ_K and γ_Na are the conductances of an ion channel for the K+ and Na+, respectively. We take the conductance of a single channel to be 20 pS [14] for both the K+ and Na+ channel types¹. Each of the ion channels will thus respond stochastically by closing or opening its 'gates' according to the kinetic model, fluctuating around the average expected behavior. Figure 1 demonstrates the effect of the ion

Figure 1: Reliability of firing patterns in a model of an isopotential Hodgkin-Huxley membrane patch in response to different current inputs. (A) Injecting a slowly changing current input (low-pass Gaussian white noise with a mean η = 8 µA/cm² and standard deviation σ = 1 µA/cm², which was convolved with an 'alpha-function' with a time constant τ_α = 3 msec; top frame) results in high 'jitter' in the timing of the spikes (raster plots of spike responses, bottom frame). (B) The same patch was again stimulated repeatedly, with a highly fluctuating stimulus (η = 8 µA/cm², σ = 7 µA/cm² and τ_α = 3 msec; top frame). The 'jitter' in spike timing is significantly smaller in B than in A (i.e. increased reliability for the fluctuating current input). Patch area used was 200 µm², with 3,600 K+ channels and 12,000 Na+ channels. (C) Average firing rate in response to DC current input of both the HH and the stochastic HH model.
(D) Coefficient of variation of the inter-spike interval of the SHH model in response to DC inputs, giving values which are comparable to those observed in real neurons (compare to Fig. 1 in [3]).

channel stochasticity, showing the response of a 200 µm² SHH isopotential membrane patch (with the 'standard' SHH channel densities) to repeated presentation of a suprathreshold current input. When the same slowly varying input is repeatedly presented (Fig. 1A), the spike trains are very different from each other, i.e., spike firing time is unreliable. On the other hand, when the input is highly fluctuating (Fig. 1B), the reliability of the spike timing is relatively high. The stochastic model thus replicates the input-dependent reliability and precision of spike trains observed in pyramidal cortical neurons [3]. As for cortical neurons, the Repeatability and Precision of the spike trains of the stochastic model (defined in [3]) are strongly correlated with the fluctuations in the current input and may get to sub-millisecond precision [12]. The f-I curve of the stochastic model (Fig. 1C) and the coefficient of variation (CV) of the inter-spike interval (ISI) distribution for DC inputs (Fig. 1D) are both similar to the behavior of cortical neurons in vivo [18], in clear contrast to the deterministic model².

¹ The number of channels is thus the ratio between the total conductance of a single type of ion channels and the single channel conductance, and so the 'standard' SHH densities will be 60 Na+ and 18 K+ channels per µm².
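The CV measure reported in Fig. 1D is simply the standard deviation of the inter-spike intervals divided by their mean. A minimal sketch (the spike times here are made-up illustrations, not simulation output):

```python
import numpy as np

def isi_cv(spike_times):
    """Coefficient of variation of the inter-spike-interval distribution."""
    isis = np.diff(np.asarray(spike_times, dtype=float))
    return isis.std() / isis.mean()

# A perfectly regular (deterministic-HH-like) train has CV = 0,
# while a Poisson-like irregular train has CV near 1.
regular = np.arange(0.0, 1000.0, 20.0)    # clock-like 50 Hz train (msec)
print(round(isi_cv(regular), 3))           # prints 0.0
```

Real cortical neurons driven by DC show CV values well above zero [18], which is what the SHH model reproduces and the deterministic model does not.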
² Although the total number of channels in the model is very large, the microscopic-level ion channel noise has a macroscopic effect on the spike train reliability, since the number

3 The Information Capacity of the SHH Neuron

Expanding the Repeatability and Precision measures [3], we turn to quantify how much information the neuron model encodes about the stimuli it receives. We thus present the model with a set of 'representative' input current traces, and the amount of information that the respective spike trains encode is calculated.

Following Mainen and Sejnowski [3], we use a set of input current traces which imitate the synaptic current that reaches the soma from the dendritic tree. We convolve a Gaussian white noise trace (with a mean current η and standard deviation σ) with an alpha-function (with τ_α = 3 msec). Six different mean current values are used (η = 0, 2, 4, 6, 8, 10 µA/cm²), and five different std values (σ = 1, 3, 5, 7, 9 µA/cm²), yielding a set of 30 input current traces (each is 10 seconds long). This set of inputs is representative of the wide variety of current traces that neurons might encounter under in vivo conditions, in the sense that the average firing rates for this set of inputs range between 2-70 Hz (not shown).

We present these input traces to the model, and calculate the amount of information that the resulting spike trains convey about each input, following [6, 19]. Each input is presented repeatedly and the resulting spike trains are discretized in ΔT bins, using a sliding 'window' of size T along the discretized sequence. Each train of spikes is thus transformed into a sequence of K-letter 'words' (K = T/ΔT), consisting of 0's (no spike) and 1's (spike).
We estimate P(W), the probability of the word W to appear in the spike trains, and then compute the entropy rate of the total word distribution,

H_total = - Σ_W P(W) log₂ P(W)     bits/word     (4)

which measures the capacity of information that the neuron spike trains hold [20, 6, 19]. We then examine the set of words that the neuron model used at a particular time t over all the repeated presentations of the stimulus, and estimate P(W|t), the time-dependent word probability distribution. At each time t we calculate the time-dependent entropy rate, and then take the average of these entropies,

H_noise = ⟨ - Σ_W P(W|t) log₂ P(W|t) ⟩_t     bits/word     (5)

where ⟨...⟩_t denotes the average over all times t. H_noise is the noise entropy rate, which measures how much of the fine structure of the spike trains of the neuron is just noise. After performing the calculation for each of the inputs, using different word sizes³, we estimate the limit of the total entropy and noise entropy rates at T → ∞, where the entropies converge to their real values (see [19] for details).

Figure 2A shows the total entropy rate of the responses to the set of stimuli, ranging from 10 to 170 bits/sec. The total entropy rate is correlated with the firing rates of the neuron (not shown). The noise entropy rate, however, depends in a different way on the input parameters: Figure 2B shows the noise entropy rate of the responses to the set of stimuli, which may get up to 100 bits/sec. Specifically, for inputs with high mean current values and low fluctuation amplitude, many of the spikes are

of ion channels which are open near the spike firing threshold is rather small [12]. The fluctuations in this small number of open channels near firing threshold give rise to the input-dependent reliability of the spike timing.
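The word-entropy calculation of Eqs. (4) and (5) can be sketched as follows, assuming the repeated responses have already been discretized into a trials × bins binary array. The helper names are ours, and the extrapolation to T → ∞ described in [19] is omitted:

```python
import numpy as np

def word_entropies(spikes, K):
    """H_total and H_noise (bits/word) from a (trials x bins) 0/1 array,
    using K-letter words as in Eqs. (4)-(5)."""
    trials, bins = spikes.shape
    n_words = bins - K + 1
    # words[i, t] = the K-letter word starting at bin t on trial i,
    # packed into an integer (e.g. [0,1,1] -> 3)
    powers = 1 << np.arange(K)[::-1]
    words = np.array([[spikes[i, t:t + K] @ powers for t in range(n_words)]
                      for i in range(trials)])

    def entropy(counts):
        p = counts / counts.sum()
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    # Eq. (4): entropy of the total word distribution P(W)
    h_total = entropy(np.bincount(words.ravel()))
    # Eq. (5): entropy of P(W|t), averaged over all times t
    h_noise = np.mean([entropy(np.bincount(words[:, t]))
                       for t in range(n_words)])
    return h_total, h_noise

# The information about the stimulus is h_total - h_noise bits/word;
# dividing by the window length T = K * dT gives a rate in bits/sec.
```

With these plug-in estimates H_noise never exceeds H_total, so the information rate is nonnegative; in practice both entropies must be estimated over several word sizes and extrapolated, as the text describes.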
³ The bin size ΔT = 2 msec has been set to be small enough to keep the fine temporal structure of the spike train within the word sizes used, yet large enough to avoid undersampling problems.

just noise, even if the mean firing rate is high. The difference between the neuron's entropy rate (the total capacity of information of the neuron's spike train) and the noise entropy rate is exactly the average rate of information that the neuron's spike trains encode about the input, I(stimulus; spike train) = H_total - H_noise [20, 6]; this is shown in Figure 2C. The information rate is more sensitive to the size of

Figure 2: Information capacity of the SHH model. (A) The total spike train entropy rate of the SHH model as a function of η, the current input mean, and σ, the standard deviation (see text for details). Error bar values of this surface, as well as for the other frames, range between 1-6% (not shown). (B) Noise entropy rate as a function of the current input parameters. (C) The information rate about the stimulus in the spike trains, as a function of the input parameters, calculated by subtracting the noise entropy from the total entropy (note the change in grayscale in C and D). (D) Information per spike as a function of the input parameters, which is calculated by normalizing the results shown in C by the average firing rate of the responses to each of the inputs.
fluctuations in the input than to the mean value of the current trace (as expected from the reliability and precision of spike timing observed in vitro [3] and in vivo [6], as well as in simulations [12]). The dependence of the neural code on the input parameters is better reflected when calculating the average amount of information per spike that the model gives for each of the inputs (Fig. 2D) (see for comparison the values for the fly's H1 neuron [6]).

4 The Effect of Changing the Neuron Parameters on the Information Capacity

Increasing the density of ion channels in the membrane compared to the 'standard' SHH densities, while keeping the ratio between the K+ and Na+ channels fixed, only diminishes the amount of information that the neuron encodes about any of the inputs in the set. However, the change is rather small: doubling the channel density decreases the amount of information by 5-25% (Fig. 3A), depending on the specific input. Decreasing the channel densities of both types results in encoding more information about certain stimuli and less about others. Figure 3B shows that having half the channel densities results in up to 10% changes in the information in both directions. Thus, the information rates conveyed by the stochastic model are robust to changes in the ion channel density. Similar robustness (not shown) has been observed for changes in the membrane area (keeping channel density fixed) and in the temperature (which affects the channel kinetics). However,
Figure 3: The effect of changing the ion channel densities on the information capacity. (A) The ratio of the information rate of the SHH model with twice the 'standard' SHH channel densities, divided by the information rate of the model with the 'standard' SHH densities. (B) As in A, only for the SHH model with half the 'standard' densities. (C) The ratio of the information rate of the SHH model with twice as many Na+ channels, divided by the information rate of the standard SHH Na+ channel density, where the K+ channel density remains untouched (note the change in grayscale in C and D). (D) As in C, only for the SHH model with the number of Na+ channels reduced by half.

changing the density of the Na+ channels alone has a larger impact on the amount of information that the neuron conveys about the stimuli. Increasing the Na+ channel density by a factor of two results in less information about most of the stimuli, and a gain for a few others (Fig. 3C). However, reducing the number of Na+ channels by half results in a drastic loss of information for all of the inputs (Fig. 3D).

5 Discussion

We have shown that the amount of information that the stochastic HH model encodes about its current input is highly correlated with the amplitude of fluctuations in the input and less so with the mean value of the input. The stochastic HH model, which incorporates ion channel noise, closely replicates the input-dependent reliability and precision of spike trains observed in cortical neurons. The information rates and information per spike are also similar to those of real neurons.
As in other biological systems (e.g., [21]), we demonstrate robustness of macroscopic performance to changes in the cellular properties: the information coding rates of the SHH model are robust to changes in the ion channel densities, as well as in the area of the excitable membrane patch and in the temperature (kinetics) of the channel dynamics. However, the information coding rates are rather sensitive to changes in the ratio between the densities of different ion channel types, suggesting that the ratio between the density of the K+ channels and the Na+ channels in the 'standard' SHH model may be optimal in terms of the information capacity. This may have important implications for the nature of the neural code under adaptation and learning. We suggest that these notions of optimality and robustness may be a key biophysical principle of the operation of real neurons. Further investigations should take into account the activity-dependent nature of the channels and the neuron [15, 16], and the notion of local learning rules which could modify neuronal excitability, as in [22].

Acknowledgements

This research was supported by a grant from the Ministry of Science, Israel.

References

[1] Rieke F., Warland D., de Ruyter van Steveninck R., and Bialek W. Spikes: Exploring the Neural Code. MIT Press, 1997.

[2] Shadlen M. and Newsome W. Noise, neural codes and cortical organization. Curr. Opin. Neurobiol., 4:569-579, 1994.

[3] Mainen Z. and Sejnowski T. Reliability of spike timing in neocortical neurons. Science, 268:1503-1508, 1995.

[4] Nowak L., Sanchez-Vives M., and McCormick D. Influence of low and high frequency inputs on spike timing in visual cortical neurons. Cerebral Cortex, 7:487-501, 1997.

[5] Bair W. and Koch C. Temporal precision of spike trains in extrastriate cortex of the behaving macaque monkey.
Neural Comp., 8:1185-1202, 1996.

[6] de Ruyter van Steveninck R., Lewen G., Strong S., Koberle R., and Bialek W. Reproducibility and variability in neural spike trains. Science, 275:1805-1808, 1997.

[7] Reich D., Victor J., Knight B., Ozaki T., and Kaplan E. Response variability and timing precision of neuronal spike trains in vivo. J. Neurophysiol., 77:2836-2841, 1997.

[8] Hodgkin A. and Huxley A. A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol., 117:500-544, 1952.

[9] FitzHugh R. A kinetic model of the conductance changes in nerve membrane. J. Cell. Comp. Physiol., 66:111-118, 1965.

[10] DeFelice L. Introduction to Membrane Noise. Perseus Books, 1981.

[11] Skaugen E. and Walløe L. Firing behavior in a stochastic nerve membrane model based upon the Hodgkin-Huxley equations. Acta Physiol. Scand., 107:343-363, 1979.

[12] Schneidman E., Freedman B., and Segev I. Ion channel stochasticity may be critical in determining the reliability and precision of spike timing. Neural Comp., 10:1679-1704, 1998.

[13] White J., Klink R., Alonso A., and Kay A. Noise from voltage-gated channels may influence neuronal dynamics in the entorhinal cortex. J. Neurophysiol., 80:262-269, 1998.

[14] Hille B. Ionic Channels of Excitable Membranes. Sinauer Associates, 2nd ed., 1992.

[15] Marder E., Abbott L., Turrigiano G., Liu Z., and Golowasch J. Memory from the dynamics of intrinsic membrane currents. Proc. Natl. Acad. Sci., 93:13481-13486, 1996.

[16] Toib A., Lyakhov V., and Marom S. Interaction between duration of activity and rate of recovery from slow inactivation in mammalian brain Na+ channels. J. Neurosci., 18:1893-1903, 1998.

[17] Strassberg A. and DeFelice L. Limits of the Hodgkin-Huxley formalism: Effects of single channel kinetics on transmembrane voltage dynamics. Neural Comp., 5:843-856, 1993.

[18] Softky W. and Koch C.
The highly irregular firing of cortical cells is inconsistent with temporal integration of random EPSPs. J. Neurosci., 13:334-350, 1993.

[19] Strong S., Koberle R., de Ruyter van Steveninck R., and Bialek W. Entropy and information in neural spike trains. Phys. Rev. Lett., 80:197-200, 1998.

[20] Cover T.M. and Thomas J.A. Elements of Information Theory. Wiley, 1991.

[21] Barkai N. and Leibler S. Robustness in simple biochemical networks. Nature, 387:913-917, 1997.

[22] Stemmler M. and Koch C. How voltage-dependent conductances can adapt to maximize the information encoded by neuronal firing rate. Nat. Neurosci., 2:521-527, 1999.
", "award": [], "sourceid": 1657, "authors": [{"given_name": "Elad", "family_name": "Schneidman", "institution": null}, {"given_name": "Idan", "family_name": "Segev", "institution": null}, {"given_name": "Naftali", "family_name": "Tishby", "institution": null}]}