{"title": "The Electrotonic Transformation: a Tool for Relating Neuronal Form to Function", "book": "Advances in Neural Information Processing Systems", "page_first": 69, "page_last": 76, "abstract": "", "full_text": "The Electrotonic Transformation: \n\na Tool for Relating Neuronal Form to Function \n\nNicholas T. Carnevale \nDepartment of Psychology \nYale University \nNew Haven, CT 06520 \n\nKenneth Y. Tsai \nDepartment of Psychology \nYale University \nNew Haven, CT 06520 \n\nBrenda J. Claiborne \nDivision of Life Sciences \nUniversity of Texas \nSan Antonio, TX 78285 \n\nThomas H. Brown \nDepartment of Psychology \nYale University \nNew Haven, CT 06520 \n\nAbstract \n\nThe spatial distribution and time course of electrical signals in neurons have important theoretical and practical consequences. Because it is difficult to infer how neuronal form affects electrical signaling, we have developed a quantitative yet intuitive approach to the analysis of electrotonus. This approach transforms the architecture of the cell from anatomical to electrotonic space, using the logarithm of voltage attenuation as the distance metric. We describe the theory behind this approach and illustrate its use. \n\n1 INTRODUCTION \n\nThe fields of computational neuroscience and artificial neural nets have enjoyed a mutually beneficial exchange of ideas. This has been most evident at the network level, where concepts such as massive parallelism, lateral inhibition, and recurrent excitation have inspired both the analysis of brain circuits and the design of artificial neural net architectures. \n\nLess attention has been given to how properties of the individual neurons or processing elements contribute to network function. Biological neurons and brain circuits have been simultaneously subject to eons of evolutionary pressure. 
This suggests an essential interdependence between neuronal form and function, on the one hand, and the overall architecture and operation of biological neural nets, on the other. Therefore reverse-engineering the circuits of the brain appears likely to reveal design principles that rely upon neuronal properties. These principles may have maximum utility in the design of artificial neural nets that are constructed of processing elements with greater similarity to biological neurons than those used in contemporary designs. \n\nSpatiotemporal extent is perhaps the most obvious difference between real neurons and processing elements. The processing element of most artificial neural nets is essentially a point in time and space. Its activation level is the instantaneous sum of its synaptic inputs. Of particular relevance to Hebbian learning rules, all synapses are exposed to the same activation level. These simplifications may ensure analytical and implementational simplicity, but they deviate sharply from biological reality. Membrane potential, the biological counterpart of activation level, is neither instantaneous nor spatially uniform. Every cell has finite membrane capacitance, and all ionic currents are finite, so membrane potential must lag behind synaptic inputs. Furthermore, membrane capacitance and cytoplasmic resistance dictate that membrane potential will almost never be uniform throughout a living neuron embedded in the circuitry of the brain. The combination of ever-changing synaptic inputs with cellular anatomical and biophysical properties guarantees the existence of fluctuating electrical gradients. \n\nConsider the task of building a massively parallel neural net from processing elements with such \"nonideal\" characteristics. 
Imagine moreover that the input surface of each processing element is an extensive, highly branched structure over which approximately 10,000 synaptic inputs are distributed. It might be tempting to try to minimize or work around the limitations imposed by device physics. However, a better strategy might be to exploit the computational consequences of these properties by making them part of the design, thereby turning these apparent weaknesses into strengths. \n\nTo facilitate an understanding of the spatiotemporal dynamics of electrical signaling in neurons, we have developed a new theoretical approach to linear electrotonus and a new way to make practical use of this theory. We present this method and illustrate its application to the analysis of synaptic interactions in hippocampal pyramidal cells. \n\n2 THEORETICAL BACKGROUND \n\nOur method draws upon and extends the results of two prior approaches: cable theory and two-port analysis. \n\n2.1 CABLE THEORY \n\nThe modern use of cable theory in neuroscience began almost four decades ago with the work of Rall (1977). Much of the attraction of cable theory derives from the conceptual simplicity of the steady-state decay of voltage with distance along an infinite cylindrical cable: V(x) = V_0·e^(-x/λ), where x is physical distance and λ is the length constant. This exponential relationship makes it useful to define the electrotonic distance X as the logarithm of the signal attenuation: X = ln(V_0/V(x)). In an infinite cylindrical cable, electrotonic distance is directly proportional to physical distance: X = x/λ. \n\nHowever, cable theory is difficult to apply to real neurons since dendritic trees are neither infinite nor cylindrical. Because of their anatomical complexity and irregular variations of branch diameter and length, attenuation in neurons is not an exponential function of distance. 
Even if a cell met the criteria that would allow its dendrites to be reduced to a finite equivalent cylinder (Rall 1977), voltage attenuation would not bear a simple exponential relationship to X but instead would vary inversely with a hyperbolic function (Jack et al. 1983). \n\n2.2 TWO-PORT THEORY \n\nBecause of the limitations and restrictions of cable theory, Carnevale and Johnston (1982) turned to two-port analysis. Among their conclusions, three are most relevant to this discussion. \n\nFigure 1: Attenuation is direction-dependent. \n\nThe first is that signal attenuation depends on the direction of signal propagation. Suppose that i and j are two points in a cell where i is \"upstream\" from j (voltage is spreading from i to j), and define the voltage attenuation from i to j: A^V_ij = V_i/V_j. Next suppose that the direction of signal propagation is reversed, so that j is now upstream from i, and define the voltage attenuation A^V_ji = V_j/V_i. In general these two attenuations will not be equal: A^V_ij ≠ A^V_ji. \n\nThey also showed that voltage attenuation in one direction is identical to current attenuation in the opposite direction (Carnevale and Johnston 1982). Suppose current I_i enters the cell at i, the current that is captured by a voltage clamp at j is I_j, and define the current attenuation A^I_ij = I_i/I_j. Because of the directional reciprocity between current and voltage attenuation, A^I_ij = A^V_ji. Similarly, if we interchange the current entry and voltage clamp sites, the current attenuation ratio would be A^I_ji = A^V_ij. \n\nFinally, they found that charge and DC current attenuation in the same direction are identical (Carnevale and Johnston 1982). Therefore the spread of electrical signals between any two points is completely characterized by the voltage attenuations in both directions. \n\n2.3 THE ELECTROTONIC TRANSFORMATION \n\nThe basic idea of the electrotonic transformation is to remap the cell from anatomical space into \"electrotonic space,\" where the distance between points reflects the attenuation of an electrical signal spreading between them. Because of the critical role of membrane potential in neuronal function, it is usually most appropriate to deal with voltage attenuations. \n\n2.3.1 The Distance Metric \n\nWe use the logarithm of attenuation between points as the distance metric in electrotonic space: L_ij = ln A_ij (Brown et al. 1992, Zador et al. 1991). To appreciate the utility of this definition, consider voltage spreading from point i to point j, and suppose that k is on the direct path between i and j. The voltage attenuations are A_ik = V_i/V_k, A_kj = V_k/V_j, and A_ij = V_i/V_j = A_ik·A_kj. This last equation and our definition of L establish the additive property of electrotonic distance: L_ij = L_ik + L_kj. That is, electrotonic distances are additive over a path that has a consistent direction of signal propagation. This justifies using the logarithm of attenuation as a metric for the electrical separation between points in a cell. \n\nAt this point several important facts should be noted. First, unlike the electrotonic distance X of classical cable theory, our new definition of electrotonic distance L always bears a simple and direct logarithmic relationship to attenuation. Second, because of membrane capacitance, attenuation increases with frequency. Since both steady-state and transient signals are of interest, we evaluate attenuations at several different frequencies. Third, attenuation is direction-dependent and usually asymmetric. Therefore at every frequency of interest, each branch of the cell has two different representations in electrotonic space depending on the direction of signal flow. 
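The additivity and direction-dependence of this metric can be checked numerically. The sketch below is our own illustration, not code from this paper: it solves a hypothetical three-node passive network (arbitrary conductance values, with a heavier leak at one end to stand in for a downstream dendritic load) and shows that log-attenuations add along a path while differing between the two directions of spread.

```python
import numpy as np

# Toy passive chain 0 - 1 - 2: an axial conductance between neighbors and a
# leak conductance at each node. Node 2 carries a heavier "downstream" load
# so that attenuation is asymmetric. All values are illustrative.
g_axial = 1.0
g_leak = [0.1, 0.1, 0.5]

def solve_voltages(inject_at, I=1.0):
    """Steady-state node voltages when current I is injected at one node."""
    G = np.diag(np.array(g_leak, dtype=float))
    for i, j in [(0, 1), (1, 2)]:
        G[i, i] += g_axial; G[j, j] += g_axial
        G[i, j] -= g_axial; G[j, i] -= g_axial
    rhs = np.zeros(3); rhs[inject_at] = I
    return np.linalg.solve(G, rhs)

def L(v, i, j):
    """Electrotonic distance: logarithm of the voltage attenuation i -> j."""
    return np.log(v[i] / v[j])

v = solve_voltages(0)               # signal spreads outward from node 0
print(L(v, 0, 1) + L(v, 1, 2))      # L_01 + L_12 ...
print(L(v, 0, 2))                   # ... equals L_02 (additivity)
w = solve_voltages(2)               # reverse the direction of spread
print(L(w, 2, 0))                   # generally differs from L_02 (asymmetry)
```

Because A_02 = (V_0/V_1)(V_1/V_2) by construction, the additivity is exact for any voltages; the asymmetry arises from the unequal loads at the two ends of the chain.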
\n\n2.3.2 Representing a Neuron in Electrotonic Space \n\nSince attenuation depends on direction, it is necessary to construct transforms in pairs for each frequency of interest, one for signal spread away from a reference point (Vout) and the other for spread toward it (Vin). The soma is often a good choice for the reference point, but any point in the cell could be used, and a different vantage point might be more appropriate for particular analyses. \n\nThe only difference between using one point i as the reference instead of any other point j is in the direction of signal propagation along the direct path between i and j (dashed arrows in Figure 2), where Vout relative to i is the same as Vin relative to j and vice versa. The directions of signal flow and therefore the attenuations along all other branches of the cell are unchanged. Thus the transforms relative to i and j differ only along the direct path ij, and once the Vout and Vin transforms have been created for one reference i, it is easy to assemble the transforms with respect to any other reference j. \n\nFigure 2: Effect of reference point location on direction of signal propagation. \n\nWe have found two graphical representations of the transform to be particularly useful. \"Neuromorphic figures,\" in which the cell is redrawn so that the relative orientation of branches is preserved (Figures 3 and 4), can be readily compared to the original anatomy for quick, \"big picture\" insights regarding synaptic integration and interactions. For more quantitative analyses, it is helpful to plot electrotonic distance from the reference point as a function of anatomical distance (Tsai et al. 1993). \n\n3 COMPUTATIONAL METHODS \n\nThe voltage attenuations along each segment of the cell are calculated from detailed, accurate morphometric data and the best available experimental estimates of the biophysical properties of membrane and cytoplasm. Any neural simulator like NEURON (Hines 1989) could be used to find the attenuations for the DC Vout transform. The DC Vin attenuations are more time consuming because a separate run must be performed for each of the dendritic terminations. However, the AC attenuations impose a severe computational burden on time-domain simulators because many cycles are required for the response to settle. For example, calculating the DC Vout attenuations in a hippocampal pyramidal cell relative to the soma took only a few iterations on a SUN Sparc 10-40, but more than 20 hours were required for 40 Hz (Tsai et al. 1994). Finding the full set of attenuations for a Vin transform at 40 Hz would have taken almost three months. Therefore we designed an O(N) algorithm that achieves high computational efficiency by operating in the frequency domain and taking advantage of the branched architecture of the cell. In a series of recursive walks through the cell, the algorithm applies Kirchhoff's laws to compute the attenuations in each branch. The electrical characteristics of each segment of the cell are represented by an equivalent T circuit. Rather than \"lump\" the properties of cytoplasm and membrane into discrete resistances and capacitances, we determine the elements of these equivalent T circuits directly from complex impedance functions that we derived from the impulse response of a finite cable (Tsai et al. 1994). 
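The flavor of these recursive walks can be conveyed with a DC-only sketch. This is our own simplification, not the paper's implementation: it uses the standard finite-cable formulas (input conductance and attenuation for a cable with a distal load) on a hypothetical toy tree, whereas the actual algorithm works in the frequency domain with complex-valued impedances and equivalent T circuits.

```python
import math

# Hypothetical toy tree: trunk segment 0 branches into terminal segments 1, 2.
# X = electrotonic length l/lambda; Ginf = characteristic conductance of the
# cylinder. Values are illustrative, not morphometric data.
segments = {
    0: dict(X=0.5, Ginf=2.0, children=[1, 2]),
    1: dict(X=1.0, Ginf=1.0, children=[]),
    2: dict(X=0.3, Ginf=1.5, children=[]),
}

def compute_loads(idx):
    """Walk 1 (leaf to root): record each segment's distal load conductance and
    return the input conductance seen at its proximal end (sealed-end leaves)."""
    s = segments[idx]
    s["GL"] = sum(compute_loads(c) for c in s["children"])
    t = math.tanh(s["X"])
    return s["Ginf"] * (s["GL"] + s["Ginf"] * t) / (s["Ginf"] + s["GL"] * t)

def accumulate(idx, L_here=0.0, out=None):
    """Walk 2 (root to leaf): sum log-attenuations along each path. For a finite
    cable with distal load GL, the attenuation is cosh(X) + (GL/Ginf)*sinh(X)."""
    out = {} if out is None else out
    s = segments[idx]
    A = math.cosh(s["X"]) + (s["GL"] / s["Ginf"]) * math.sinh(s["X"])
    out[idx] = L_here + math.log(A)
    for c in s["children"]:
        accumulate(c, out[idx], out)
    return out

compute_loads(0)
print(accumulate(0))  # electrotonic distance from the root to each distal end
```

Each segment is visited a constant number of times per walk, so the runtime is linear in the number of segments, mirroring the O(N) scaling described above.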
\nSince each segment is treated as a cable rather than an isopotential compartment, the resolution of the spatial grid does not affect accuracy, and there is no need to refine the grid in order to preserve accuracy as frequency increases. This is an important consideration for hippocampal neurons, which have long membrane time constants and begin to show increased attenuations at frequencies as low as 2-5 Hz (Tsai et al. 1994). It also allows us to treat a long unbranched neurite of nearly constant diameter as a single cylinder. \n\nThus runtimes scale linearly with the number of grid points and are independent of frequency, and we can even reduce the number of grid points if the diameters of adjacent unbranched segments are similar enough. A benchmark of a program that uses our algorithm with NEURON showed a speedup of more than four orders of magnitude without sacrificing accuracy (2 seconds vs. 20 hours to calculate the Vout attenuations at 40 Hz in a CA1 pyramidal neuron model with 2951 grid points) (Tsai et al. 1994). \n\n4 RESULTS \n\n4.1 DC TRANSFORMS OF A CA1 PYRAMIDAL CELL \n\nFigure 3 shows a two-dimensional projection of the anatomy of a rat CA1 pyramidal neuron (cell 524, left) with neuromorphic renderings of its DC Vout and Vin transforms (middle and right) relative to the soma. The three-dimensional anatomical data were obtained from HRP-filled cells with a computer microscope system as described elsewhere (Rihn and Claiborne 1990, Claiborne 1992). The passive electrical properties used to compute the attenuations were Ri = 200 Ω·cm, Cm = 1 µF/cm² (for nonzero frequencies, not shown here) and Rm = 30 kΩ·cm² (Spruston and Johnston 1992). \n\nFigure 3: CA1 pyramidal cell anatomy (cell 524, left) with neuromorphic renderings of Vout (middle) and Vin (right) transforms at DC. 
\n\nThe Vout transform is very compact, indicating that voltage propagates from the soma \ninto the dendrites with relatively little attenuation. The basilar dendrites and the \nterminal branches of the primary apical dendrite are almost invisible, since they are \nnearly isopotential along their lengths. Despite the fact that the primary apical dendrite \nhas a larger diameter than any of its daughter branches, most of the voltage drop for \nsomatofugal signaling is in the primary apical. Therefore it accounts for almost all of \nthe electrotonic length of the cell in the Vout transform. \nThe Vin transform is far more extensive, but most of the electrotonic length of the cell is \nnow in the basilar and terminal apical branches. This reflects the loading effect of \ndownstream membrane on these thin dendritic branches. \n\n4.2 SYNAPTIC INTERACTIONS \n\nThe transform can also give clues to possible effects of electrotonic architecture on \nvoltage-dependent forms of associative synaptic plasticity and other kinds of synaptic \ninteractions. Suppose the cell of Figure 3 receives a weak or \"student\" synaptic input \n\n\fThe Electronic Transformation: A Tool for Relating Neuronal Form to Function \n\n75 \n\nlocated 400 J.lm from the soma on the primary apical dendrite, and a strong or \"teacher\" \ninput is situated 300 J.lm from the soma on the same dendrite. \n\n[!] student \n@ teacher \n\nA. cell 524 \n\nB. cell 503 \n\nFigure 4: Analysis of synaptic interactions. \n\nThe anatomical arrangement is depicted on the left in Figure 4A (\"student\" = square, \n\"teacher\" = circle). The Vin transform with respect to the student (right figure of this \npair) shows that voltage spreads from the teacher to the student synapse with little \nattenuation, which would favor voltage-dependent associative interactions. \n\nFigure 4B shows a different CAl pyramidal cell in which the apical dendrite bifurcates \nshortly after arising from the soma. 
Two teacher synapses are indicated, one on the same branch as the student and the other on the opposite branch. The Vin transform with respect to the student (right figure of this pair) shows clearly that the teacher synapse on the same branch is closely coupled to the student, but the other is electrically much more remote and less likely to influence the student synapse. \n\n5 SUMMARY \n\nThe electrotonic transformation is based on a logical, internally consistent conceptual approach to understanding the propagation of electrical signals in neurons. In this paper we described two methods for graphically presenting the results of the transformation: neuromorphic rendering, and plots of electrotonic distance vs. anatomical distance. Using neuromorphic renderings, we illustrated the electrotonic properties of a previously unreported hippocampal CA1 pyramidal neuron as viewed from the soma (cell 524, Figure 3). We also extended the use of the transformation to the study of associative interactions between \"teacher\" and \"student\" synapses by analyzing this cell from the viewpoint of a \"student\" synapse located in the apical dendrites, contrasting this result with a different cell that had a bifurcated primary apical dendrite (cell 503, Figure 4). This demonstrates the versatility of the electrotonic transformation, and shows how it can convey the electrical signaling properties of neurons in ways that are quickly and easily comprehended. \n\nThis understanding is important for several reasons. First, electrotonus affects the integration and interaction of synaptic inputs, regulates voltage-dependent mechanisms of synaptic plasticity, and influences the interpretation of intracellular recordings. In addition, phylogeny, development, aging, and response to injury and disease are all accompanied by alterations of neuronal morphology, some subtle and some profound. The significance of these changes for brain function becomes clear only if their effects on neuronal signaling are reckoned. Finally, there is good reason to expect that neuronal electrotonus is highly relevant to the design of artificial neural networks. \n\nAcknowledgments \n\nWe thank R.B. Gonzales and M.P. O'Boyle for their contributions to the morphometric analysis, and Z.F. Mainen for assisting in the initial development of graphical rendering. This work was supported in part by ONR, ARPA, and the Yale Center for Theoretical and Applied Neuroscience (CTAN). \n\nReferences \n\nBrown, T.H., Zador, A.M., Mainen, Z.F. and Claiborne, B.J. Hebbian computations in hippocampal dendrites and spines. In: Single Neuron Computation, eds. McKenna, T., Davis, J. and Zornetzer, S.F., New York, Academic Press, 1992, pp. 81-116. \n\nCarnevale, N.T. and Johnston, D. Electrophysiological characterization of remote chemical synapses. J. Neurophysiol. 47:606-621, 1982. \n\nClaiborne, B.J. The use of computers for the quantitative, three-dimensional analysis of dendritic trees. In: Methods in Neuroscience, Vol. 10: Computers and Computation in the Neurosciences, ed. Conn, P.M., New York, Academic Press, 1992, pp. 315-330. \n\nHines, M. A program for simulation of nerve equations with branching geometries. Internat. J. Bio-Med. Comput. 24:55-68, 1989. \n\nRall, W. Core conductor theory and cable properties of neurons. In: Handbook of Physiology, The Nervous System, ed. Kandel, E.R., Bethesda, MD, Am. Physiol. Soc., 1977, pp. 39-98. \n\nRihn, L.L. and Claiborne, B.J. Dendritic growth and regression in rat dentate granule cells during late postnatal development. Dev. Brain Res. 54(1):115-124, 1990. \n\nSpruston, N. and Johnston, D. Perforated patch-clamp analysis of the passive membrane properties of three classes of hippocampal neurons. J. Neurophysiol. 67:508-529, 1992. 
\n\nTsai, K.Y., Carnevale, N.T. , Claiborne, BJ. and Brown, T.H. Morphoelectrotonic \ntransforms in three classes of rat hippocampal neurons. Soc. Neurosci. Abst. 19: 1522, \n1993. \n\nTsai, K.Y. , Carnevale, N.T. , Claiborne, BJ. and Brown, T.H. Efficient mapping from \nneuroanatomical to electrotonic space. Network 5:21-46, 1994. \n\nZador, A.M. , Claiborne, BJ. and Brown, T.H. Electrotonic transforms of hippocampal \nneurons. Soc. Neurosci. Abst. 17: 1515, 1991. \n\n\f", "award": [], "sourceid": 945, "authors": [{"given_name": "Nicholas", "family_name": "Carnevale", "institution": null}, {"given_name": "Kenneth", "family_name": "Tsai", "institution": null}, {"given_name": "Brenda", "family_name": "Claiborne", "institution": null}, {"given_name": "Thomas", "family_name": "Brown", "institution": null}]}