{"title": "Neuromorphic Networks Based on Sparse Optical Orthogonal Codes", "book": "Neural Information Processing Systems", "page_first": 814, "page_last": 823, "abstract": null, "full_text": "814 \n\nNEUROMORPHIC NETWORKS BASED \n\nON SPARSE OPTICAL ORTHOGONAL CODES \n\nMario P. Vecchi and Jawad A. Salehi \n\nBell Communications Research \n\n435 South Street \n\nMorristown, NJ 07960-1961 \n\nAbstract \n\nA family of neuromorphic networks specifically designed for communications and optical signal processing applications is presented. The information is encoded utilizing sparse Optical Orthogonal Code sequences on the basis of unipolar, binary (0,1) signals. The generalized synaptic connectivity matrix is also unipolar, and clipped to binary (0,1) values. In addition to high-capacity associative memory, the resulting neural networks can be used to implement general functions, such as code filtering, code mapping, code joining, code shifting and code projecting. \n\n1 Introduction \n\nSynthetic neural nets[1,2] represent an active and growing research field. Fundamental issues, as well as practical implementations with electronic and optical devices, are being studied. In addition, several learning algorithms have been studied, for example stochastically adaptive systems[3] based on many-body physics optimization concepts[4,5]. \n\nSignal processing in the optical domain has also been an active field of research. A wide variety of non-linear all-optical devices are being studied, directed towards applications both in optical computing and in optical switching. In particular, the development of Optical Orthogonal Codes (OOC)[6] is specifically interesting to optical communications applications, as has been demonstrated in the context of Code Division Multiple Access (CDMA)[7]. 
\n\nIn this paper we present a new class of neuromorphic networks, specifically designed for optical signal processing and communications, that encode the information in sparse OOC's. In Section 2 we review some basic concepts. The new neuromorphic networks are defined in Section 3, and their associative memory properties are presented in Section 4. In Section 5 other general network functions are discussed. Concluding remarks are given in Section 6. \n\n2 Neural Networks and Optical Orthogonal Codes \n\n2.1 Neural Network Model \n\nNeural networks are generally based on multiply-threshold-feedback cycles. In the Hopfield model[2], for instance, a connectivity matrix T stores the M different memory elements, labeled m, by the sum of outer products, \n\nT_ij = sum_{m=1}^{M} u_i^m u_j^m ; i,j = 1,2,...,N   (1) \n\n\u00a9 American Institute of Physics 1988 \n\nwhere the state vectors u^m represent the memory elements in the bipolar (-1,1) basis. The diagonal matrix elements in the Hopfield model are set to zero, T_ii = 0. \n\nFor a typical memory recall cycle, an input vector v^in, which is close to a particular memory element m = k, multiplies the T matrix, such that the output vector v^out is given by \n\nv_i^out = sum_{j=1}^{N} T_ij v_j^in ; i,j = 1,2,...,N   (2) \n\nand can be seen to reduce to \n\nv_i^out \u2248 (N-1) u_i^k + sqrt((N-1)(M-1))   (3) \n\nfor large N and in the case of randomly coded memory elements u^m. \n\nIn the Hopfield model, each output v_i^out is passed through a thresholding stage around zero. The thresholded output signals are then fed back, and the multiply and threshold cycle is repeated until a final stable output v^out is obtained. If the input v^in is sufficiently close to u^k, and the number of state vectors is small (i.e. M << N), the final output will converge to memory element m = k, that is, v^out -> u^k. The associative memory property of the network is thus established. 
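The outer-product storage rule (1) and the multiply-threshold recall cycle (2) can be sketched numerically. This is a minimal illustration of the Hopfield scheme just described; the sizes and variable names are our own choices:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 64, 4                              # neurons, stored memories (M << N)
U = rng.choice([-1, 1], size=(M, N))      # bipolar (-1,1) memory vectors u^m

# Connectivity matrix: sum of outer products, zero diagonal (eq. 1)
T = sum(np.outer(u, u) for u in U)
np.fill_diagonal(T, 0)

# Recall: corrupt memory u^0 in 5 positions, then multiply and
# threshold around zero (eq. 2), feeding back until stable
v = U[0].copy()
v[rng.choice(N, size=5, replace=False)] *= -1
for _ in range(10):
    v = np.where(T @ v >= 0, 1, -1)

print(np.array_equal(v, U[0]))
```

With M much smaller than N the corrupted input settles on the stored memory; as M grows toward N, the crosstalk term of (3) begins to cause recall errors.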
\n\n2.2 Optical Orthogonal Codes \n\nThe OOC sequences have been developed[6,7] for optical CDMA systems. Their properties have been specifically designed for this purpose, based on the following two conditions: each sequence can be easily distinguished from a shifted version of itself, and each sequence can be easily distinguished from any other shifted or unshifted sequence in the set. Mathematically, the above two conditions are expressed in terms of auto- and cross-correlation functions. Because of the non-negative nature of optical signals(1), OOC are based on unipolar (0,1) signals[7]. \n\nIn general, a family of OOC is defined by the following parameters: \n\n- F, the length of the code, \n- K, the weight of the code, that is, the number of 1's in the sequence, \n- lambda_a, the auto-correlation value for all possible shifts, other than the zero shift, \n- lambda_c, the cross-correlation value for all possible shifts, including the zero shift. \n\nFor a given code length F, the maximum number of distinct sequences in a family of OOC depends on the chosen parameters, that is, the weight of the code K and the allowed overlaps lambda_a and lambda_c. In this paper we will consider OOC belonging to the minimum overlap class, lambda_a = lambda_c = 1. \n\n(1) We refer to optical intensity signals, and not to detection systems sensitive to phase information. \n\n3 Neuromorphic Optical Networks \n\nOur neuromorphic networks are designed to take full advantage of the properties of the OOC. The connectivity matrix T is defined as a sum of outer products, by analogy with (1), but with the following important modifications: \n\n1. The memory vectors are defined by the sequences of a given family of OOC, with a basis given by the unipolar, binary pair (0,1). The dimension of the sparse vectors is given by the length of the code F, and the maximum number of available items depends on the chosen family of OOC. \n\n2. 
All of the matrix elements T_ij are clipped to unipolar, binary (0,1) values, resulting in a sparse and simplified connectivity matrix, without any loss in the functional properties defined by our neuromorphic networks. \n\n3. The diagonal matrix elements T_ii are not set to zero, as they reflect important information implicit in the OOC sequences. \n\n4. The threshold value is not zero, but it is chosen to be equal to K, the weight of the OOC. \n\n5. The connectivity matrix T is generalized to allow for the possibility of a variety of outer product options: self-outer products, as in (1), for associative memory, but also cross-outer products of different forms to implement various other system functions. \n\nA simplified schematic diagram of a possible optical neuromorphic processor is shown in Figure 1. This implementation is equivalent to an incoherent optical matrix-vector multiplier[8], with the addition of nonlinear functions. The input vector is clipped using an optical hard-limiter with a threshold setting at 1, and then it is anamorphically imaged onto the connectivity mask for T. In this way, the ith pixel of the input vector is imaged onto the ith column of the T mask. The light passing through the mask is then anamorphically imaged onto a line of optical threshold elements with a threshold setting equal to K, such that the jth row is imaged onto the jth threshold element. \n\n4 Associative Memory \n\nThe associative memory function is defined by a connectivity matrix T^MEM given by: \n\nT_ij^MEM = g{ sum_{m=1}^{M} x_i^m x_j^m } ; i,j = 1,2,...,F   (4) \n\nwhere each memory element x^m corresponds to a given sequence of the OOC family, with code length F. 
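The construction can be exercised end to end in a short numerical sketch. The code family below is our own example of a minimum-overlap family for F = 21, K = 2 (1's at positions {0, m}, m = 1,...,10, giving M = 10 sequences); the clipping function g{} and the threshold K used here are the ones defined in the surrounding text:

```python
import numpy as np

F, K, M = 21, 2, 10
X = np.zeros((M, F), dtype=int)
for m in range(M):
    X[m, 0] = X[m, m + 1] = 1            # memory vector x^m: 1's at positions 0 and m+1

# Verify the OOC conditions lambda_a = lambda_c = 1 for this family
corr = lambda x, y, s: int(np.dot(x, np.roll(y, s)))
assert max(corr(x, x, s) for x in X for s in range(1, F)) == 1
assert max(corr(X[i], X[j], s)
           for i in range(M) for j in range(M) if i != j
           for s in range(F)) == 1

# Clipped, unipolar connectivity matrix (eq. 4); the diagonal is kept
T = (X.T @ X >= 1).astype(int)

# Single multiply-threshold pass with threshold K
recall = lambda v: (T @ v >= K).astype(int)

for x in X:                              # every stored code is recovered in one pass
    assert np.array_equal(recall(x), x)

# Input with one of the K 1's missing: output is the zero vector
v = X[3].copy(); v[4] = 0
print(recall(v).sum())                   # -> 0

# Input equally distant to x^3 and x^8: output is their logical sum
print(np.array_equal(recall(X[3] | X[8]), X[3] | X[8]))   # -> True
```

The last two cases preview the single-pass behaviour discussed below for inputs that are close, but not equal, to a stored memory element.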
The matrix elements of T^MEM are all clipped, unipolar values, as indicated by the function g{}, such that, \n\ng{z} = 1 if z >= 1, and g{z} = 0 if z < 1   (5) \n\nWe will now show that an input vector x^k, which corresponds to memory element m = k, will produce a stable output (equal to the wanted memory vector) in a single pass of the multiply and threshold process. \n\nThe multiplication can be written as: \n\nv_i^out = sum_{j=1}^{F} T_ij^MEM x_j^k   (6) \n\nWe remember that the non-linear clipping function g{} is to be applied first to obtain T^MEM. Hence, \n\nv_i^out = sum_j x_j^k g{ x_i^k x_j^k + sum_{m != k} x_i^m x_j^m }   (7) \n\nFor x_i^k = 0, only the second term in (7) contributes, and the pseudo-orthogonality properties of the OOC allow us to write: \n\nv_i^out <= lambda_c   (8) \n\nwhere the cross-correlation value is lambda_c < K. \n\nFor x_i^k = 1, we again consider the properties of the OOC to obtain for the first term of (7): \n\nsum_j x_j^k g{ x_j^k } = K   (9) \n\nwhere K is the weight of the OOC. \n\nTherefore, the result of the multiplication operation given by (7) can be written as: \n\nv_i^out = K x_i^k + [value strictly less than K]   (10) \n\nThe thresholding operation follows, around the value K as explained in Section 3. That is, (10) is thresholded such that: \n\nv_i^out = 1 if v_i^out >= K, and v_i^out = 0 if v_i^out < K   (11) \n\nhence, the final output at the end of a single pass will be given by: v_i^out = x_i^k. \n\nThe result just obtained can be extended to demonstrate the single pass convergence when the input vector is close, but not necessarily equal, to a stored memory element. We can draw the following conclusions regarding the properties of our neuromorphic networks based on OOC: \n\n\u2022 For any given input vector x^in, the single pass output will correspond to the memory vector x^m which has the smallest Hamming distance to the input. 
\n\n\u2022 If the input vector x^in is missing a single 1-element from the K 1's of an OOC, the single pass output will be the null or zero vector. \n\n\u2022 If the input vector x^in has the same Hamming distance to two (or more) memory vectors x^m, the single pass output will be the logical sum of those memory vectors. \n\nThe ideas just discussed were tested with a computer simulation. An example of associative memory is shown in Table 1, corresponding to the OOC class of length F = 21 and weight K = 2. For this case, the maximum number of independent sequences is M = 10. The connectivity matrix T^MEM is seen in Table 1, where one can clearly appreciate the simplifying features of our model, both in terms of the sparsity and of the unipolar, clipped values of the matrix elements. The computer simulations for this example are shown in Table 2. The input vectors a and b show the error-correcting memory recovery properties. The input vector c is equally distant to memory vectors x^3 and x^8, resulting in an output which is the sum (x^3 \u2295 x^8). And finally, input vector d is closest to a single memory vector, but one 1 is missing, and the output is the zero vector. The mask in Figure 1 shows the optical realization of Table 1, where the transparent pixels correspond to the 1's and the opaque pixels to the 0's of the connectivity matrix T^MEM. \n\nIt should be pointed out that the capacity of our network is significant. From the previous example, the capacity is seen to be approximately F/2 for single pass memory recovery. This result compares favorably with the capacity of a Hopfield model[9], of approximately F/(4 ln F). \n\n5 General Network Functions \n\nOur neuromorphic networks, based on OOC, can be generalized to perform functions other than associative memory storage by constructing non-symmetrical connectivity matrices. The single pass convergence of our networks avoids the possibility of limit-cycle oscillations. 
We can write in general: \n\nT_ij = g{ sum_{m=1}^{M} y_i^m x_j^m }   (12) \n\nwhere each pair defined by m includes two vectors y^m and x^m, which are not necessarily equal. The clipping function g{} ensures that all matrix elements are binary (0,1) values. The possible choice of vector pairs is not completely arbitrary, but there is a wide variety of functions that can be implemented for each family of OOC. We will now discuss some of the applications that are of particular interest in optical communication systems. \n\n5.1 Code Filtering (CDMA) \n\nFigure 2 shows an optical CDMA network in a star configuration. M nodes are interconnected with optical fibers to a passive MxM star coupler that broadcasts the optical signals. At each node there is a data encoder that maps each bit of information to the OOC sequence corresponding to the user for which the transmission is intended. In addition, each node has a filter and decoder that recognizes its specific OOC sequence. The optical transmission rate has been expanded by a factor F corresponding to the length of the OOC sequence. Within the context of a CDMA communication system[7], the filter or decoder must perform the function of recognizing a specific OOC sequence in the presence of other interfering codes sent on the common transmission medium. We can think, then, of one of our neuromorphic networks as a filter, placed at a given receiver node, that will recognize the specific code that it was programmed for. \n\nWe define for this purpose a connectivity matrix as \n\nT_ij^CDMA = x_i^k x_j^k ; i,j = 1,2,...,F   (13) \n\nwhere only one vector x^k is stored at each node. This symmetric, clipped connectivity matrix will give an output equal to x^k whenever the input contains this vector, and a null or zero output vector otherwise. 
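A single-node filter of this kind is easy to sketch. The code x^k and the interfering sequence below are our own examples drawn from the F = 21, K = 2 family used elsewhere in the paper:

```python
import numpy as np

F, K = 21, 2
xk = np.zeros(F, dtype=int); xk[0] = xk[5] = 1   # this node's code x^k

# T^CDMA: outer product of x^k with itself (eq. 13); entries are already (0,1)
T = np.outer(xk, xk)
detect = lambda v: (T @ v >= K).astype(int)       # one multiply-threshold pass

interf = np.zeros(F, dtype=int)
interf[7] = interf[9] = 1                         # an arbitrarily shifted interfering code

# 1-bit: x^k plus interference is recovered exactly
v1 = np.clip(xk + interf, 0, 1)
print(np.array_equal(detect(v1), xk))             # -> True

# 0-bit: interference alone yields the null vector
print(detect(interf).sum())                       # -> 0
```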
It is clear by comparing (13) with (4) that the CDMA filtering matrix is equivalent to an associative memory matrix with only one item imprinted in the memory. Hence the discussion of Section 4 directly applies to the understanding of the behaviour of T^CDMA. \n\nIn order to evaluate the performance of our neuromorphic network as a CDMA filter, computer simulations were performed. Table 3 presents the T^CDMA matrix for a particular node defined by x^k of a CDMA system based on the OOC family F = 21, K = 2. The total number of distinct codes for this OOC family is M = 10, hence there are 9 additional OOC sequences that interfere with x^k, labeled in Table 3 x^1 to x^9. \n\nThe performance was simulated by generating random composite sequences from the set of codes x^1 to x^9 arbitrarily shifted. All inputs are unipolar and clipped (0,1) signals. The results presented in Table 4 give examples of our simulation for the T^CDMA matrix shown in Table 3. The input a is the logical sum of a 1-bit (vector x^k), plus interfering signals from arbitrarily shifted sequences of x^2, x^3, x^4, x^6 and x^9. The output of the neuromorphic network is seen to recover accurately the desired vector x^k. The input vector b contains a 0-bit (null vector), plus the shifted sequences of x^1, x^2, x^3, x^6, x^7 and x^8, and we see that the output correctly recovers a 0-bit. \n\nAs discussed in Section 4, our neuromorphic network will always correctly recognize a 1-bit (vector x^k) presented to its input. On the other hand(2), there is the possibility of making an error when a 0-bit is sent, and the interfering signals from other nodes happen to generate the chip positions of x^k. This case is shown by input vector c of Table 4, which contains a 0-bit (null vector), plus shifted sequences of x^2, x^3, x^4, x^5, x^6, x^7 and x^8 in such a way that the output is erroneously given as a 1-bit. 
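This 0-bit error mechanism can be estimated with a small Monte Carlo run. The construction below is our own (the F = 21, K = 2 family with 1's at positions {0, m}; each of the M-1 interferers sends a 1-bit with probability 1/2 at a uniformly random cyclic shift), and the closed-form line reflects our reading of the theoretical estimate (14) quoted below:

```python
import numpy as np

F, K, M = 21, 2, 10
rng = np.random.default_rng(0)
codes = np.zeros((M, F), dtype=int)
for m in range(M):
    codes[m, 0] = codes[m, m + 1] = 1     # the OOC family; codes[0] is this node's x^k
xk, others = codes[0], codes[1:]
T = np.outer(xk, xk)                       # T^CDMA of eq. (13)

trials, errors = 20000, 0
for _ in range(trials):                    # send 0-bits only; 1-bits are never misread
    v = np.zeros(F, dtype=int)
    for c in others:                       # each interferer sends a 1-bit w.p. 1/2
        if rng.random() < 0.5:
            v |= np.roll(c, rng.integers(F))
    errors += int(xk @ v >= K)             # 0-bit read as 1 iff both chips of x^k are hit

ber_sim = 0.5 * errors / trials            # 1-bits and 0-bits equally likely
q = 1 - K / (2 * F)                        # our reading of the estimate (14)
ber_calc = 0.5 * np.prod([1 - q ** (M - 1 - k) for k in range(K)])
print(round(ber_calc, 4))                  # -> 0.0574
print(abs(ber_sim - ber_calc) < 0.01)      # simulated rate agrees within Monte Carlo noise
```

For these parameters the closed form evaluates to the 5.74% figure quoted in the text, and the simulated rate lands near the 5.88% reported in Table 4.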
The properties of the OOC sequences are specifically chosen to minimize these errors[7], and the statistical results of our simulation are also shown in Table 4. It is seen that, as expected, when a 1-bit is sent it is always correctly recognized. On the other hand, when 0-bits are sent, occasional errors occur. Our simulation yields an overall bit error rate (BER) of BER_sim = 5.88%, as shown in Table 4. \n\nThese results can be compared with theoretical calculations[7] which yield an estimate for the BER for the CDMA system described: \n\nBER_calc \u2248 (1/2) prod_{k=0}^{K-1} [ 1 - q^{M-1-k} ]   (14) \n\nwhere q = 1 - K/(2F). For the example of the OOC family F = 21, K = 2, with M = 10, the above expression yields BER_calc \u2248 5.74%. \n\n(2) Our channel can be described, then, as a binary Z-channel between each two nodes dynamically establishing a communication path. \n\nIt is seen, therefore, that our neuromorphic network approaches the minimum possible BER for a given family of OOC. In fact, the results obtained using our T^CDMA are equivalent to a CDMA detection scheme based on \"optical-AND-gates\"[10], which corresponds to the limiting BER determined by the properties of the OOC themselves(3). The optical mask corresponding to the code filtering function is shown in Figure 3. \n\n5.2 Other Functions \n\nAs a first example of a non-symmetric T matrix, let us consider the function of mapping an input code to a corresponding different output code. We define our mapping matrix as: \n\nT_ij^MAP = g{ sum_{m=1}^{M} y_i^m x_j^m } ; i,j = 1,2,...,F   (15) \n\nwhere an input vector x^m will produce a different output vector code y^m. \n\nThe function of code joining is defined by a transfer function that takes a given input code and produces at the output a chosen combination of two or more codes. 
\nThis function is performed by expressing the general matrix given by (12) as follows: \n\nT_ij^JOIN = g{ sum_m ( y_i^m + w_i^m + ... ) x_j^m } ; i,j = 1,2,...,F   (16) \n\nwhere an input vector x^m will result in an output that joins several vector codes (y^m \u2295 w^m \u2295 ...). \n\nThe code shifting matrix T^SHIFT will allow for the shift of a given code sequence, such that both input and output correspond to the same code, but shifted with respect to itself. That is, \n\nT_ij^SHIFT = g{ sum_m x(s)_i^m x(0)_j^m } ; i,j = 1,2,...,F   (17) \n\nwhere we have indicated an unshifted code sequence by x(0)^m, and its corresponding output pair as a shifted version of itself, x(s)^m. \n\nThe code projecting function corresponds to processing an input vector that contains the logical sum of several codes, and projecting at the output a selected single code sequence. The corresponding matrix T^PROJ is given by: \n\nT_ij^PROJ = g{ sum_m x_i^m ( y_j^m + w_j^m + ... ) } ; i,j = 1,2,...,F   (18) \n\nwhere each input vector (y^m \u2295 w^m \u2295 ...) will project at the output to a single code x^m. In general, the resulting output code sequence x^m could correspond to a code not necessarily contained in the input vector. \n\nThe performance and error correcting properties of these, and other, general functions follow a similar behaviour as discussed in Section 4. \n\n(3) The BER for the OOC family shown in this example is far too large for a useful CDMA communications system. Our choice was intended to show computer simulated results within a reasonable computation time. \n\n6 Conclusions \n\nThe neuromorphic networks presented, based on sparse Optical Orthogonal Code (OOC) sequences, have been shown to have a number of attractive properties. The unipolar, clipped nature of the synaptic connectivity matrix simplifies the implementation. 
The single pass convergence further allows for general network functions that are expected to be of particular interest in communications and signal processing systems. \n\nThe coding of the information, based on OOC, has also been shown to result in high capacity associative memories. The combination of efficient associative memory properties, plus a variety of general network functions, also suggests the possible application of our neuromorphic networks in the implementation of computational functions based on optical symbolic substitution. \n\nThe family of neuromorphic networks discussed here emphasizes the importance of understanding the general properties of non-negative systems based on sparse codes[11]. It is hoped that our results will stimulate further work on the fundamental relationship between coding, or representations, and the information processing properties of neural nets. \n\nAcknowledgement \n\nWe thank J. Y. N. Hui and J. Alspector for many useful discussions, and C. A. Brackett for his support and encouragement of this research. \n\nReferences \n\n[1] S. Grossberg. In K. Schmitt, editor, Delay and Functional-Differential Equations and Their Applications, page 121, Academic Press, New York, NY, 1972. \n\n[2] J. J. Hopfield. Neural Networks and Physical Systems with Emergent Collective Computational Abilities. Proc. Nat. Acad. Sci. USA, 79:2554, 1982. \n\n[3] D. H. Ackley, G. E. Hinton, and T. J. Sejnowski. A Learning Algorithm for Boltzmann Machines. Cogn. Sci., 9:147, 1985. \n\n[4] S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi. Optimization by Simulated Annealing. Science, 220:671, 1983. \n\n[5] M. P. Vecchi and S. Kirkpatrick. Global Wiring by Simulated Annealing. IEEE Trans. CAD of Integrated Circuits and Systems, CAD-2:215, 1983. \n\n[6] F. R. K. Chung, J. A. Salehi, and V. K. Wei. Optical Orthogonal Codes: Design, Analysis and Applications. 
In IEEE International Symposium on Information Theory, Catalog No. 86CH2374-7, 1986. Accepted for publication in IEEE Trans. on Information Theory. \n\n[7] J. A. Salehi and C. A. Brackett. Fundamental Principles of Fiber Optics Code Division Multiple Access. In IEEE International Conference on Communications, 1987. \n\n[8] N. H. Farhat, D. Psaltis, A. Prata, and E. Paek. Optical Implementation of the Hopfield Model. Appl. Opt., 24:1469, 1985. \n\n[9] R. J. McEliece, E. C. Posner, E. R. Rodemich, and S. S. Venkatesh. The Capacity of the Hopfield Associative Memory. IEEE Trans. on Information Theory, IT-33:461, 1987. \n\n[10] J. A. Salehi. Principles and Applications of Optical AND Gates in Fiber Optics Code Division Multiple Access Networks. In preparation, 1987. \n\n[11] G. Palm. Technical comments. Science, 235:1226, 1987. \n\nTable 1: Associative memory example. Code vectors and connectivity matrix T^MEM for the OOC family F = 21, K = 2. 
\n\nTable 2: Associative memory computer simulation examples for the connectivity matrix T^MEM of Table 1 (OOC family F = 21, K = 2), showing error-correcting recovery, the logical-sum output for an input equally distant to two memories, and the zero-vector output for an input with a missing 1. \n\nTable 3: Code filtering (CDMA) connectivity matrix T^CDMA for the node defined by x^k, and the nine interfering sequences x^1 to x^9 (OOC family F = 21, K = 2). \n\nTable 4: Code filtering (CDMA) computer simulation for the matrix of Table 3, with example input and output vectors and the statistical results of the simulation (OOC family F = 21, K = 2). \n\nFigure 1: Schematic diagram of an optical neuromorphic processor using sparse Optical Orthogonal Codes. Notice the absence of feedback because of the single-pass convergence. The mask shown represents the realization of the content-addressable memory of Table 1. \n\nFigure 3: Optical realization of a code filtering (CDMA) mask of Table 3. The 1's are represented by the transparent pixels, and the 0's by the opaque pixels. \n\nFigure 2: Schematic diagram of a CDMA communications system over an Optical Fiber interconnection network. Each node represents one of the M possible distinct users in the system. 
\n\n\f", "award": [], "sourceid": 41, "authors": [{"given_name": "Mario", "family_name": "Vecchi", "institution": null}, {"given_name": "Jawad", "family_name": "Salehi", "institution": null}]}