We investigate the computational power of a formal model for networks of spiking neurons, both under the assumption of unlimited timing precision and in the case of limited timing precision. We also prove upper and lower bounds on the number of examples that are needed to train such networks.
Introduction and Basic Definitions
There exists substantial evidence that timing phenomena such as temporal differences between spikes and frequencies of oscillating subsystems are integral parts of various information processing mechanisms in biological neural systems (for a survey and references see e.g. Abeles, 1991; Churchland and Sejnowski, 1992; Aertsen, 1993). Furthermore, simulations of a variety of specific mathematical models for networks of spiking neurons have shown that temporal coding offers interesting possibilities for solving classical benchmark problems such as associative memory, binding, and pattern segmentation (for an overview see Gerstner et al., 1992). Some aspects of these models have also been studied analytically, but almost nothing is known about their computational complexity (see Judd and Aihara, 1993, for some first results in this direction). In this article we introduce a simple formal model SNN for networks of spiking neurons that allows us to model the most important timing phenomena of neural nets (including synaptic modulation), and we prove upper and lower bounds for its computational power and learning complexity. Further
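To give a concrete, if simplified, picture of the kind of temporal coding discussed above, the following sketch simulates a single generic leaky integrate-and-fire neuron. It is not the SNN model defined in this article; the function name, parameter values, and dynamics are illustrative assumptions chosen only to show how the timing of input spikes (rather than just their number) can determine whether the neuron fires.

```python
def simulate_lif(input_spike_times, weights, threshold=1.0,
                 tau=10.0, dt=0.1, t_max=100.0):
    """Return the output spike times of one leaky integrate-and-fire neuron.

    NOTE: this is a generic textbook-style model, not the paper's SNN.
    input_spike_times[i] is a list of spike times on input line i, and
    weights[i] is the efficacy of that synapse. Each arriving spike adds
    weights[i] to the membrane potential, which decays toward 0 with time
    constant tau; crossing `threshold` emits a spike and resets the
    potential to 0.
    """
    potential = 0.0
    output_spikes = []
    t = 0.0
    while t < t_max:
        potential *= (1.0 - dt / tau)  # leaky decay over one time step
        for times, w in zip(input_spike_times, weights):
            # add synaptic input for spikes arriving within this time step
            potential += w * sum(1 for s in times if t <= s < t + dt)
        if potential >= threshold:
            output_spikes.append(t)
            potential = 0.0            # reset after firing
        t += dt
    return output_spikes

# Two sub-threshold inputs arriving together push the potential over
# threshold, while the same inputs far apart in time do not: the neuron
# acts as a coincidence detector, so its output carries timing information.
coincident = simulate_lif([[5.0], [5.0]], [0.6, 0.6])    # fires once
separated  = simulate_lif([[5.0], [50.0]], [0.6, 0.6])   # never fires
```

The point of the sketch is that the response depends on temporal differences between input spikes, the very quantity that the formal analysis in this article treats as a computational resource.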