Part of Advances in Neural Information Processing Systems 6 (NIPS 1993)
This paper presents a simple algorithm to learn trajectories with a continuous time, continuous activation version of the Boltzmann machine. The algorithm takes advantage of intrinsic Brownian noise in the network to easily compute gradients using entirely local computations. The algorithm may be ideal for parallel hardware implementations.
This paper presents a learning algorithm that trains continuous stochastic networks to respond to environmental input trajectories with desired trajectories in the output units. This task has potential applications to a variety of problems, such as stochastic modeling of neural processes, artificial motor control, and continuous speech recognition. For example, in a continuous speech recognition problem, the input trajectory may be a sequence of fast Fourier transform coefficients, and the output a likely trajectory of phonemic patterns corresponding to the input. This work builds on recent work on diffusion networks by Movellan and McClelland (in press) and on recent papers by Apolloni and de Falco (1991) and Neal (1992) on asymmetric Boltzmann machines. The learning algorithm can be seen as a generalization of their work to the stochastic diffusion case and to the problem of learning continuous stochastic trajectories.
Diffusion networks are governed by the standard connectionist differential equations plus an independent additive noise component. The resulting process is governed by a stochastic differential equation, i.e., a diffusion process.
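To make the dynamics concrete, the following is a minimal sketch of simulating such a network with an Euler–Maruyama discretization. The specific drift term (a leaky integrator with a tanh activation), the function and parameter names, and the noise scale `sigma` are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def simulate_diffusion_network(w, x0, inputs, dt=0.01, sigma=0.1, rng=None):
    """Euler-Maruyama simulation of an assumed diffusion-network SDE:
        dx = (-x + w @ tanh(x) + u(t)) dt + sigma dB,
    where dB is increments of Brownian motion.

    w      : (n, n) connection weight matrix
    x0     : (n,) initial activations
    inputs : (T, n) external input trajectory, one row per time step
    Returns the sampled trajectory, shape (T + 1, n).
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    traj = [x.copy()]
    for u in inputs:
        drift = -x + w @ np.tanh(x) + u          # deterministic connectionist part
        noise = sigma * np.sqrt(dt) * rng.standard_normal(x.shape)  # Brownian increment
        x = x + drift * dt + noise
        traj.append(x.copy())
    return np.array(traj)

# Example: a 2-unit network driven by a constant input
w = np.array([[0.0, 0.5], [-0.5, 0.0]])
inputs = np.full((100, 2), 0.2)
traj = simulate_diffusion_network(w, np.zeros(2), inputs)
```

Because the noise enters additively at every step, repeated runs from the same initial state yield an ensemble of trajectories rather than a single deterministic path, which is what the learning algorithm exploits.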
*Part of this work was done while at Carnegie Mellon University.