An Efficient Implementation of the Back-propagation Algorithm on the Connection Machine CM-2

Part of Advances in Neural Information Processing Systems 2 (NIPS 1989)



Xiru Zhang, Michael McKenna, Jill Mesirov, David Waltz


In this paper, we present a novel implementation of the widely used back-propagation neural-net learning algorithm on the Connection Machine CM-2, a general-purpose, massively parallel computer with a hypercube topology. This implementation runs at about 180 million interconnections per second (IPS) on a 64K-processor CM-2. The main interprocessor communication operation used is 2D nearest-neighbor communication. The techniques developed here can easily be extended to implement other algorithms for layered neural nets on the CM-2, or on other massively parallel computers that have 2D or higher-degree connections among their processors.
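For reference, the back-propagation algorithm the paper parallelizes can be sketched serially as below. This is a toy illustration of the underlying computation, not the CM-2 implementation; the one-hidden-layer topology, sigmoid units, squared-error loss, and NumPy usage are all assumptions chosen for the example, whereas the paper's contribution is distributing this computation across processors using 2D nearest-neighbor communication.

```python
# Minimal serial sketch of back-propagation for a one-hidden-layer net.
# Illustrative only: layer sizes, sigmoid activation, and squared-error
# loss are assumptions for this example, not details from the paper.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2):
    h = sigmoid(W1 @ x)   # hidden-layer activations
    y = sigmoid(W2 @ h)   # output-layer activations
    return h, y

def backprop_step(x, t, W1, W2, lr=0.5):
    """One gradient-descent update on the loss 0.5*||y - t||^2."""
    h, y = forward(x, W1, W2)
    # Output-layer error term, using sigmoid'(a) = y*(1-y)
    delta2 = (y - t) * y * (1.0 - y)
    # Propagate the error back through W2 to the hidden layer
    delta1 = (W2.T @ delta2) * h * (1.0 - h)
    # Weight updates are outer products of deltas and layer inputs
    W2 -= lr * np.outer(delta2, h)
    W1 -= lr * np.outer(delta1, x)
    return 0.5 * float(np.sum((y - t) ** 2))

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(3, 2))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(1, 3))   # hidden -> output weights
x = np.array([0.5, -0.3])                 # a single training input
t = np.array([1.0])                       # its target output
losses = [backprop_step(x, t, W1, W2) for _ in range(50)]
```

In the parallel setting the paper describes, the matrix-vector products and outer products above are the operations mapped onto the processor grid, which is why nearest-neighbor communication dominates.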