Synaptic Weight Noise During MLP Learning Enhances Fault-Tolerance, Generalization and Learning Trajectory

Part of Advances in Neural Information Processing Systems 5 (NIPS 1992)


Authors

Alan Murray, Peter Edwards

Abstract

We analyse the effects of analog noise on the synaptic arithmetic during Multi-Layer Perceptron (MLP) training, by expanding the cost function to include noise-mediated penalty terms. In the light of these calculations, we predict that fault tolerance, generalisation ability and learning trajectory should all be improved by such noise injection. Extensive simulation experiments on two distinct classification problems substantiate the claims. The results appear to be perfectly general for all training schemes where weights are adjusted incrementally, and have wide-ranging implications for all applications, particularly those involving "inaccurate" analog neural VLSI.
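The training scheme the abstract describes, perturbing the synaptic weights with zero-mean noise on every incremental update, can be sketched in a few lines. The NumPy example below is a minimal illustration, not the authors' code: the toy data set, network size, learning rate and noise amplitude (noise_std) are all assumed for demonstration.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical two-class toy problem (XOR-like), for illustration only.
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

# Small MLP: 2 inputs -> 8 hidden units -> 1 output (sizes assumed).
W1 = rng.normal(0, 0.5, size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, size=(8, 1))
b2 = np.zeros(1)

lr = 0.1          # assumed learning rate
noise_std = 0.05  # assumed weight-noise amplitude

for epoch in range(300):
    for i in rng.permutation(len(X)):
        x, t = X[i], y[i]

        # Draw fresh zero-mean Gaussian noise for every pattern,
        # emulating imprecise analog synaptic arithmetic.
        W1n = W1 + rng.normal(0, noise_std, W1.shape)
        W2n = W2 + rng.normal(0, noise_std, W2.shape)

        # Forward pass through the noisy weights.
        h = sigmoid(x @ W1n + b1)
        o = sigmoid(h @ W2n + b2)

        # Backpropagation of squared error through the noisy weights,
        # applied as an incremental update to the nominal weights.
        delta_o = (o - t) * o * (1 - o)
        delta_h = (delta_o @ W2n.T) * h * (1 - h)
        W2 -= lr * np.outer(h, delta_o)
        b2 -= lr * delta_o
        W1 -= lr * np.outer(x, delta_h)
        b1 -= lr * delta_h

# Evaluate with the clean (noise-free) weights.
preds = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5
print("train accuracy:", (preds == (y > 0.5)).mean())

Because the noise is redrawn on each pattern, minimising the expected cost implicitly penalises solutions that are sensitive to small weight perturbations, which is the mechanism behind the predicted gains in fault tolerance and generalisation.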