Part of Advances in Neural Information Processing Systems 6 (NIPS 1993)
Patrice Simard, Hans Graf
The back propagation algorithm has been modified to work without any multiplications and to tolerate computations with a low resolution, which makes it more attractive for a hardware implementation. Numbers are represented in floating point format with 1 bit mantissa and 3 bits in the exponent for the states, and 1 bit mantissa and 5 bit exponent for the gradients, while the weights are 16 bit fixed-point numbers. In this way, all the computations can be executed with shift and add operations. Large networks with over 100,000 weights were trained and demonstrated the same performance as networks computed with full precision. An estimate of a circuit implementation shows that a large network can be placed on a single chip, reaching more than 1 billion weight updates per second. A speedup is also obtained on any machine where a multiplication is slower than a shift operation.
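To make the shift-and-add arithmetic concrete, here is a minimal Python sketch of the underlying idea: a value with a 1 bit mantissa is just a signed power of two, so multiplying it into a 16 bit fixed-point weight reduces to a bit shift. The function names (`quantize_pow2`, `shift_mul`), the restriction to non-positive exponents, and the learning-rate value are illustrative assumptions, not details taken from the paper.

```python
import math

def quantize_pow2(x, exp_bits):
    """Round x to a signed power of two: sign(x) * 2**e.

    Models a 1-bit-mantissa floating point format: only the sign and an
    exponent (exp_bits wide) survive. Restricting e to non-positive
    values (magnitudes <= 1) is an assumption made for this sketch.
    """
    if x == 0.0:
        return 0, 0                                # zero handled as a special case
    e = int(round(math.log2(abs(x))))              # nearest power of two
    e = max(-(2 ** exp_bits) + 1, min(0, e))       # clamp to representable exponents
    return (1 if x > 0 else -1), e

def shift_mul(w_fixed, sign, e):
    """Multiply a 16-bit fixed-point value by sign * 2**e with a shift."""
    shifted = w_fixed << e if e >= 0 else w_fixed >> -e
    return sign * shifted

# Forward pass: weight times quantized state needs no multiplier.
w = 0x1F40                                         # a weight, 16-bit fixed point
s_sign, s_exp = quantize_pow2(0.85, exp_bits=3)    # state: 1 bit mantissa, 3 bit exponent
contribution = shift_mul(w, s_sign, s_exp)

# Weight update: gradient * state is a product of two powers of two,
# so the update is a single shift of the fixed-point learning rate.
g_sign, g_exp = quantize_pow2(-0.23, exp_bits=5)   # gradient: 1 bit mantissa, 5 bit exponent
lr_fixed = 1 << 8                                  # hypothetical learning rate in fixed point
w += shift_mul(lr_fixed, g_sign * s_sign, g_exp + s_exp)
```

Summing many such shifted contributions requires only an adder, which is why the paper's circuit estimate fits a large network on a single chip: the multiplier array, usually the dominant cost, disappears entirely.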