Part of Advances in Neural Information Processing Systems 1 (NIPS 1988)

*Eyal Yair, Allen Gersho*

The concept of the stochastic Boltzmann machine (BM) is attractive for decision making and pattern classification purposes since the probability of attaining the network states is a function of the network energy. Hence, the probability of attaining particular energy minima may be associated with the probabilities of making certain decisions (or classifications). However, because of its stochastic nature, the complexity of the BM is fairly high and therefore such networks are not very likely to be used in practice. In this paper we suggest a way to alleviate this drawback by converting the stochastic BM into a deterministic network which we call the Boltzmann Perceptron Network (BPN). The BPN is functionally equivalent to the BM but has a feed-forward structure and low complexity. No annealing is required. The conditions under which such a conversion is feasible are given. A learning algorithm for the BPN based on the conjugate gradient method is also provided which is somewhat akin to the backpropagation algorithm.
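The energy-probability relation the abstract refers to is the Boltzmann distribution: a state with energy E occurs with probability proportional to exp(-E/T). The sketch below (an illustration of this standard relation, not code from the paper; the weight matrix, biases, and energy convention are assumptions) enumerates the states of a tiny binary network and computes their Boltzmann probabilities, showing that lower-energy states are attained with higher probability.

```python
import itertools
import math

def energy(state, W, b):
    """Standard BM energy: E(s) = -sum_i b_i s_i - sum_{i<j} W_ij s_i s_j.

    W is used as an upper-triangular matrix of symmetric connection
    weights; b holds unit biases. (Convention assumed for illustration.)
    """
    e = 0.0
    n = len(state)
    for i in range(n):
        e -= b[i] * state[i]
        for j in range(i + 1, n):
            e -= W[i][j] * state[i] * state[j]
    return e

def boltzmann_probs(W, b, T=1.0):
    """Exact Boltzmann distribution p(s) = exp(-E(s)/T) / Z over all
    binary states (feasible only for small networks)."""
    states = list(itertools.product([0, 1], repeat=len(b)))
    weights = [math.exp(-energy(s, W, b) / T) for s in states]
    Z = sum(weights)  # partition function
    return {s: w / Z for s, w in zip(states, weights)}

# Hypothetical 3-unit network: units 0 and 1 are strongly coupled,
# units 0 and 2 are weakly anti-coupled.
W = [[0.0, 2.0, -1.0],
     [0.0, 0.0, 0.5],
     [0.0, 0.0, 0.0]]
b = [0.1, -0.2, 0.3]

probs = boltzmann_probs(W, b)
# The lowest-energy state receives the largest probability mass,
# which is what lets energy minima be read as decisions.
best = max(probs, key=probs.get)
```

In the stochastic BM these probabilities are reached by sampling with simulated annealing; the BPN's contribution, per the abstract, is computing the same classification probabilities deterministically in a feed-forward pass.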
