Probability Estimation from a Database Using a Gibbs Energy Model

Part of Advances in Neural Information Processing Systems 5 (NIPS 1992)

Authors

John Miller, Rodney Goodman

Abstract

We present an algorithm for creating a neural network that produces accurate probability estimates as outputs. The network implements a Gibbs probability distribution model of the training database. This model is created by a new transformation relating the joint probabilities of attributes in the database to the weights (Gibbs potentials) of the distributed network model. The theory of this transformation is presented together with experimental results. One advantage of this approach is that the network weights are prescribed in closed form, without iterative gradient descent. Used as a classifier, the network tied with or outperformed published results on a variety of databases.
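The abstract does not spell out the transformation itself, so the sketch below only illustrates the general idea it describes: a Gibbs (log-linear) distribution over binary attributes whose potentials are set in closed form from empirical probabilities in the database, with no gradient descent, and which can then be queried for class probabilities. The log-odds-ratio rule for the pairwise potentials, the function names (`fit_gibbs`, `energy`, `classify`), and the toy data are assumptions made for illustration, not the paper's actual transformation.

```python
# Illustrative sketch only: a pairwise Gibbs (log-linear) model over binary
# attributes whose potentials are computed directly from empirical counts,
# without gradient descent. The log-odds-ratio rule below is a common
# closed-form heuristic and is NOT the paper's specific transformation.
import math
from itertools import combinations


def fit_gibbs(data, smooth=0.5):
    """data: list of equal-length binary tuples (the last entry is the class)."""
    n = len(data)
    d = len(data[0])
    # Singleton potentials: marginal log-odds with additive smoothing.
    theta = []
    for i in range(d):
        c1 = sum(row[i] for row in data)
        p = (c1 + smooth) / (n + 2 * smooth)
        theta.append(math.log(p / (1 - p)))
    # Pairwise potentials: empirical log-odds ratios with additive smoothing.
    w = {}
    for i, j in combinations(range(d), 2):
        c = [[smooth] * 2 for _ in range(2)]
        for row in data:
            c[row[i]][row[j]] += 1
        w[(i, j)] = math.log(c[1][1] * c[0][0] / (c[1][0] * c[0][1]))
    return theta, w


def energy(x, theta, w):
    """Negative Gibbs energy: higher value means more probable (up to a constant)."""
    s = sum(t * xi for t, xi in zip(theta, x))
    s += sum(wij * x[i] * x[j] for (i, j), wij in w.items())
    return s


def classify(x_obs, theta, w, class_index, labels=(0, 1)):
    """Compare unnormalised Gibbs probabilities over the class attribute."""
    scores = {}
    for y in labels:
        x = list(x_obs)
        x[class_index] = y
        scores[y] = energy(x, theta, w)
    z = max(scores.values())
    expd = {y: math.exp(s - z) for y, s in scores.items()}
    total = sum(expd.values())
    return {y: v / total for y, v in expd.items()}


# Toy usage: two binary attributes plus a class label in the last column.
data = [(1, 0, 1), (1, 1, 1), (0, 0, 0), (0, 1, 0), (1, 0, 1), (0, 0, 0)]
theta, w = fit_gibbs(data)
print(classify((1, 0, None), theta, w, class_index=2))
```

The point of the sketch is the workflow the abstract emphasises: potentials are read off from database statistics in one pass, and classification amounts to comparing the Gibbs probabilities of the competing class values given the observed attributes.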