Learning Stochastic Perceptrons Under k-Blocking Distributions

Part of Advances in Neural Information Processing Systems 7 (NIPS 1994)


Authors

Mario Marchand, Saeed Hadjifaradji

Abstract

We present a statistical method that PAC learns the class of stochastic perceptrons with arbitrary monotonic activation function and weights wᵢ ∈ {−1, 0, +1} when the probability distribution that generates the input examples is a member of a family that we call k-blocking distributions. Such distributions represent an important step beyond the case where each input variable is statistically independent, since the 2k-blocking family contains all the Markov distributions of order k. By stochastic perceptron we mean a perceptron which, upon presentation of input vector x, outputs 1 with probability f(∑ᵢ wᵢxᵢ − θ). Because the same algorithm works for any monotonic (nondecreasing or nonincreasing) activation function f on a Boolean domain, it handles the well-studied cases of sigmoids and the "usual" radial basis functions.
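To make the model concrete, here is a minimal sketch of the stochastic perceptron described in the abstract (not the authors' learning algorithm): the unit outputs 1 with probability f(∑ᵢ wᵢxᵢ − θ). The sigmoid activation, the particular weight vector, and the threshold value below are illustrative assumptions; the paper allows any monotonic f and weights restricted to {−1, 0, +1}.

```python
import numpy as np

def sigmoid(z):
    """A monotonic (nondecreasing) activation function; any monotonic f would do."""
    return 1.0 / (1.0 + np.exp(-z))

def stochastic_perceptron(x, w, theta, f=sigmoid, rng=None):
    """Return 1 with probability f(sum_i w_i * x_i - theta), else 0.

    x     : Boolean input vector (0/1 entries)
    w     : weight vector with entries in {-1, 0, +1}
    theta : real-valued threshold
    f     : monotonic activation function mapping reals to [0, 1]
    """
    rng = rng or np.random.default_rng()
    p = f(np.dot(w, x) - theta)   # firing probability for this input
    return int(rng.random() < p)

# Illustrative example: three Boolean inputs, weights in {-1, 0, +1}
w = np.array([+1, -1, 0])
x = np.array([1, 0, 1])
print(stochastic_perceptron(x, w, theta=0.5))
```

On repeated presentations of the same x, the empirical frequency of output 1 converges to f(∑ᵢ wᵢxᵢ − θ), which is the statistical behavior the learning method exploits.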