Hierarchies of adaptive experts

Part of Advances in Neural Information Processing Systems 4 (NIPS 1991)


Michael Jordan, Robert Jacobs


In this paper we present a neural network architecture that discovers a recursive decomposition of its input space. Based on a generalization of the modular architecture of Jacobs, Jordan, Nowlan, and Hinton (1991), the architecture uses competition among networks to recursively split the input space into nested regions and to learn separate associative mappings within each region. The learning algorithm is shown to perform gradient ascent in a log likelihood function that captures the architecture's hierarchical structure.
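The recursive gating scheme sketched in the abstract can be illustrated with a minimal numerical example. The sketch below is an assumption-laden toy, not the authors' implementation: it builds a hypothetical two-level hierarchy in which a top-level softmax gate weights two branches, each branch's gate weights two linear experts, and the overall log likelihood is that of a gated mixture of Gaussian experts. All weight shapes, the choice of softmax gates, and the Gaussian noise model with unit variance are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over a 1-D score vector.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Hypothetical two-level hierarchy: 2 branches, 2 linear experts per branch.
d = 3                                   # input dimension (assumed)
top_gate = rng.normal(size=(2, d))      # top-level gating weights
sub_gates = rng.normal(size=(2, 2, d))  # one gating network per branch
experts = rng.normal(size=(2, 2, d))    # linear experts with scalar output

def predict(x):
    """Blended output: sum_i g_i * sum_j g_{j|i} * expert_ij(x)."""
    g_top = softmax(top_gate @ x)
    y = 0.0
    for i in range(2):
        g_sub = softmax(sub_gates[i] @ x)
        for j in range(2):
            y += g_top[i] * g_sub[j] * (experts[i, j] @ x)
    return y

def log_likelihood(x, t, sigma=1.0):
    """Hierarchical mixture log likelihood with Gaussian experts:
    log sum_i g_i sum_j g_{j|i} N(t | expert_ij(x), sigma^2).
    Gradient ascent on this quantity trains gates and experts jointly."""
    g_top = softmax(top_gate @ x)
    total = 0.0
    for i in range(2):
        g_sub = softmax(sub_gates[i] @ x)
        for j in range(2):
            mu = experts[i, j] @ x
            dens = np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
            total += g_top[i] * g_sub[j] * dens
    return np.log(total)

x = rng.normal(size=d)
print(predict(x), log_likelihood(x, t=0.5))
```

Because the gates depend on the input, each leaf expert dominates in a different (soft, nested) region of the input space, which is the recursive decomposition the paper describes.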