Statistical Dynamics of Batch Learning

Part of Advances in Neural Information Processing Systems 12 (NIPS 1999)


Authors

Song Li, K. Y. Michael Wong

Abstract

An important issue in neural computing concerns the description of learning dynamics with macroscopic dynamical variables. Recent progress on on-line learning only addresses the often unrealistic case of an infinite training set. We introduce a new framework to model batch learning of restricted sets of examples, widely applicable to any learning cost function, and fully taking into account the temporal correlations introduced by the recycling of the examples. For illustration we analyze the effects of weight decay and early stopping during the learning of teacher-generated examples.
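The setting described above can be made concrete with a minimal sketch of batch learning on a restricted, recycled training set, with weight decay and validation-based early stopping. This is purely illustrative and not the paper's analytical framework; all names, sizes, and hyperparameters (N, P, eta, weight_decay, patience) are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical teacher-student setup: a linear student learns examples
# labelled by a fixed teacher weight vector (sizes are illustrative).
N, P = 50, 100                       # input dimension, size of the restricted training set
teacher = rng.standard_normal(N)
X = rng.standard_normal((P, N)) / np.sqrt(N)
y = X @ teacher                      # teacher-generated targets

X_val = rng.standard_normal((P, N)) / np.sqrt(N)
y_val = X_val @ teacher              # held-out set used for early stopping

w = np.zeros(N)                      # student weights
eta, weight_decay = 0.1, 0.01        # learning rate and weight-decay strength
best_val, best_w, patience, bad = np.inf, w.copy(), 20, 0

for epoch in range(2000):
    # Batch gradient of the quadratic training cost plus a weight-decay term;
    # the same restricted set of P examples is recycled every epoch.
    grad = X.T @ (X @ w - y) / P + weight_decay * w
    w -= eta * grad

    # Early stopping: keep the weights with the best validation error and
    # halt once it has failed to improve for `patience` consecutive epochs.
    val_err = np.mean((X_val @ w - y_val) ** 2)
    if val_err < best_val:
        best_val, best_w, bad = val_err, w.copy(), 0
    else:
        bad += 1
        if bad >= patience:
            break

print(f"stopped at epoch {epoch}, best validation error {best_val:.4f}")
```

The recycling of the same P examples across epochs is what introduces the temporal correlations the abstract refers to, in contrast to on-line learning where each update uses a fresh example.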