Scaling Laws and Local Minima in Hebbian ICA

Part of Advances in Neural Information Processing Systems 14 (NIPS 2001)


Authors

Magnus Rattray, Gleb Basalyga

Abstract

We study the dynamics of a Hebbian ICA algorithm extracting a single non-Gaussian component from a high-dimensional Gaussian background. For both on-line and batch learning we find that a surprisingly large number of examples are required to avoid trapping in a sub-optimal state close to the initial conditions. To extract a skewed signal at least O(N^2) examples are required for N-dimensional data, and O(N^3) examples are required to extract a symmetrical signal with non-zero kurtosis.
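The abstract does not spell out the algorithm itself, so the following is only a minimal sketch of the kind of on-line extraction it describes: a single unit with weight vector w, a Hebbian update with a skewness-sensitive nonlinearity, and weight normalisation. All names and parameter choices here (eta, phi, the exponential source, the learning-rate scaling) are illustrative assumptions, not the authors' exact model.

```python
# Minimal sketch of single-unit on-line Hebbian ICA (illustrative, not the
# authors' exact algorithm): one skewed source hidden in an N-dimensional
# Gaussian background; we track the overlap R = w . B with the source direction.
import numpy as np

rng = np.random.default_rng(0)

N = 100                          # data dimension
eta = 0.05 / np.sqrt(N)          # small learning rate (assumed scaling)
steps = 200_000                  # number of examples

B = np.zeros(N)
B[0] = 1.0                       # hidden source direction (unit vector)

def sample_x():
    """One example: zero-mean skewed source along B plus Gaussian background."""
    s = rng.exponential(1.0) - 1.0       # skewed, zero-mean source
    n = rng.standard_normal(N)
    n -= (n @ B) * B                     # background orthogonal to B
    return s * B + n

def phi(y):
    """Hebbian nonlinearity sensitive to skewness."""
    return y ** 2

w = rng.standard_normal(N)
w /= np.linalg.norm(w)                   # random start: overlap ~ N**-0.5

for t in range(steps):
    x = sample_x()
    y = w @ x
    w += eta * phi(y) * x                # Hebbian update
    w /= np.linalg.norm(w)               # keep |w| = 1
    if t % 20_000 == 0:
        print(f"step {t:7d}  overlap R = {w @ B:+.3f}")
```

Run with these (assumed) settings, the overlap typically lingers near its initial O(N^-1/2) value for a long stretch before growing, which is the plateau behaviour the abstract's scaling laws quantify.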