Algorithms for Independent Components Analysis and Higher Order Statistics

Part of Advances in Neural Information Processing Systems 12 (NIPS 1999)


Authors

Daniel Lee, Uri Rokni, Haim Sompolinsky

Abstract

A latent variable generative model with finite noise is used to describe several different algorithms for Independent Components Analysis (ICA). In particular, the Fixed Point ICA algorithm is shown to be equivalent to the Expectation-Maximization algorithm for maximum likelihood under certain constraints, allowing the conditions for global convergence to be elucidated. The algorithms can also be explained by their generic behavior near a singular point where the size of the optimal generative bases vanishes. An expansion of the likelihood about this singular point indicates the role of higher order correlations in determining the features discovered by ICA. The application and convergence of these algorithms are demonstrated on a simple illustrative example.
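To make the Fixed Point ICA algorithm mentioned in the abstract concrete, the following is a minimal sketch of a standard one-unit fixed-point (FastICA-style) iteration with a tanh contrast function, applied to a toy two-source mixture. The mixing matrix, sources, and tolerances here are illustrative assumptions, not the paper's example; the paper's EM equivalence and the expansion about the singular point are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: two independent uniform (sub-Gaussian) sources,
# linearly mixed by an assumed mixing matrix A.
S = rng.uniform(-1, 1, size=(2, 5000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

# Whiten the mixtures (zero mean, identity covariance), the usual
# preprocessing step before a fixed-point ICA iteration.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# One-unit fixed-point update with contrast g = tanh:
#   w <- E[z g(w^T z)] - E[g'(w^T z)] w,  then renormalize.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(200):
    wz = w @ Z
    g = np.tanh(wz)
    g_prime = 1.0 - g ** 2
    w_new = (Z * g).mean(axis=1) - g_prime.mean() * w
    w_new /= np.linalg.norm(w_new)
    converged = abs(abs(w_new @ w) - 1.0) < 1e-10  # sign-invariant test
    w = w_new
    if converged:
        break

# The recovered component should align (up to sign) with one true source.
y = w @ Z
corr = max(abs(np.corrcoef(y, S[0])[0, 1]),
           abs(np.corrcoef(y, S[1])[0, 1]))
print(f"max |correlation| with a true source: {corr:.2f}")
```

With enough samples the iteration typically converges in a handful of steps and the extracted component correlates strongly with one of the true sources, illustrating the convergence behavior the paper analyzes.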