A latent variable generative model with finite noise is used to describe several different algorithms for Independent Components Analysis (ICA). In particular, the Fixed Point ICA algorithm is shown to be equivalent to the Expectation-Maximization algorithm for maximum likelihood under certain constraints, allowing the conditions for global convergence to be elucidated. The algorithms can also be explained by their generic behavior near a singular point where the size of the optimal generative bases vanishes. An expansion of the likelihood about this singular point indicates the role of higher order correlations in determining the features discovered by ICA. The application and convergence of these algorithms are demonstrated on the learning of edge features as the independent components of natural images.
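The Fixed Point ICA algorithm discussed in the abstract is widely known as FastICA. As a minimal illustrative sketch (assuming pre-whitened data and a `tanh` nonlinearity, which is a common choice and not necessarily the paper's exact formulation), the one-unit fixed-point update can be written as:

```python
import numpy as np

def whiten(X):
    """PCA-whiten the data matrix X (shape d x n): zero mean, identity covariance."""
    Xc = X - X.mean(axis=1, keepdims=True)
    cov = Xc @ Xc.T / Xc.shape[1]
    eigval, eigvec = np.linalg.eigh(cov)
    # W = E diag(1/sqrt(lambda)) E^T is the whitening transform
    return (eigvec / np.sqrt(eigval)) @ eigvec.T @ Xc

def fastica_one_unit(X, n_iter=200, tol=1e-6, seed=0):
    """One-unit fixed-point ICA iteration on whitened data X (shape d x n).

    Update rule: w <- E[x g(w.x)] - E[g'(w.x)] w, followed by normalization,
    with g(u) = tanh(u). Illustrative sketch only.
    """
    d, _ = X.shape
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        u = w @ X                                      # projections, shape (n,)
        g, gp = np.tanh(u), 1.0 - np.tanh(u) ** 2      # nonlinearity and derivative
        w_new = (X * g).mean(axis=1) - gp.mean() * w   # fixed-point update
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < tol:            # converged up to a sign flip
            return w_new
        w = w_new
    return w
```

Applied to patches of natural images (after whitening), iterating this update for several weight vectors with a decorrelation step yields the localized, oriented edge filters that the paper describes as the independent components of natural scenes.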
Algorithms for independent components analysis and higher order statistics
Authors: Lee D., Rokni U., Sompolinsky H.
Year of publication: 1999
Journal: Advances in Neural Information Processing Systems 12. Kearns M.J., Solla S.A., and Cohn D.A., Editors. MIT Press, Cambridge MA. (2000)