Abstract 
Independent component analysis (ICA) is a probabilistic method for learning
a linear transform of a random vector. The goal is to find components
that are maximally independent and non-Gaussian (non-normal). Its
fundamental difference from classical multivariate statistical methods
lies in the assumption of non-Gaussianity, which enables the
identification of the original, underlying components, something
classical methods cannot achieve.
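As an illustrative sketch of the basic model described above (not part of the talk itself): two independent non-Gaussian sources are mixed by a linear transform, and a simple FastICA-style fixed-point iteration with whitening recovers them. The mixing matrix, source distribution, and nonlinearity below are illustrative choices, not taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Two independent non-Gaussian (here sub-Gaussian, uniform) sources.
S = rng.uniform(-1, 1, size=(2, n))
A = np.array([[2.0, 1.0], [1.0, 1.5]])  # illustrative mixing matrix
X = A @ S                               # observed mixtures

# Center and whiten the observations.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc   # whitened data, identity covariance

# FastICA fixed-point iteration (tanh nonlinearity) with deflation:
# each row of W extracts one maximally non-Gaussian component.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        g = np.tanh(w @ Z)
        w_new = (Z * g).mean(axis=1) - (1 - g ** 2).mean() * w
        # Gram-Schmidt deflation against already-found components.
        w_new -= W[:i].T @ (W[:i] @ w_new)
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1) < 1e-10
        w = w_new
        if converged:
            break
    W[i] = w

S_hat = W @ Z  # estimated components (up to sign and permutation)
```

As is standard in ICA, the components are recovered only up to sign, scale, and permutation, so a comparison with the true sources should use absolute correlations.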
In this talk, I provide an overview of the theory of ICA, as well as
of recent developments in that theory. The main recent
topics are: testing independent components, analysing multiple data
sets (three-way data), analysis of causal relations, modelling
dependencies between the components, and improved methods for
estimating the basic model.
