We present a generalization of independent component analysis (ICA), where instead of looking for a linear transform that makes the data components independent, we look for a transform that makes the data components well fit by a tree-structured graphical model. Treating this as a semiparametric statistical problem, we show that the optimal transform is found by minimizing a contrast function based on mutual information, a function that directly extends the contrast function used for classical ICA. We provide two approximations of this contrast function, one using kernel density estimation and another using kernel generalized variance. This tree-dependent component analysis framework leads naturally to an efficient general multivariate density estimation technique in which only bivariate density estimation needs to be performed.
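The tree-structured part of the contrast above rewards transforms whose components carry high pairwise mutual information along the edges of some spanning tree. As a rough illustration (my own sketch, not the authors' algorithm: the histogram MI estimator, bin count, and Prim-style tree search are all assumptions for the example), one can estimate pairwise mutual information between components and pick the maximum-weight spanning tree, Chow-Liu style:

```python
import numpy as np

def pairwise_mi(s, bins=16):
    """Histogram estimate of mutual information for every pair of components.
    s: (m, n) array with m components and n samples."""
    m = s.shape[0]
    mi = np.zeros((m, m))
    for i in range(m):
        for j in range(i + 1, m):
            pxy, _, _ = np.histogram2d(s[i], s[j], bins=bins)
            pxy /= pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)   # marginal of component i
            py = pxy.sum(axis=0, keepdims=True)   # marginal of component j
            nz = pxy > 0
            mi[i, j] = mi[j, i] = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
    return mi

def max_spanning_tree(mi):
    """Maximum-weight spanning tree over the MI matrix (Prim's algorithm).
    Returns a list of (i, j) edges; the sum of their MI weights is the
    tree-dependent term that the contrast subtracts from the ICA contrast."""
    m = mi.shape[0]
    in_tree = {0}
    edges = []
    while len(in_tree) < m:
        best = max(((i, j) for i in in_tree for j in range(m) if j not in in_tree),
                   key=lambda e: mi[e])
        edges.append(best)
        in_tree.add(best[1])
    return edges

# Toy data: components 0 and 1 are strongly dependent, component 2 is not.
rng = np.random.default_rng(0)
a = rng.standard_normal(5000)
s = np.vstack([a,
               a + 0.1 * rng.standard_normal(5000),
               rng.standard_normal(5000)])
tree = max_spanning_tree(pairwise_mi(s))
```

On this toy data the recovered tree links components 0 and 1, since their mutual information dominates every other pair. In the full framework this tree search would sit inside the outer minimization over linear transforms.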