This paper presents a new algorithm for the independent component analysis (ICA) problem based on an efficient entropy estimator. Like many previous methods, the algorithm directly minimizes a measure of departure from independence: the estimated Kullback-Leibler divergence between the joint distribution and the product of the marginal distributions. We pair this approach with efficient entropy estimators from the statistics literature. In particular, the entropy estimator we use is consistent and exhibits rapid convergence. The algorithm based on this estimator is simple, computationally efficient, intuitively appealing, and outperforms other well-known algorithms. In addition, the estimator's relative insensitivity to outliers translates into superior performance by our ICA algorithm on outlier tests. We present favorable comparisons to the Kernel ICA, FAST-ICA, JADE, and extended Infomax algorithms in extensive simulations. We also provide public domain source code for our algorithms.
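The approach described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a classical m-spacing (Vasicek-style) entropy estimator as the "efficient entropy estimator", and uses the standard fact that, for prewhitened data and orthogonal unmixing matrices, the KL divergence between the joint density and the product of marginals differs from the sum of marginal entropies only by a constant, so minimizing the latter seeks independence. The function names `spacing_entropy` and `ica_contrast` are illustrative, not from the paper.

```python
import numpy as np

def spacing_entropy(x, m=None):
    """m-spacing (Vasicek-style) differential entropy estimate.

    This is an assumed stand-in for the paper's entropy estimator;
    the exact estimator used there may differ.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(np.sqrt(n)))  # common heuristic for the spacing order
    # Gaps between order statistics m apart, scaled so a uniform
    # sample on [0, 1] yields an estimate near 0.
    spacings = np.maximum(x[m:] - x[:-m], 1e-12)  # guard against ties
    return float(np.mean(np.log(n / m * spacings)))

def ica_contrast(W, X):
    """Sum of estimated marginal entropies of Y = W @ X.

    For prewhitened X and orthogonal W, this equals the KL-based
    dependence measure up to an additive constant, so minimizing it
    over rotations W drives the components toward independence.
    """
    Y = W @ X
    return sum(spacing_entropy(y) for y in Y)
```

As a usage sketch: mix two independent uniform sources with a rotation `R`; the contrast evaluated at the true unmixing rotation `R.T` is lower than at the identity, so a search over rotations (e.g., over Jacobi angles) recovers the sources.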