Elements of information theory.
Independent component analysis, a new concept? Signal Processing, Special Issue on Higher Order Statistics.
Learning from examples with information theoretic criteria. Journal of VLSI Signal Processing Systems.
Efficiency of high-order moment estimates. IEEE Transactions on Signal Processing.
Equivariant adaptive source separation. IEEE Transactions on Signal Processing.
Fast and robust fixed-point algorithms for independent component analysis. IEEE Transactions on Neural Networks.
Generalized information potential criterion for adaptive system training. IEEE Transactions on Neural Networks.
Is the general form of Renyi's entropy a contrast for source separation? ICA'07: Proceedings of the 7th International Conference on Independent Component Analysis and Signal Separation.
Robust independent component analysis using quadratic negentropy. ICA'07: Proceedings of the 7th International Conference on Independent Component Analysis and Signal Separation.
A linear discriminant analysis method based on mutual information maximization. Pattern Recognition.
An EM method for spatio-temporal blind source separation using an AR-MOG source model. ICA'06: Proceedings of the 6th International Conference on Independent Component Analysis and Blind Signal Separation.
An extensive analysis of a non-parametric, information-theoretic method for instantaneous blind source separation (BSS) is presented. Based on this analysis, a modified stochastic information gradient estimator is proposed that reduces the computational complexity and allows the separation of sub-Gaussian sources. Interestingly, the modification enables the method to exploit the spatial and spectral diversity of the sources simultaneously. Consequently, the new algorithm can separate i.i.d. sources, which requires higher-order spatial statistics, and it can also separate temporally correlated Gaussian sources, which requires temporal statistics. Three reasons are given why Renyi's entropy estimators for Information-Theoretic Learning (ITL), on which the proposed method is based, are preferable to Shannon's entropy estimators for ITL. Also included is an extensive comparison of the proposed method with JADE, Infomax, Comon's MI, FastICA, and a non-parametric, information-theoretic method based on Shannon's entropy. Performance is compared as a function of data length, source kurtosis, number of sources, and the stationarity/correlation of the sources.
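The abstract does not spell out the estimator, but the ITL literature it builds on typically estimates Renyi's quadratic entropy through the Parzen-window "information potential." The following is a minimal sketch of that standard estimator, not the paper's modified stochastic information gradient; the Gaussian kernel and the bandwidth parameter `sigma` are assumptions for illustration.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy
    H2 = -log E[p(x)], a common quantity in ITL.

    The information potential V = (1/N^2) * sum_ij G(x_i - x_j),
    where G is a Gaussian kernel with variance 2*sigma^2
    (the convolution of two sigma-width kernels). Then H2 = -log V.
    """
    x = np.asarray(x, dtype=float)
    n = x.shape[0]
    # Pairwise differences between all samples
    d = x[:, None] - x[None, :]
    # Gaussian kernel with variance 2*sigma^2, summed over all pairs
    v = np.exp(-d**2 / (4.0 * sigma**2)).sum()
    v /= n**2 * np.sqrt(4.0 * np.pi * sigma**2)
    return -np.log(v)
```

A practical appeal of this form, and one reason Renyi's entropy is often preferred in ITL, is that the pairwise-kernel sum can be evaluated directly from samples without an explicit density model; a more spread-out sample yields a smaller information potential and hence a larger entropy estimate.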