Equivariant adaptive separation via independence (EASI) is one of the most successful algorithms for blind source separation (BSS). However, the user must choose the non-linearities, and simple (but non-optimal) cubic polynomials are usually applied. In this paper, we address the optimal choice of these non-linearities and show that the optimal non-linearity is the output score function difference (SFD). Contrary to the simple non-linearities usually used in EASI (such as cubic polynomials), the optimal choice is neither component-wise nor fixed: it is a multivariate function that depends on the output distributions. Finally, we derive three adaptive algorithms for estimating the SFD, yielding "quasi-optimal" EASI algorithms whose separation performance is much better than that of "standard" EASI and which, in particular, converge for any sources.
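For readers unfamiliar with the baseline being improved upon, the following is a minimal sketch of the standard serial EASI update with the usual cubic non-linearity g(y) = y^3 (i.e. the "standard" EASI discussed above, not the SFD-based optimal choice). The source model, mixing matrix, step size, and sample count are illustrative assumptions; with unit-variance sub-Gaussian sources, the cubic non-linearity satisfies the usual EASI stability condition.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 2, 50000

# Two independent, zero-mean, unit-variance sub-Gaussian sources (uniform).
S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(n, T))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])        # unknown mixing matrix (assumed for the demo)
X = A @ S                          # observed mixtures

B = np.eye(n)                      # separating matrix estimate
lam = 0.001                        # small fixed step size (illustrative)
I = np.eye(n)

for t in range(T):
    y = B @ X[:, t:t + 1]          # current output, shape (n, 1)
    g = y ** 3                     # component-wise cubic non-linearity
    # Serial EASI relative-gradient update:
    #   symmetric term (y y^T - I) whitens the output,
    #   skew-symmetric term (g y^T - y g^T) removes higher-order dependence.
    H = (y @ y.T - I) + (g @ y.T - y @ g.T)
    B = B - lam * H @ B

Y = B @ X                          # recovered sources (up to permutation/scale)
```

Each row of `Y` should match one source up to sign, scale, and permutation; one can check this by correlating the outputs with the true sources.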