Elements of information theory
ICA using spacings estimates of entropy
The Journal of Machine Learning Research
On the entropy minimization of a linear mixture of variables for source separation
Signal Processing - Special issue: Information theoretic signal processing
Information-theoretic inequalities for contoured probability distributions
IEEE Transactions on Information Theory
Cramér-Rao and moment-entropy inequalities for Rényi entropy and generalized Fisher information
IEEE Transactions on Information Theory
Rényi’s entropy can be used as a cost function for blind source separation (BSS). Previous work has emphasized the advantage of setting Rényi’s exponent to a value different from one in the context of BSS. In this paper, we focus on the minimization of the zero-order Rényi entropy for the blind extraction of bounded sources (BEBS). We point out the advantage of choosing the extended zero-order Rényi entropy as a cost function for BEBS when the sources have non-convex supports.
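The idea behind the zero-order criterion can be illustrated numerically: for a bounded signal, the zero-order Rényi entropy is the logarithm of the support measure, which for a 1-D signal can be estimated by its sample range. The sketch below (a hypothetical demo, not the authors' algorithm) mixes two bounded uniform sources, whitens the mixtures, and sweeps extraction directions; the support-measure estimate is minimized at the source directions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two bounded (uniform) sources and a linear mixture (assumed demo setup)
S = rng.uniform(-1.0, 1.0, size=(2, 5000))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S

# Whiten the mixtures so extraction reduces to picking a rotation angle
Xc = X - X.mean(axis=1, keepdims=True)
cov = Xc @ Xc.T / Xc.shape[1]
d, E = np.linalg.eigh(cov)
Z = (E / np.sqrt(d)) @ E.T @ Xc  # whitened data, identity covariance

def h0_range(y):
    # Zero-order Renyi entropy of a bounded 1-D signal:
    # log of the support measure, estimated by the sample range.
    return np.log(y.max() - y.min())

# Sweep unit-norm extraction directions over a half circle;
# minima of the zero-order entropy align with source directions.
thetas = np.linspace(0.0, np.pi, 181)
h = [h0_range(np.cos(t) * Z[0] + np.sin(t) * Z[1]) for t in thetas]
best = thetas[int(np.argmin(h))]
y = np.cos(best) * Z[0] + np.sin(best) * Z[1]
```

Under whitening, each extracted signal has unit variance, so the sum of two independent uniforms has a strictly wider support than a single uniform source; the extracted `y` should therefore be strongly correlated with one of the original sources.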