On the Risk of Using Rényi's Entropy for Blind Source Separation

  • Authors:
  • Dinh-Tuan Pham; F. Vrins; M. Verleysen

  • Affiliations:
  • Lab. Jean Kuntzmann, Grenoble; -; -

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2008

Abstract

Recently, some researchers have suggested Rényi's entropy in its general form as a blind source separation (BSS) objective function. This suggestion was motivated by two arguments: (1) Shannon's entropy, which is known to be a suitable criterion for BSS, is a particular case of Rényi's entropy, and (2) some practical advantages can be obtained by choosing another specific value of the Rényi exponent, yielding, e.g., the quadratic entropy. Unfortunately, when doing so, there is no longer a guarantee that optimizing this generalized criterion will lead to recovering the original sources. In this paper, we show that Rényi's entropy in its exact form (i.e., independently of any consideration about its practical estimation or computation) may fail to recover the sources, depending on the source densities and on the value of the Rényi exponent. This is illustrated on specific examples. We also compare our conclusions with previous works involving Rényi's entropies for blind deconvolution.
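
For context, a brief sketch of the standard definitions the abstract relies on (not part of the original abstract): Rényi's differential entropy of order \alpha > 0, \alpha \neq 1, for a random variable X with density p_X is

H_\alpha(X) = \frac{1}{1 - \alpha} \log \int p_X(x)^{\alpha} \, dx .

In the limit \alpha \to 1 this reduces to Shannon's differential entropy H(X) = -\int p_X(x) \log p_X(x) \, dx, which is argument (1) above, while \alpha = 2 gives the quadratic entropy H_2(X) = -\log \int p_X(x)^{2} \, dx, the case usually retained for the practical estimation advantages mentioned in argument (2).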