Is the general form of Renyi's entropy a contrast for source separation?

  • Authors (with affiliations):
  • Frédéric Vrins (Machine Learning Group, Université catholique de Louvain, Louvain-la-Neuve, Belgium)
  • Dinh-Tuan Pham (Laboratoire de Modélisation et Calcul, Centre National de la Recherche Scientifique, Grenoble, France)
  • Michel Verleysen (Machine Learning Group, Université catholique de Louvain, Louvain-la-Neuve, Belgium)

  • Venue:
  • ICA'07 Proceedings of the 7th international conference on Independent component analysis and signal separation
  • Year:
  • 2007

Abstract

Rényi's entropy-based criteria have been proposed as objective functions for independent component analysis because of their relationship with Shannon's entropy and their computational advantages in specific cases. These criteria were suggested on the basis of "convincing" experiments. However, there is no theoretical proof that globally maximizing those functions leads to separated sources; this was implicitly conjectured. In this paper, the problem is tackled theoretically: it is shown that globally maximizing the Rényi entropy-based criterion, in its general form, does not necessarily yield the expected independent signals. Whether the corresponding criteria are contrast functions depends simultaneously on the value of the Rényi parameter and on the (unknown) source densities.
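The criterion in question is Rényi's entropy, H_α(X) = (1/(1−α)) log ∫ p_X(x)^α dx, which reduces to Shannon's entropy as α → 1. As a rough illustration of the kind of quantity such criteria estimate (a sketch only, not the paper's construction or proof), the snippet below computes the quadratic (α = 2) Rényi entropy via a Gaussian Parzen window, a standard estimator in information-theoretic learning, and compares a unit-variance uniform source with an equal-variance two-source mixture. The kernel width, sample size, and uniform source densities are illustrative assumptions.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=0.25):
    """Parzen-window estimate of Renyi's quadratic entropy H_2(X).

    With a Gaussian kernel of width sigma, the plug-in estimator is
    H_2 ~= -log( (1/N^2) * sum_{i,j} G(x_i - x_j; 2*sigma^2) ).
    The width sigma = 0.25 is an illustrative choice, not tuned.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    diffs = x[:, None] - x[None, :]          # pairwise differences
    var = 2.0 * sigma**2                     # kernel variances add under convolution
    kernel = np.exp(-diffs**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    return -np.log(kernel.sum() / n**2)

rng = np.random.default_rng(0)
# Two independent unit-variance uniform sources (illustrative densities).
s1 = rng.uniform(-np.sqrt(3), np.sqrt(3), 2000)
s2 = rng.uniform(-np.sqrt(3), np.sqrt(3), 2000)
# Unit-variance mixture: closer to Gaussian than either source.
mix = (s1 + s2) / np.sqrt(2)

# For these particular (uniform) sources the mixture has the larger
# quadratic entropy; the paper shows this ordering is NOT guaranteed
# for general source densities and Renyi parameters.
print(renyi_quadratic_entropy(s1), renyi_quadratic_entropy(mix))
```

Note the design choice of keeping all signals at unit variance: entropy-based contrasts are only meaningful under a scale constraint, since entropy grows with variance regardless of the mixing.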