Criteria based on mutual information minimization for blind source separation in post nonlinear mixtures

  • Authors:
  • Sophie Achard; Dinh-Tuan Pham; Christian Jutten

  • Affiliations:
  • Brain Mapping Unit, Department of Psychiatry, University of Cambridge, Downing Site, Cambridge, UK and University of Grenoble, IMAG, Laboratory of Modelling and Computation, Grenoble Cedex, France; University of Grenoble, IMAG, Laboratory of Modelling and Computation, Grenoble Cedex, France; University of Grenoble, INPG, Laboratory of Images and Signals, Grenoble Cedex, France

  • Venue:
  • Signal Processing - Special issue: Information theoretic signal processing
  • Year:
  • 2005

Abstract

This work addresses the problem of blind source separation solved by minimization of mutual information. After choosing a model for the mixture, we focus on two methods. The first is based on minimizing an estimator of I, the mutual information itself. The second minimizes an estimator of C, the mutual information obtained after transforming all the joint entropy terms. We show the differences between these two approaches by studying the statistical properties of the two estimators.

In this paper, we derive the bias of the estimators of the two criteria I and C. It is shown that, under the hypothesis of independence, the estimator of I is asymptotically unbiased even if the bandwidth is kept fixed, whereas with a fixed bandwidth the estimator of C is not asymptotically unbiased.

Further, the minimization is achieved by a relative gradient descent method, and we show the differences between criteria I and C through the expressions of their relative gradients.
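
To make the abstract's ingredients concrete, here is a minimal sketch, not the authors' method: the paper treats post-nonlinear mixtures, whereas this illustration specializes to a plain linear instantaneous mixture, where a criterion of type C reduces to the sum of marginal entropies minus log|det B| (up to the constant joint entropy of the observations). The function names (`marginal_entropy`, `criterion_C`, `score`, `relative_gradient_step`), the Gaussian Parzen kernel with a fixed bandwidth, and the relative-gradient update B <- (I - mu*G)B with G = E[psi(y) y^T] - I are our assumptions following the standard mutual-information ICA recipe, not the exact estimators analysed in the paper.

```python
import numpy as np

def marginal_entropy(y, bandwidth=0.3):
    """Parzen (Gaussian-kernel, fixed bandwidth) estimate of the entropy of a 1-D signal."""
    diffs = y[:, None] - y[None, :]
    dens = np.exp(-0.5 * (diffs / bandwidth) ** 2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))
    return -np.mean(np.log(dens))

def criterion_C(B, x, bandwidth=0.3):
    """Linear-case analogue of criterion C: sum_i H(y_i) - log|det B|, with y = B x."""
    y = B @ x
    return sum(marginal_entropy(yi, bandwidth) for yi in y) - np.log(abs(np.linalg.det(B)))

def score(y, bandwidth=0.3):
    """Kernel estimate of the score functions psi_i(y_i) = -d/dy log p_i(y_i)."""
    psi = np.empty_like(y)
    for i, yi in enumerate(y):
        diffs = yi[:, None] - yi[None, :]
        w = np.exp(-0.5 * (diffs / bandwidth) ** 2)
        # ratio -p'(y)/p(y); the kernel normalisation constant cancels
        psi[i] = (w * diffs).sum(axis=1) / (bandwidth ** 2 * w.sum(axis=1))
    return psi

def relative_gradient_step(B, x, mu=0.05, bandwidth=0.3):
    """One relative-gradient update B <- (I - mu*G) B with G = E[psi(y) y^T] - I."""
    y = B @ x
    n_samples = y.shape[1]
    G = score(y, bandwidth) @ y.T / n_samples - np.eye(B.shape[0])
    return (np.eye(B.shape[0]) - mu * G) @ B

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    s = rng.laplace(size=(2, 2000))          # independent non-Gaussian sources
    A = np.array([[1.0, 0.6], [0.4, 1.0]])   # unknown mixing matrix
    x = A @ s                                # observed linear mixture
    B = np.eye(2)
    for _ in range(200):
        B = relative_gradient_step(B, x)
    print("C(B) after descent:", criterion_C(B, x))
```

The sketch mirrors the abstract's structure: the criterion is an entropy-based contrast evaluated with a fixed-bandwidth kernel estimator, and the separation matrix is updated by relative gradient descent; the paper's contribution concerns the bias of such estimators (for I versus C) and the corresponding relative gradients in the post-nonlinear setting.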