An analysis of entropy estimators for blind source separation

  • Authors:
  • Kenneth E. Hild, II; Deniz Erdogmus; Jose C. Principe

  • Affiliations:
  • Department of Radiology, The University of California at San Francisco, San Francisco, CA; Departments of Computer Science and Engineering and Biomedical Engineering, Oregon Health & Science University, Beaverton, OR; Department of Electrical and Computer Engineering, The University of Florida, Gainesville, FL

  • Venue:
  • Signal Processing
  • Year:
  • 2006


Abstract

An extensive analysis of a non-parametric, information-theoretic method for instantaneous blind source separation (BSS) is presented. As a result, a modified stochastic information gradient estimator is proposed that reduces the computational complexity and allows the separation of sub-Gaussian sources. Interestingly, the modification enables the method to exploit the spatial and spectral diversity of the sources simultaneously. Consequently, the new algorithm is able to separate i.i.d. sources, which requires higher-order spatial statistics, and it is also able to separate temporally correlated Gaussian sources, which requires temporal statistics. Three reasons are given why Renyi's entropy estimators for Information-Theoretic Learning (ITL), on which the proposed method is based, are to be preferred over Shannon's entropy estimators for ITL. Also contained herein is an extensive comparison of the proposed method with JADE, Infomax, Comon's MI, FastICA, and a non-parametric, information-theoretic method based on Shannon's entropy. Performance comparisons are shown as a function of the data length, source kurtosis, number of sources, and stationarity/correlation of the sources.
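As a concrete illustration of the kind of ITL quantity the abstract refers to (a sketch, not the paper's algorithm): Renyi's quadratic entropy admits a simple non-parametric estimator via Parzen windowing with Gaussian kernels, where the pairwise kernel sum is the so-called information potential. The kernel width `sigma` below is an arbitrary, assumed choice:

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy H2(X).

    Computes the information potential
        V = (1/N^2) * sum_{i,j} G(x_i - x_j; 2*sigma^2),
    where G is a zero-mean Gaussian kernel, and returns H2 = -log(V).
    The kernel width sigma is a free parameter (illustrative default).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    diffs = x[:, None] - x[None, :]          # all pairwise differences
    var = 2.0 * sigma ** 2                    # variance of the convolved kernel
    kernel = np.exp(-diffs ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    v = kernel.sum() / n ** 2                 # information potential
    return -np.log(v)
```

More dispersed samples yield a smaller information potential and hence a larger entropy estimate, which is the monotonic relationship gradient-based ITL methods exploit.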