An information theoretic approach to joint approximate diagonalization

  • Authors:
  • Yoshitatsu Matsuda; Kazunori Yamaguchi

  • Affiliations:
  • Department of Integrated Information Technology, Aoyama Gakuin University, Sagamihara-shi, Kanagawa, Japan; Department of General Systems Studies, Graduate School of Arts and Sciences, The University of Tokyo, Meguro-ku, Tokyo, Japan

  • Venue:
  • ICONIP'11: Proceedings of the 18th International Conference on Neural Information Processing - Volume Part I
  • Year:
  • 2011

Abstract

Joint approximate diagonalization (JAD) is a method for blind source separation that can extract non-Gaussian sources without any other prior knowledge. However, because JAD is based on an algebraic approach, it is not robust when the sample size is small. Here, JAD is improved by an information-theoretic approach. First, the “true” probability distribution of the diagonalized cumulants in JAD is estimated under some simple conditions. Next, a new objective function is defined as the Kullback-Leibler divergence between this true distribution and the estimated distribution of the current cumulants. Although it is similar to the usual JAD objective function, it has a positive lower bound. An improvement of JAD that exploits this lower bound is then proposed. Numerical experiments verify the validity of the approach for a small number of samples.
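To make the classical setting concrete, the sketch below shows plain JAD as the abstract's baseline: a set of symmetric (cumulant-like) matrices is approximately diagonalized by one orthogonal transform found via Jacobi-style pairwise rotations, minimizing the sum of squared off-diagonal entries. This is a generic illustration, not the authors' information-theoretic variant; the grid search over the rotation angle is a didactic stand-in for the closed-form Jacobi angle used in standard JAD algorithms, and all function names are ours.

```python
import numpy as np

def offdiag_cost(mats):
    """Sum of squared off-diagonal entries over all matrices
    (the usual algebraic JAD objective)."""
    return sum(np.sum(M ** 2) - np.sum(np.diag(M) ** 2) for M in mats)

def joint_diagonalize(mats, sweeps=10, grid=256):
    """Approximately diagonalize symmetric matrices with one orthogonal V.

    Jacobi-style sweeps: for each index pair (p, q), a Givens rotation
    angle is chosen by coarse grid search to minimize the resulting
    (p, q) off-diagonal energy summed over all matrices."""
    mats = [M.copy() for M in mats]
    n = mats[0].shape[0]
    V = np.eye(n)
    thetas = np.linspace(-np.pi / 4, np.pi / 4, grid)
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                # After rotating by theta, the new (p, q) entry of a
                # symmetric M is (c^2 - s^2) M[p,q] + c s (M[p,p] - M[q,q]).
                best = min(thetas, key=lambda t: sum(
                    ((np.cos(t) ** 2 - np.sin(t) ** 2) * M[p, q]
                     + np.cos(t) * np.sin(t) * (M[p, p] - M[q, q])) ** 2
                    for M in mats))
                c, s = np.cos(best), np.sin(best)
                G = np.eye(n)
                G[p, p] = G[q, q] = c
                G[p, q], G[q, p] = s, -s
                mats = [G.T @ M @ G for M in mats]
                V = V @ G
    return V, mats
```

On an exactly jointly diagonalizable set (C_k = A D_k A^T with a common orthogonal A), the off-diagonal cost drops by orders of magnitude within a few sweeps; with finite-sample cumulant estimates the residual stays bounded away from zero, which is the situation the paper's positive lower bound addresses.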