Bayesian Unsupervised Learning for Source Separation with Mixture of Gaussians Prior

  • Authors:
  • Hichem Snoussi; Ali Mohammad-Djafari

  • Affiliations:
  • Laboratoire des Signaux et Systèmes (CNRS, SUPÉLEC, UPS), SUPÉLEC, Plateau de Moulon, 91192 Gif-sur-Yvette Cedex, France (both authors)

  • Venue:
  • Journal of VLSI Signal Processing Systems
  • Year:
  • 2004

Abstract

This paper considers the problem of source separation in the case of noisy instantaneous mixtures. In previous work [1], the sources were modeled by a mixture of Gaussians, leading to a hierarchical Bayesian model in which the mixture labels are treated as i.i.d. hidden variables. We extend this model by incorporating a Markovian structure for the labels. This extension is important for many practical applications, such as unsupervised classification and segmentation, pattern recognition, and speech signal processing. To estimate the mixing matrix and the prior model parameters, we treat the observations as incomplete data. The missing data are the sources and the labels: the sources are missing data for the observations, and the labels are in turn missing data for the (themselves missing) sources. This hierarchical model leads to restoration-maximization type algorithms, whose restoration step can be carried out in three different ways: (i) the complete likelihood is replaced by its conditional expectation, which yields the EM (expectation-maximization) algorithm [2]; (ii) the missing data are estimated by their maximum a posteriori values, which yields the JMAP (joint maximum a posteriori) algorithm [3]; (iii) the missing data are sampled from their a posteriori distributions, which yields the SEM (stochastic EM) algorithm [4]. A Gibbs sampling scheme is implemented to generate the missing data. We also introduce a relaxation strategy into these algorithms to reduce the computational cost, which grows exponentially with the number of source components and the number of Gaussian components in the mixture.
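
As a rough illustration of the restoration-maximization scheme described in the abstract, the following minimal SEM-type sketch (Python, not the authors' code) Gibbs-samples the missing data (labels, then sources) and then re-estimates the mixing matrix and the mixture-of-Gaussians parameters from the completed data. It assumes i.i.d. labels (the Markovian extension is omitted) and a known, isotropic noise variance; all names and update rules are illustrative assumptions.

```python
# Minimal SEM-type sketch, assuming a simplified model: i.i.d. labels (no Markov
# structure) and a known, isotropic noise variance. All names and update rules
# here are illustrative assumptions, not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

def sem_separation(X, n_src=2, K=2, n_iter=100, noise_var=0.1):
    """Restoration-maximization for noisy instantaneous mixtures X = A S + noise.

    Restoration: Gibbs-sample the missing data (labels, then sources).
    Maximization: re-estimate A and the mixture-of-Gaussians parameters.
    """
    n_obs, T = X.shape
    A = rng.standard_normal((n_obs, n_src))   # mixing matrix
    mu = rng.standard_normal((n_src, K))      # MoG means, one row per source
    var = np.ones((n_src, K))                 # MoG variances
    w = np.full((n_src, K), 1.0 / K)          # MoG weights
    S = rng.standard_normal((n_src, T))       # initial source samples

    for _ in range(n_iter):
        # --- Restoration step (stochastic) ---
        # 1) Sample labels z_{j,t} ~ p(z | s) proportional to w_{j,k} N(s_{j,t}; mu_{j,k}, var_{j,k}).
        Z = np.empty((n_src, T), dtype=int)
        for j in range(n_src):
            logp = (np.log(w[j]) - 0.5 * np.log(2 * np.pi * var[j])
                    - 0.5 * (S[j][:, None] - mu[j]) ** 2 / var[j])      # (T, K)
            p = np.exp(logp - logp.max(axis=1, keepdims=True))
            p /= p.sum(axis=1, keepdims=True)
            Z[j] = (p.cumsum(axis=1) > rng.random((T, 1))).argmax(axis=1)

        # 2) Sample sources from their Gaussian posterior: the MoG component selected
        #    by the labels acts as the prior, the linear mixture as the likelihood.
        m = np.take_along_axis(mu, Z, axis=1)     # (n_src, T) prior means
        v = np.take_along_axis(var, Z, axis=1)    # (n_src, T) prior variances
        AtA = A.T @ A / noise_var
        AtX = A.T @ X / noise_var
        for t in range(T):
            P = AtA + np.diag(1.0 / v[:, t])                  # posterior precision
            C = np.linalg.inv(P)                              # posterior covariance
            mean = C @ (AtX[:, t] + m[:, t] / v[:, t])        # posterior mean
            S[:, t] = rng.multivariate_normal(mean, C)

        # --- Maximization step: least squares for A, empirical moments for the MoG ---
        A = X @ S.T @ np.linalg.inv(S @ S.T)
        for j in range(n_src):
            for k in range(K):
                idx = Z[j] == k
                if idx.any():
                    w[j, k] = idx.mean()
                    mu[j, k] = S[j, idx].mean()
                    var[j, k] = S[j, idx].var() + 1e-6
            w[j] /= w[j].sum()
    return A, S
```

Replacing the sampling in the restoration step by conditional expectations or by maximum a posteriori estimates would give, respectively, EM-like and JMAP-like variants of the same loop, in the spirit of the three options listed in the abstract.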