Exploring Latent Structure of Mixture ICA Models by the Minimum β-Divergence Method

  • Authors:
  • Md. Nurul Haque Mollah; Mihoko Minami; Shinto Eguchi

  • Affiliations:
  • Md. Nurul Haque Mollah: Department of Statistical Science, Graduate University for Advanced Studies, Minato-ku, Tokyo 106-8569, Japan; Mihoko Minami: Institute of Statistical Mathematics and the Graduate University for Advanced Studies, Minato-ku, Tokyo 106-8569, Japan; Shinto Eguchi: Institute of Statistical Mathematics and the Graduate University for Advanced Studies, Minato-ku, Tokyo 106-8569, Japan

  • Venue:
  • Neural Computation
  • Year:
  • 2006


Abstract

Independent component analysis (ICA) attempts to extract the original independent signals (source components) from observations in which they are linearly mixed. This letter discusses a learning algorithm for separating different source classes when the observed data follow a mixture of several ICA models, each described by a linear combination of independent, nongaussian sources. The proposed method applies the minimum β-divergence method sequentially to separate the source classes one by one. It searches for the recovering matrix of each class according to a rule for sequentially changing the shifting parameter. If the initial value of the shifting parameter vector is close to the mean of a data class, then all of the hidden sources belonging to that class are recovered with independent, nongaussian structure, while the data in the other classes are treated as outliers. The value of the tuning parameter β is key to the performance of the proposed method. A cross-validation technique is proposed for adaptively selecting β, and the algorithm is illustrated with applications to both real and synthetic data.
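The mechanism the abstract describes, in which data far from the current shifting parameter are effectively treated as outliers, can be illustrated with a minimal sketch. The snippet below is not the authors' algorithm; it is a toy fixed-point iteration showing how a β-divergence-type weighting (Gaussian-kernel weights raised through the tuning parameter β, with a hypothetical scale `sigma2`) localizes a location estimate onto whichever data class the shifting parameter is initialized near, while a second class receives negligible weight.

```python
import numpy as np

def beta_weights(x, mu, sigma2, beta):
    # Downweighting factor induced by the beta-divergence with a Gaussian
    # working model: points far from mu get weight ~0, so observations
    # from other classes act as outliers to the current fit.
    d2 = np.sum((x - mu) ** 2, axis=1)
    return np.exp(-beta * d2 / (2.0 * sigma2))

def weighted_location(x, mu0, sigma2=1.0, beta=0.5, iters=50):
    # Toy fixed-point iteration: repeatedly recompute the weighted mean
    # under the current weights, starting from the shifting-parameter
    # guess mu0. Converges to the center of the nearby class.
    mu = np.asarray(mu0, dtype=float)
    for _ in range(iters):
        w = beta_weights(x, mu, sigma2, beta)
        mu = (w[:, None] * x).sum(axis=0) / w.sum()
    return mu

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two well-separated classes, mimicking a two-component mixture.
    class_a = rng.normal(0.0, 0.5, size=(100, 2))
    class_b = rng.normal(10.0, 0.5, size=(100, 2))
    x = np.vstack([class_a, class_b])

    # Initialized near class A, the estimate stays on class A's mean;
    # class B is downweighted to (numerically) zero influence.
    mu_a = weighted_location(x, mu0=[1.0, 1.0])
    print(mu_a)
```

With β → 0 the weights flatten toward 1 and the estimate drifts to the global mean of all classes, which is one way to see why the choice of β (here via cross-validation in the paper) governs the method's ability to isolate one class at a time.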