Mixture of the robust L1 distributions and its applications

  • Authors:
  • Junbin Gao; Richard Y. Xu

  • Affiliations:
  • School of Acc & Computer Science, Charles Sturt University, Bathurst, NSW, Australia; School of Acc & Computer Science, Charles Sturt University, Bathurst, NSW, Australia

  • Venue:
  • AI'07: Proceedings of the 20th Australian Joint Conference on Advances in Artificial Intelligence
  • Year:
  • 2007

Abstract

Recently a robust probabilistic L1-PCA model was introduced in [1] by replacing the conventional Gaussian noise model with the Laplacian L1 model. Due to the heavy-tailed characteristics of the L1 distribution, the resulting model is more robust against data outliers. In this paper, we generalize L1-PCA to a mixture of L1 distributions so that the model can handle data with multiple clusters. For model learning we exploit the property that the L1 density can be expanded as a superposition of an infinite number of Gaussian densities, which yields tractable Bayesian learning and inference based on a variational EM-type algorithm.
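
The expansion the abstract refers to is the standard Gaussian scale-mixture representation of the Laplacian: if a variance v is drawn from an exponential density with mean 2b² and x is then drawn from N(0, v), the marginal of x is Laplace(0, b). The sketch below (illustrative only, not the authors' code; variable names and the Kolmogorov-Smirnov check are assumptions) verifies this numerically with NumPy/SciPy.

```python
# Minimal numerical check of the Gaussian scale-mixture representation of the
# Laplacian (L1) density: v ~ Exponential(mean 2*b**2), x | v ~ N(0, v)
# implies x ~ Laplace(0, b).  Illustrative sketch, not the paper's algorithm.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
b = 1.5            # Laplace scale parameter
n = 200_000        # number of Monte Carlo samples

v = rng.exponential(scale=2 * b**2, size=n)   # latent mixing variances
x = rng.normal(loc=0.0, scale=np.sqrt(v))     # Gaussian draws with random variance

# The mixture samples should be indistinguishable from Laplace(0, b).
ks = stats.kstest(x, stats.laplace(loc=0.0, scale=b).cdf)
print(f"KS statistic: {ks.statistic:.4f}, p-value: {ks.pvalue:.3f}")
```

In the variational EM-type algorithm described in the paper, these latent variances (together with the mixture-component assignments) are treated as hidden variables and integrated out approximately rather than sampled; the sketch only illustrates the underlying expansion that makes that treatment tractable.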