Csiszár’s divergences for non-negative matrix factorization: family of new algorithms

  • Authors:
  • Andrzej Cichocki; Rafal Zdunek; Shun-ichi Amari

  • Affiliations:
  • Laboratory for Advanced Brain Signal Processing; Laboratory for Advanced Brain Signal Processing; Laboratory for Mathematical Neuroscience, RIKEN BSI, Wako-shi, Japan

  • Venue:
  • ICA'06: Proceedings of the 6th International Conference on Independent Component Analysis and Blind Signal Separation
  • Year:
  • 2006

Abstract

In this paper we discuss a wide class of loss (cost) functions for non-negative matrix factorization (NMF) and derive several novel algorithms with improved efficiency and robustness to noise and outliers. We review several approaches that allow us to obtain generalized forms of multiplicative NMF algorithms and unify some existing algorithms. We also give a flexible, relaxed form of the NMF algorithms to increase convergence speed and to impose desired constraints, such as sparsity and smoothness of the components. Moreover, the effects of various regularization terms and constraints are clearly shown. The scope of these results is broad, since the proposed generalized divergence functions include a large number of useful loss functions, such as the squared Euclidean distance, the Kullback-Leibler divergence, and the Itakura-Saito, Hellinger, Pearson's chi-square, and Neyman's chi-square distances. We have successfully applied the developed algorithms to blind (or semi-blind) source separation (BSS), where the sources may in general be statistically dependent, provided they satisfy other conditions or additional constraints such as nonnegativity, sparsity, and/or smoothness.
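For context, the standard textbook form of the Csiszár f-divergence between a nonnegative data matrix Y and its factorization AX is given below, together with the generating functions that recover several of the divergences named in the abstract. This is the conventional definition; the paper's exact parametrization and normalization may differ.

```latex
% Csiszar f-divergence between nonnegative arrays Y and AX,
% with \varphi convex and \varphi(1) = 0:
D_\varphi(\mathbf{Y} \,\|\, \mathbf{A}\mathbf{X})
  = \sum_{i,k} [\mathbf{A}\mathbf{X}]_{ik}\,
    \varphi\!\left(\frac{y_{ik}}{[\mathbf{A}\mathbf{X}]_{ik}}\right)

% Example generating functions and the divergences they recover:
%   \varphi(u) = u \log u - u + 1    -> generalized Kullback-Leibler
%   \varphi(u) = (u - 1)^2           -> Pearson's chi-square
%   \varphi(u) = (1 - u)^2 / u       -> Neyman's chi-square
%   \varphi(u) = (\sqrt{u} - 1)^2    -> squared Hellinger distance
```

As a concrete illustration of the multiplicative-update style the paper generalizes, the following minimal sketch implements the classical Lee-Seung updates that minimize the generalized Kullback-Leibler divergence D(Y || AX), one member of the family above. The function name, parameters, and initialization are illustrative choices, not taken from the paper.

```python
import numpy as np

def nmf_kl(Y, rank, n_iter=200, eps=1e-12, seed=0):
    # Illustrative helper (not the paper's algorithm): classical
    # multiplicative updates minimizing the generalized KL divergence
    # D(Y || AX), one member of the Csiszar family sketched above.
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    A = rng.random((m, rank)) + eps  # nonnegative random init
    X = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # X <- X * [A^T (Y / (AX))] / [A^T 1]
        R = Y / (A @ X + eps)
        X *= (A.T @ R) / (A.sum(axis=0)[:, None] + eps)
        # A <- A * [(Y / (AX)) X^T] / [1 X^T]
        R = Y / (A @ X + eps)
        A *= (R @ X.T) / (X.sum(axis=1)[None, :] + eps)
    return A, X

# Usage on a random nonnegative matrix:
# Y = np.abs(np.random.default_rng(1).normal(size=(50, 40)))
# A, X = nmf_kl(Y, rank=5)
```

Because every factor in the updates is nonnegative, nonnegativity of A and X is preserved automatically at each iteration; swapping the divergence (e.g. to Euclidean or Itakura-Saito) changes the update ratios but keeps this multiplicative structure.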