Regularized Alternating Least Squares Algorithms for Non-negative Matrix/Tensor Factorization

  • Authors:
  • Andrzej Cichocki; Rafal Zdunek

  • Affiliations:
  • Laboratory for Advanced Brain Signal Processing, RIKEN BSI, Wako-shi, Saitama 351-0198, Japan (both authors)

  • Venue:
  • ISNN '07 Proceedings of the 4th international symposium on Neural Networks: Advances in Neural Networks, Part III
  • Year:
  • 2007


Abstract

Nonnegative Matrix and Tensor Factorization (NMF/NTF) and Sparse Component Analysis (SCA) have found many potential applications, especially in multi-way Blind Source Separation (BSS), multi-dimensional data analysis, model reduction, and sparse signal/image representations. In this paper we propose a family of modified Regularized Alternating Least Squares (RALS) algorithms for NMF/NTF. By incorporating regularization and penalty terms into the weighted Frobenius norm, we are able to achieve sparse and/or smooth representations of the desired solution and to alleviate the problem of getting stuck in local minima. We implemented the RALS algorithms in our NMFLAB/NTFLAB MATLAB toolboxes and compared them with standard NMF algorithms. The proposed algorithms exhibit improved efficiency and convergence properties, especially for large-scale problems.
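The abstract's core idea (alternating least squares with a regularization term added to the Frobenius-norm objective, followed by a nonnegativity projection) can be illustrated with a minimal NumPy sketch. This is not the paper's NMFLAB/NTFLAB implementation; the function name `rals_nmf`, the single Tikhonov weight `alpha`, and the fixed iteration count are illustrative assumptions, and the paper's actual penalty terms and schedules may differ.

```python
import numpy as np

def rals_nmf(Y, rank, alpha=0.01, n_iter=100, seed=0):
    """Sketch of a regularized ALS scheme for NMF: Y ≈ A @ X, A, X >= 0.

    Each half-step minimizes ||Y - A X||_F^2 + alpha * ||factor||_F^2
    in closed form via the regularized normal equations, then projects
    the result onto the nonnegative orthant. `alpha` is an assumed
    Tikhonov weight, not the paper's exact regularization strategy.
    """
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    A = rng.random((m, rank))
    X = rng.random((rank, n))
    I = np.eye(rank)
    for _ in range(n_iter):
        # Update X: solve (A^T A + alpha I) X = A^T Y, then clip to >= 0.
        X = np.linalg.solve(A.T @ A + alpha * I, A.T @ Y)
        X = np.maximum(X, 1e-12)
        # Update A: same regularized least-squares step, transposed.
        A = np.linalg.solve(X @ X.T + alpha * I, X @ Y.T).T
        A = np.maximum(A, 1e-12)
    return A, X
```

The closed-form solve per half-step is what makes ALS-type methods attractive for large-scale problems, and the added `alpha * I` keeps the normal equations well conditioned when a factor becomes nearly rank-deficient.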