Sparse Super Symmetric Tensor Factorization

  • Authors:
  • Andrzej Cichocki, Marko Jankovic, Rafal Zdunek, Shun-Ichi Amari

  • Affiliations:
  • RIKEN Brain Science Institute, Wako-shi, Saitama, Japan (all authors)

  • Venue:
  • Neural Information Processing
  • Year:
  • 2007

Abstract

In this paper we derive and discuss a wide class of algorithms for 3D Super-symmetric Nonnegative Tensor Factorization (SNTF), or nonnegative symmetric PARAFAC, and, as a special case, Symmetric Nonnegative Matrix Factorization (SNMF). These factorizations have many potential applications, including multi-way clustering, feature extraction, multi-sensory or multi-dimensional data analysis, and nonnegative neural sparse coding. The main advantages of the derived algorithms are their relatively low complexity and, for the multiplicative algorithms, the possibility of a straightforward extension to L-order tensor factorization thanks to a symmetry property. We also propose to use a wide class of cost functions, such as the squared Euclidean distance, the Kullback-Leibler I-divergence, the Alpha divergence, and the Beta divergence. Preliminary experimental results confirm the validity and good performance of some of these algorithms, especially when the data have sparse representations.
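
To make the multiplicative-update idea concrete for the symmetric matrix special case (SNMF), the sketch below implements a standard damped multiplicative update for the squared Euclidean cost, approximating a nonnegative symmetric matrix Y by A Aᵀ. The function name, damping factor, and parameters are illustrative assumptions and not the paper's exact algorithm; the tensor algorithms and the other divergences discussed in the paper are not reproduced here.

```python
import numpy as np

def snmf_multiplicative(Y, rank, n_iter=200, beta=0.5, eps=1e-9, seed=0):
    """Approximate a nonnegative symmetric matrix Y by A @ A.T (symmetric NMF).

    Minimal sketch of a damped multiplicative update for the squared
    Euclidean cost; names and the damping constant `beta` are illustrative.
    """
    rng = np.random.default_rng(seed)
    n = Y.shape[0]
    A = rng.random((n, rank))          # nonnegative random initialization
    for _ in range(n_iter):
        numer = Y @ A                  # gradient numerator term
        denom = A @ (A.T @ A) + eps    # gradient denominator term (kept > 0)
        # Elementwise damped multiplicative update; nonnegativity is preserved
        # because every factor on the right-hand side is nonnegative.
        A *= (1.0 - beta) + beta * numer / denom
    return A

# Usage: build a low-rank nonnegative symmetric matrix and recover a factor.
B = np.abs(np.random.default_rng(1).random((30, 4)))
Y = B @ B.T
A = snmf_multiplicative(Y, rank=4)
print(np.linalg.norm(Y - A @ A.T) / np.linalg.norm(Y))  # relative fit error
```

The same multiplicative structure is what makes the extension to L-order super-symmetric tensors straightforward: the update for the single shared factor reuses the symmetry of the data array, with the matrix products replaced by the corresponding tensor contractions.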