Are tensor decomposition solutions unique? On the global convergence of HOSVD and ParaFac algorithms

  • Authors:
  • Dijun Luo;Chris Ding;Heng Huang

  • Affiliations:
  • Department of Computer Science and Engineering, University of Texas, Arlington, Texas (all authors)

  • Venue:
  • PAKDD'11: Proceedings of the 15th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining - Volume Part I
  • Year:
  • 2011


Abstract

Matrix factorizations and tensor decompositions are now widely used in machine learning and data mining. They decompose input matrix and tensor data into matrix factors by optimizing a least-squares objective function with iterative updating algorithms, e.g., HOSVD (Higher Order Singular Value Decomposition) and ParaFac (Parallel Factors). One fundamental question about these algorithms remains open: are the solutions they find globally optimal? Surprisingly, combining theoretical analysis with experimental evidence, we give a positive answer for HOSVD and a negative answer for ParaFac. This intrinsic property of HOSVD assures that, in real-world applications, HOSVD provides repeatable and reliable results.
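As context for the abstract, the standard HOSVD procedure computes the SVD of each mode-n unfolding of the tensor and keeps the leading left singular vectors as that mode's factor matrix; the core tensor is the projection of the data onto those factors. The sketch below is an illustrative NumPy implementation of this generic recipe, not code from the paper; the function names `unfold` and `hosvd` are our own.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front, flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    # Factor matrices: leading left singular vectors of each unfolding.
    factors = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
               for n, r in enumerate(ranks)]
    # Core tensor: project T onto each factor's column space.
    core = T
    for n, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, n)), 0, n)
    return core, factors

# Example: decompose a random 4x5x6 tensor at full multilinear rank.
rng = np.random.default_rng(0)
T = rng.standard_normal((4, 5, 6))
core, factors = hosvd(T, (4, 5, 6))

# Reconstruct by multiplying the core with each factor along its mode.
R = core
for n, U in enumerate(factors):
    R = np.moveaxis(np.tensordot(U, R, axes=(1, n)), 0, n)
print(np.allclose(R, T))  # prints: True (full-rank HOSVD is exact)
```

At full multilinear rank each factor is a square orthogonal matrix, so the reconstruction is exact; truncating `ranks` yields the low-rank approximation whose iterative refinement (e.g., by alternating updates) is the algorithm the abstract analyzes.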