Approximate tensor decomposition within a tensor-relational algebraic framework

  • Authors:
  • Mijung Kim; Kasim Selçuk Candan

  • Affiliations:
  • Arizona State University, Tempe, AZ, USA; Arizona State University, Tempe, AZ, USA

  • Venue:
  • Proceedings of the 20th ACM International Conference on Information and Knowledge Management (CIKM '11)
  • Year:
  • 2011

Abstract

In this paper, we first introduce a tensor-based relational data model and define algebraic operations on this model. We note that, while in traditional relational algebraic systems the join operation tends to be the costliest operation of all, in the tensor-relational framework presented here, tensor decomposition becomes the computationally costliest operation. Therefore, we consider optimization of tensor decomposition operations within a relational algebraic framework. This leads to a highly efficient, effective, and easy-to-parallelize join-by-decomposition approach and a corresponding KL-divergence based optimization strategy. Experimental results provide evidence that minimizing KL-divergence within the proposed join-by-decomposition helps approximate the conventional join-then-decompose scheme well, without the associated time and space costs.
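The abstract does not spell out the algebraic operators, the decomposition family, or how input decompositions are combined, so the following Python sketch is only a toy illustration of the idea, not the authors' implementation. It contrasts a join-then-decompose baseline with a join-by-decomposition-style surrogate that reuses the inputs' factorizations and scores candidates by KL-divergence; the matrices, ranks, the NMF routine, and the factor-combination scheme are all illustrative assumptions.

```python
# Toy contrast of join-then-decompose vs. a join-by-decomposition surrogate.
# Everything here (shapes, ranks, NMF updates, candidate rank pairs) is an
# illustrative assumption, not the paper's actual operators.
import numpy as np

rng = np.random.default_rng(0)
EPS = 1e-9

def nmf_kl(X, rank, iters=200):
    """Multiplicative-update NMF minimizing generalized KL-divergence."""
    W = rng.random((X.shape[0], rank)) + EPS
    H = rng.random((rank, X.shape[1])) + EPS
    for _ in range(iters):
        WH = W @ H + EPS
        W *= ((X / WH) @ H.T) / (H.sum(axis=1) + EPS)
        WH = W @ H + EPS
        H *= (W.T @ (X / WH)) / (W.sum(axis=0)[:, None] + EPS)
    return W, H

def kl(P, Q):
    """Generalized KL-divergence between nonnegative arrays."""
    P, Q = P + EPS, Q + EPS
    return float(np.sum(P * np.log(P / Q) - P + Q))

# Two "relations" encoded as nonnegative matrices sharing join dimension J.
A = rng.random((6, 4))   # shape (I, J)
B = rng.random((4, 5))   # shape (J, K)

# Baseline: materialize the join (here, a product over J), then decompose it.
joined = A @ B
Wj, Hj = nmf_kl(joined, rank=2)
baseline_err = kl(joined, Wj @ Hj)

# Surrogate: decompose the inputs separately, combine their factors over the
# shared dimension, and keep the candidate with the smallest KL-divergence,
# never decomposing the joined result itself.
best = None
for ra, rb in [(1, 1), (2, 2), (3, 3)]:           # candidate rank pairs (assumed)
    Wa, Ha = nmf_kl(A, ra)
    Wb, Hb = nmf_kl(B, rb)
    approx = Wa @ (Ha @ Wb) @ Hb                  # factors of the join from the inputs
    score = kl(joined, approx)
    if best is None or score < best[0]:
        best = (score, ra, rb)

print("join-then-decompose KL error:", baseline_err)
print("best join-by-decomposition candidate (KL, rank_A, rank_B):", best)
```

In this reading, the cost saving comes from never decomposing the (much larger) joined result: only the inputs are factorized, and KL-divergence is used to pick among the cheap candidate combinations.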