Proceedings of the 21st ACM international conference on Information and knowledge management
In this paper, we first introduce a tensor-based relational data model and define algebraic operations on it. We note that, whereas in traditional relational algebra the join tends to be the costliest operation, in the tensor-relational framework presented here tensor decomposition becomes the computationally dominant one. We therefore consider the optimization of tensor decomposition operations within a relational algebraic framework. This leads to a highly efficient, effective, and easy-to-parallelize join-by-decomposition approach, together with a corresponding KL-divergence-based optimization strategy. Experimental results show that minimizing KL-divergence within the proposed join-by-decomposition scheme closely approximates the conventional join-then-decompose scheme, without the associated time and space costs.
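The contrast in the abstract can be illustrated with a toy example. The sketch below is illustrative only and is not the paper's actual algorithm: it takes two small non-negative relations sharing a join attribute, materializes the joined tensor once (the join-then-decompose baseline), builds a rank-1 "join-by-decomposition" approximation by factorizing each relation separately and combining the factors, and scores the result with the generalized KL-divergence. All function names, the rank-1 choice, and the ALS update are assumptions made for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def gkl(X, Y, eps=1e-12):
    """Generalized KL-divergence D(X || Y) between non-negative arrays."""
    return float(np.sum(X * np.log((X + eps) / (Y + eps)) - X + Y))

def rank1_nonneg(M, iters=200):
    """Rank-1 factorization M ~ outer(u, v) via alternating least squares.

    For an entrywise-positive M with positive initial vectors, the ALS
    iterates stay non-negative, so this doubles as a tiny non-negative
    rank-1 factorization for the sketch.
    """
    u = rng.random(M.shape[0]) + 0.1
    v = rng.random(M.shape[1]) + 0.1
    for _ in range(iters):
        u = M @ v / (v @ v)
        v = M.T @ u / (u @ u)
    return u, v

# Two relations over a shared attribute b: R(a, b) and S(b, c).
R = rng.random((4, 3))
S = rng.random((3, 5))

# Join-then-decompose baseline: materialize the joined tensor
# T[a, b, c] = R[a, b] * S[b, c] before any decomposition.
T = np.einsum('ab,bc->abc', R, S)

# Join-by-decomposition (sketch): factorize each relation first, then
# combine the per-relation factors along the shared mode b. Since
# R ~ outer(u, v1) and S ~ outer(v2, w), the joined tensor is
# approximated by the rank-1 tensor u x (v1*v2) x w.
u, v1 = rank1_nonneg(R)
v2, w = rank1_nonneg(S)
T_approx = np.einsum('a,b,c->abc', u, v1 * v2, w)

# A small divergence means the cheap combined decomposition tracks the
# expensive materialized join well.
print(gkl(T, T_approx))
```

The point of the sketch is the cost asymmetry the abstract describes: the baseline must materialize and decompose the full joined tensor, while the join-by-decomposition route only ever factorizes the (much smaller) input relations and combines the factors.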