In large and complex graphs of social, chemical/biological, or other relations, frequent substructures are commonly shared across different graphs or across graphs evolving through different time periods. Tensors are natural representations of such time-evolving graph data. A tensor factorization provides a compact low-rank basis for each dimension of the tensor, which facilitates the interpretation of frequent substructures in the original graphs. However, the high computational cost of tensor factorization makes conventional methods infeasible for large graphs that evolve frequently over time. To address this problem, we propose a novel iterative tensor factorization (ITF) method whose time complexity is linear in the cardinality of each dimension of the tensor. Consequently, when tensors represent dynamic graphs, the computational cost of ITF is linear in the size of the graphs (number of edges/vertices) and in the number of time periods over which a graph evolves. More importantly, an error analysis of ITF suggests that its factorization accuracy is comparable to that of the standard factorization method. We empirically evaluate our method on publication networks and chemical compound graphs, and demonstrate that ITF is an order of magnitude faster than the conventional method while preserving factorization quality. To the best of our knowledge, this is the first work to use important frequent substructures to speed up tensor factorization for mining dynamic graphs.
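To make the setting concrete, the sketch below encodes a dynamic graph as a third-order (vertex × vertex × time) tensor and computes a truncated HOSVD, the standard multilinear SVD that serves as the conventional baseline here (this is not the paper's ITF method; the toy edge list, the chosen ranks, and the helper names are illustrative assumptions):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move the given axis to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated HOSVD: per-mode bases from the leading left singular
    vectors of each unfolding, plus the projected core tensor."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        # Contract mode `mode` of the core with the basis U (n x r -> r)
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

# Toy dynamic graph: edges (u, v) observed at time step t
n_vertices, n_steps = 5, 3
edges = [(0, 1, 0), (1, 2, 0), (0, 1, 1), (2, 3, 1), (3, 4, 2)]
T = np.zeros((n_vertices, n_vertices, n_steps))
for u, v, t in edges:
    T[u, v, t] = 1.0

core, (U_src, U_dst, U_time) = hosvd(T, ranks=(2, 2, 2))
```

Each factor matrix is a compact low-rank basis for one tensor dimension: in the abstract's setting, the two vertex-mode bases expose recurring substructure, while the time-mode basis captures how the graph evolves. The cost driver motivating ITF is the SVD of each unfolding, which is what a faster method must avoid recomputing as the graph grows.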