Most existing analysis methods for tensors (or multiway arrays) assume only that the tensor to be completed is of low rank. When applied to tensor completion problems, however, their prediction accuracy degrades significantly when only a limited number of entries are observed. In this paper, we propose using relationships among data as auxiliary information, in addition to the low-rank assumption, to improve the quality of tensor decomposition. We introduce two regularization approaches that use graph Laplacians induced from these relationships, and we design iterative algorithms to compute approximate solutions. Numerical experiments on tensor completion with synthetic and benchmark datasets show that using auxiliary information improves completion accuracy over existing methods based solely on the low-rank assumption, especially when observations are sparse.
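To illustrate the idea of graph-Laplacian regularization for completion, the following is a minimal sketch for the two-way (matrix) special case, not the paper's actual algorithm: a low-rank factorization X ≈ UVᵀ is fitted to the observed entries by gradient descent, with an added penalty tr(Uᵀ L U) that encourages rows related in an auxiliary graph to have similar latent factors. All function and variable names here are illustrative choices, and the step size, rank, and regularization weight are arbitrary.

```python
import numpy as np

def laplacian_from_adjacency(A):
    """Unnormalized graph Laplacian L = D - A of a symmetric adjacency matrix A."""
    return np.diag(A.sum(axis=1)) - A

def complete_with_graph_reg(X, mask, A_row, rank=2, lam=0.1, lr=0.01,
                            iters=5000, seed=0):
    """Fit a rank-`rank` factorization X ~ U @ V.T to the observed entries
    (mask == 1), adding the smoothness penalty lam * tr(U.T @ L @ U) so
    that rows connected in the auxiliary graph A_row get similar factors.
    Returns the completed matrix U @ V.T (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((m, rank))
    L = laplacian_from_adjacency(A_row)
    for _ in range(iters):
        R = mask * (U @ V.T - X)            # residual on observed entries only
        U -= lr * (R @ V + lam * (L @ U))   # data-fit gradient + graph smoothness
        V -= lr * (R.T @ U)                 # data-fit gradient for V
    return U @ V.T
```

For example, if two rows are known to be related (an edge in `A_row`) and one of them has a missing entry, the penalty pulls their factor vectors together, so the observed row helps predict the missing value even when observations are sparse.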