We develop a probabilistic framework for multiway analysis of high-dimensional datasets. By exploiting a link between graphical models and tensor factorisation models, we can realise arbitrary tensor factorisation structures; many popular models, such as CP or Tucker with Euclidean error and their non-negative variants with KL error, appear as special cases. Owing to the duality between exponential families and Bregman divergences, we can cast the problem as inference in a model with Gaussian or Poisson components, where tensor factorisation reduces to a parameter estimation problem. We derive the generic form of the update equations for multiplicative updates and alternating least squares. We also propose a straightforward matricisation procedure that converts element-wise equations into matrix form, to ease implementation and parallelisation.
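To illustrate one of the special cases mentioned above (non-negative factorisation with KL error fitted by multiplicative updates), the sketch below implements standard KL-NMF on a matrix, the two-way instance of the framework. It is a minimal illustration, not the paper's algorithm: the function name `nmf_kl`, the initialisation, and the iteration count are my own choices.

```python
import numpy as np

def nmf_kl(V, K, n_iter=500, seed=0, eps=1e-12):
    """Factorise a non-negative matrix V ≈ W @ H by minimising the
    (generalised) KL divergence with multiplicative updates.

    This is the classical Lee-Seung KL-NMF scheme, corresponding to a
    Poisson observation model; illustrative sketch only."""
    rng = np.random.default_rng(seed)
    I, J = V.shape
    # Random positive initialisation (a common, but arbitrary, choice).
    W = rng.random((I, K)) + eps
    H = rng.random((K, J)) + eps
    ones = np.ones_like(V)
    for _ in range(n_iter):
        # Update W, holding H fixed; the eps terms guard against division by zero.
        W *= ((V / (W @ H + eps)) @ H.T) / (ones @ H.T + eps)
        # Update H, holding W fixed.
        H *= (W.T @ (V / (W @ H + eps))) / (W.T @ ones + eps)
    return W, H
```

Because the updates are ratios of non-negative quantities, the factors stay non-negative throughout, and each update is guaranteed not to increase the KL objective.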