We propose a general probabilistic framework for modelling multiway data. Our approach establishes a novel link between graphical representations of probability measures and tensor factorization models, which allows us to design arbitrary tensor factorization models while retaining simplicity. Using an expectation-maximization (EM) approach to maximize the likelihood of exponential dispersion models (EDM), we obtain iterative update equations for the Kullback-Leibler (KL), Euclidean (EU) and Itakura-Saito (IS) costs as special cases. Besides EM, we derive alternative algorithms with multiplicative update rules (MUR) and alternating projections. We also provide algorithms for MAP estimation with conjugate priors. All of the algorithms can be formulated as message passing on a graph where vertices correspond to indices and cliques represent factors of the tensor decomposition.
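For the two-way (matrix) case, the MUR obtained for the KL cost reduce to the well-known Lee-Seung updates for nonnegative matrix factorization. The sketch below illustrates that special case under the model X ≈ W H; the function name, arguments, and stopping rule are illustrative choices, not the paper's API.

```python
import numpy as np

def nmf_kl(X, rank, n_iter=200, eps=1e-12, seed=0):
    """Multiplicative update rules (MUR) for the KL cost, X ~ W @ H.

    A minimal sketch of the classic KL-NMF updates, which the
    framework recovers as a special case for matrix (two-way) data.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # Update H:  H <- H * (W^T (X / WH)) / (W^T 1)
        WH = W @ H + eps
        H *= (W.T @ (X / WH)) / (W.T @ np.ones_like(X) + eps)
        # Update W:  W <- W * ((X / WH) H^T) / (1 H^T)
        WH = W @ H + eps
        W *= ((X / WH) @ H.T) / (np.ones_like(X) @ H.T + eps)
    return W, H
```

Each update is a ratio of two contractions over the shared index; in the multiway setting the analogous contractions are organized by the index graph, with one update per factor (clique).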