The problem of large-scale online matrix completion is addressed via a Bayesian approach. The proposed method learns a factor analysis (FA) model for large matrices from a small number of observed matrix elements, and leverages the statistical model to actively select which new matrix entries would be most informative, if acquired, for improving the model; both model inference and active learning are performed online. In the online setting, a greedy, fast, and provably near-optimal algorithm sequentially maximizes the mutual information between past and future observations, taking advantage of submodularity properties. Additionally, a simpler procedure, which directly uses the posterior parameters learned by the Bayesian approach, is shown to achieve only slightly lower estimation quality with far less computational effort. Inference is performed using a computationally efficient online variational Bayes (VB) procedure. Competitive results are obtained on a very large collaborative filtering problem, the Yahoo! Music ratings dataset.
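To make the "simpler procedure" concrete, the sketch below scores each unobserved entry of a low-rank matrix by its approximate predictive variance under a Gaussian factor posterior and queries the entry with the largest score. All names, dimensions, and the diagonal-covariance assumption here are illustrative; this is a minimal toy analogue, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: rank-k factor model X ~ U @ V.T + noise.
n_rows, n_cols, k = 20, 30, 3
U_mean = rng.normal(size=(n_rows, k))          # posterior means of row factors
V_mean = rng.normal(size=(n_cols, k))          # posterior means of column factors
V_var = np.abs(rng.normal(size=(n_cols, k)))   # assumed diagonal posterior variances of V

# Randomly mark a small set of entries as already observed.
observed = np.zeros((n_rows, n_cols), dtype=bool)
observed[rng.integers(0, n_rows, 50), rng.integers(0, n_cols, 50)] = True

# Variance-based active selection: score entry (i, j) by
# E[u_i]^2 . Var[v_j] (entrywise product summed over the k factors),
# an approximation to the predictive variance of the missing entry.
pred_var = (U_mean**2) @ V_var.T
pred_var[observed] = -np.inf                   # never re-query observed entries
i, j = np.unravel_index(np.argmax(pred_var), pred_var.shape)
# (i, j) is the next entry to acquire; after observing it, the factor
# posterior would be updated online and the scores recomputed.
```

In the full method described above, this heuristic is replaced by a greedy maximization of mutual information between past and future observations, which submodularity makes provably near-optimal; the variance score trades some estimation quality for much lower cost per selection.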