In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix, and the noise parameters, and is flexible with respect to the choice of source prior and the numbers of sources and sensors. At the core of the method are two mean-field techniques, the variational Bayes and the expectation consistent frameworks, and the cost functions associated with these methods are optimized using the adaptive overrelaxed expectation maximization (EM) algorithm and the easy gradient recipe. The entire framework, implemented in a Matlab toolbox, is demonstrated for non-negative decompositions and compared with non-negative matrix factorization.
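The adaptive overrelaxed EM scheme mentioned above can be illustrated in isolation: each iteration takes the ordinary EM update, extrapolates beyond it by a step size that grows while the objective keeps improving, and falls back to the plain EM step otherwise. The following is a minimal sketch in Python rather than Matlab, applied to a toy two-component Gaussian mixture (a hypothetical stand-in model; the variable names, the growth factor of 1.5, and the mixture example are all assumptions for illustration, not the toolbox's actual implementation).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data from a two-component Gaussian mixture (hypothetical example).
x = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(3.0, 1.0, 500)])

def log_likelihood(x, mu, sigma=1.0):
    # Log-likelihood under an equal-weight two-component mixture.
    p = (0.5 * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2)
         / (sigma * np.sqrt(2 * np.pi)))
    return np.log(p.sum(axis=1)).sum()

def em_update(x, mu, sigma=1.0):
    # E-step: responsibilities; M-step: re-estimate the component means.
    logp = -0.5 * ((x[:, None] - mu) / sigma) ** 2
    r = np.exp(logp - logp.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)
    return (r * x[:, None]).sum(axis=0) / r.sum(axis=0)

mu = np.array([-1.0, 1.0])   # initial guess for the two means
eta = 1.0                    # overrelaxation step size, adapted on the fly
ll = log_likelihood(x, mu)
for _ in range(100):
    mu_em = em_update(x, mu)
    mu_try = mu + eta * (mu_em - mu)   # extrapolated (overrelaxed) step
    ll_try = log_likelihood(x, mu_try)
    if ll_try > ll:
        # Accept the larger step and grow the step size.
        mu, ll, eta = mu_try, ll_try, eta * 1.5
    else:
        # Reject: take the guaranteed plain EM step and reset the step size.
        mu = mu_em
        ll = log_likelihood(x, mu)
        eta = 1.0

print(mu)  # means recovered near -2 and 3
```

Because the plain EM step is always available as a fallback, the adaptive scheme never loses EM's monotone improvement of the objective while often converging in far fewer iterations.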