We tackle the general linear instantaneous model (possibly underdetermined and noisy), where we model the source prior with a Student-t distribution. The conjugate-exponential characterisation of the t distribution as an infinite mixture of scaled Gaussians enables efficient inference. We study two well-known inference methods for Bayesian source separation, the Gibbs sampler and variational Bayes. We derive both techniques as local message-passing algorithms to highlight their algorithmic similarities and to contrast their different convergence characteristics and computational requirements. Our simulation results suggest that typical posterior distributions in source separation have multiple local maxima. We therefore propose a hybrid approach in which we first explore the state space with a Gibbs sampler and then switch to a deterministic algorithm. This approach appears to combine the speed of the variational approach with the robustness of the Gibbs sampler.
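The scale-mixture representation mentioned above can be sketched numerically: a Student-t variable with $\nu$ degrees of freedom is a zero-mean Gaussian whose precision $\lambda$ is drawn from a Gamma$(\nu/2,\,\mathrm{rate}=\nu/2)$ distribution. The snippet below is a minimal illustration of that equivalence (the choice $\nu = 5$ and the sample size are arbitrary, not taken from the paper); it checks that the mixture draws match the variance $\nu/(\nu-2)$ of direct Student-t draws.

```python
import numpy as np

rng = np.random.default_rng(0)
nu = 5.0       # degrees of freedom of the Student-t prior (illustrative value)
n = 200_000    # number of Monte Carlo samples

# Scale mixture: lambda ~ Gamma(nu/2, rate=nu/2), then s | lambda ~ N(0, 1/lambda).
# NumPy's gamma() takes a *scale* parameter, so rate nu/2 becomes scale 2/nu.
lam = rng.gamma(shape=nu / 2, scale=2.0 / nu, size=n)
s = rng.normal(0.0, 1.0 / np.sqrt(lam))

# Direct Student-t draws for comparison.
t_direct = rng.standard_t(nu, size=n)

# Both sample variances should be close to the theoretical nu / (nu - 2).
print(np.var(s), np.var(t_direct), nu / (nu - 2))
```

This Gaussian-given-precision form is what makes the model conjugate-exponential: conditioned on the latent precisions, the sources are Gaussian, so both the Gibbs sweeps and the variational updates reduce to standard Gaussian/Gamma computations.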