In modelling nonstationary sources, one possible strategy is to define a latent process of strictly positive variables to model variations in the second-order statistics of the underlying process. This can be achieved, for example, by passing a Gaussian process through a positive nonlinearity, or by defining a discrete-state Markov chain in which each state encodes a certain regime. However, models with such constructs turn out to be either inflexible or non-conjugate, making inference harder. In this paper, we introduce a conjugate (inverse-) gamma Markov random field model that allows random fluctuations in the variances, which is useful as a prior for nonstationary time-frequency energy distributions. The main idea is to introduce auxiliary variables such that the full conditional distributions and sufficient statistics are available in closed form. This allows straightforward implementation of a Gibbs sampler or a variational algorithm. We illustrate our approach on denoising and single-channel source separation.