This paper addresses the problem of identifying a lower-dimensional space in which observed data can be sparsely represented. This undercomplete dictionary learning task can be formulated as a blind separation problem of sparse sources linearly mixed by an unknown orthogonal mixing matrix. The problem is cast in a Bayesian framework. First, the unknown sparse sources are modeled as Bernoulli-Gaussian processes: to promote sparsity, a weighted mixture of an atom at zero and a Gaussian distribution is used as the prior distribution for the unobserved sources. A noninformative prior distribution defined on an appropriate Stiefel manifold is chosen for the mixing matrix. Bayesian inference on the unknown parameters is conducted using a Markov chain Monte Carlo (MCMC) method: a partially collapsed Gibbs sampler is designed to generate samples asymptotically distributed according to the joint posterior distribution of the unknown model parameters and hyperparameters. These samples are then used to approximate the joint maximum a posteriori estimator of the sources and the mixing matrix. Simulations conducted on synthetic data illustrate the performance of the method for recovering sparse representations. An application to sparse coding on an undercomplete dictionary is finally investigated.
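The generative model the abstract describes can be sketched in a few lines of NumPy. The sketch below draws Bernoulli-Gaussian sources (each sample is zero with probability 1 - w, Gaussian otherwise), draws a mixing matrix with orthonormal columns — a point on the Stiefel manifold, obtained here via the QR factorization of a Gaussian matrix with a sign correction — and forms noisy observations X = H S + E. All dimensions, the weight w, and the variances are illustrative assumptions, not values from the paper, and this is only the forward model, not the partially collapsed Gibbs sampler itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model dimensions (assumed, not from the paper).
n_sources, n_obs, n_samples = 4, 8, 200

# Bernoulli-Gaussian prior: each source coefficient is an atom at zero
# with probability 1 - w, and N(0, sigma2) with probability w.
w, sigma2 = 0.1, 1.0
support = rng.random((n_sources, n_samples)) < w  # Bernoulli indicators
sources = support * rng.normal(0.0, np.sqrt(sigma2),
                               size=(n_sources, n_samples))

# Mixing matrix with orthonormal columns (a point on the Stiefel
# manifold): QR of a Gaussian matrix, with the signs of R's diagonal
# folded into Q so the draw is uniform on the manifold.
Q, R = np.linalg.qr(rng.normal(size=(n_obs, n_sources)))
H = Q * np.sign(np.diag(R))

# Noisy observations: X = H S + E, with i.i.d. Gaussian noise E.
noise_std = 0.01
X = H @ sources + noise_std * rng.normal(size=(n_obs, n_samples))
```

A sampler targeting the joint posterior would then alternate draws of the support indicators, the nonzero amplitudes, H, and the hyperparameters (w, sigma2, noise variance) given X.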