Compressive sensing (CS) is an emerging field that, under appropriate conditions, can significantly reduce the number of measurements required to recover a given signal. In many applications one is interested in multiple signals, each acquired through its own CS-type measurements; each signal then corresponds to a sensing "task". In this paper we propose a novel multi-task compressive sensing framework based on a Bayesian formalism, in which a Dirichlet process (DP) prior is employed, yielding a principled means of simultaneously inferring the appropriate sharing mechanisms among tasks as well as the CS inversion for each task. A variational Bayesian (VB) inference algorithm is employed to estimate the full posterior on the model parameters.
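The measurement setting described above can be illustrated with a minimal sketch: each task observes a sparse signal through far fewer random projections than the signal's length, and CS inversion recovers the signal from those measurements. The sketch below uses a greedy recovery algorithm (orthogonal matching pursuit) as a stand-in for the paper's Bayesian/DP inversion, which is not implemented here; dimensions, the shared-support construction across the two tasks, and the `omp` helper are all illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def omp(Phi, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse signal
    from measurements y = Phi @ theta (a simple stand-in for CS inversion)."""
    n = Phi.shape[1]
    residual = y.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        # Select the dictionary column most correlated with the residual.
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        # Refit by least squares on the current support.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    theta = np.zeros(n)
    theta[support] = coef
    return theta

n, m, k = 128, 64, 5                              # signal length, measurements (m << n), sparsity
Phi = rng.standard_normal((m, n)) / np.sqrt(m)    # random projection matrix

# Two "tasks": sparse signals that share part of their support,
# mimicking the kind of inter-task structure the DP prior would infer.
shared = rng.choice(n, size=3, replace=False)
for task in range(2):
    idx = np.concatenate([shared, rng.choice(n, size=k - 3, replace=False)])
    theta = np.zeros(n)
    theta[idx] = rng.standard_normal(idx.size)
    y = Phi @ theta                               # compressed measurements for this task
    est = omp(Phi, y, k)
    rel_err = np.linalg.norm(est - theta) / np.linalg.norm(theta)
    print(f"task {task}: relative recovery error {rel_err:.2e}")
```

With noiseless measurements and m comfortably above the sparsity level, each task's signal is recovered essentially exactly; the paper's contribution is to couple such per-task inversions through a DP prior so that tasks with similar structure share statistical strength.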