This paper considers the recovery of jointly sparse multichannel signals from incomplete measurements. Several approaches have been developed to recover the unknown sparse vectors from the given observations, including thresholding, simultaneous orthogonal matching pursuit (SOMP), and convex relaxation based on a mixed matrix norm. Typically, a worst-case analysis is carried out to establish conditions under which these algorithms can recover any jointly sparse set of vectors. However, such an approach cannot explain why joint sparse recovery is superior to applying standard sparse reconstruction methods to each channel individually. Previous work performed an average-case analysis of thresholding and SOMP by imposing a probability model on the measured signals. Here, the main focus is the analysis of convex relaxation techniques; in particular, the mixed ℓ2,1 approach to multichannel recovery is investigated. Under a very mild condition on the sparsity and on the dictionary characteristics, measured for example by the coherence, it is shown that the probability of recovery failure decays exponentially in the number of channels. This demonstrates that, most of the time, multichannel sparse recovery is indeed superior to single-channel methods. The probability bounds are valid and meaningful even for a small number of signals. Finally, the tools developed to analyze the convex relaxation technique are also used to tighten previous recovery bounds for thresholding and SOMP.
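The mixed ℓ2,1 program discussed above seeks a row-sparse coefficient matrix X minimizing the sum of the ℓ2 norms of its rows subject to fitting the multichannel measurements Y = AX. The following minimal sketch solves a regularized version of this program with proximal gradient descent (ISTA) using row-wise group soft thresholding; it is an illustrative standard solver, not the paper's own implementation, and all function names, parameter values, and the toy dictionary are assumptions chosen for demonstration.

```python
import numpy as np

def mmv_l21_ista(A, Y, lam=0.01, iters=2000):
    """Illustrative solver (not from the paper) for the regularized
    mixed-norm MMV problem:
        min_X 0.5 * ||A X - Y||_F^2 + lam * sum_i ||X[i, :]||_2
    via proximal gradient descent with a row-wise shrinkage prox."""
    n = A.shape[1]
    L = Y.shape[1]
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the gradient
    X = np.zeros((n, L))
    for _ in range(iters):
        G = A.T @ (A @ X - Y)              # gradient of the data-fit term
        Z = X - step * G                   # gradient step
        norms = np.linalg.norm(Z, axis=1, keepdims=True)
        scale = np.maximum(1.0 - step * lam / np.maximum(norms, 1e-12), 0.0)
        X = scale * Z                      # prox of the l2,1 norm: shrink whole rows
    return X

# Toy jointly 2-sparse example with 3 channels sharing one support.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 40)) / np.sqrt(20)   # assumed random dictionary
X_true = np.zeros((40, 3))
X_true[[5, 17], :] = rng.standard_normal((2, 3))  # shared support {5, 17}
Y = A @ X_true
X_hat = mmv_l21_ista(A, Y)
support = np.argsort(-np.linalg.norm(X_hat, axis=1))[:2]
```

Because all channels share one support, the row-wise shrinkage pools evidence across channels, which is exactly the mechanism behind the multichannel gain the abstract describes.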