Eigenvalues and condition numbers of random matrices
SIAM Journal on Matrix Analysis and Applications
Sparse Approximate Solutions to Linear Systems
SIAM Journal on Computing
Atomic Decomposition by Basis Pursuit
SIAM Review
New Optimization Algorithms in Physics
Phase Transitions in Combinatorial Optimization Problems - Basics, Algorithms and Statistical Mechanics
Uniform Uncertainty Principle and Signal Recovery via Regularized Orthogonal Matching Pursuit
Foundations of Computational Mathematics
Probing the Pareto Frontier for Basis Pursuit Solutions
SIAM Journal on Scientific Computing
Bregman Iterative Algorithms for $\ell_1$-Minimization with Applications to Compressed Sensing
SIAM Journal on Imaging Sciences
Subspace pursuit for compressive sensing signal reconstruction
IEEE Transactions on Information Theory
Generalized Power Method for Sparse Principal Component Analysis
The Journal of Machine Learning Research
Sparse representations in unions of bases
IEEE Transactions on Information Theory
Greed is good: algorithmic results for sparse approximation
IEEE Transactions on Information Theory
Decoding by linear programming
IEEE Transactions on Information Theory
Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
IEEE Transactions on Information Theory
Fast Solution of $\ell_1$-Norm Minimization Problems When the Solution May Be Sparse
IEEE Transactions on Information Theory
Improved Bounds on Restricted Isometry Constants for Gaussian Matrices
SIAM Journal on Matrix Analysis and Applications
Compressed sensing (CS) seeks to recover an unknown vector with $N$ entries from far fewer than $N$ measurements; it posits that the number of CS measurements should be comparable to the information content of the vector, not simply to $N$. CS thus directly combines the task of compression with the task of measurement. Since its introduction in 2004 there have been hundreds of papers on CS, a large fraction of which develop algorithms to recover a signal from its compressed measurements. Because of the paradoxical nature of CS (exact reconstruction from seemingly undersampled measurements), it is crucial for the acceptance of an algorithm that rigorous analyses verify the degree of undersampling the algorithm permits. The restricted isometry property (RIP) has become the dominant tool used for such analyses. We present here an asymmetric form of RIP that gives tighter bounds than the usual symmetric one, and we give the best known bounds on the RIP constants for matrices drawn from the Gaussian ensemble. Our derivations illustrate how the combinatorial nature of CS is controlled. Our quantitative bounds on the RIP allow precise statements of how aggressively a signal can be undersampled, the essential question for practitioners. We also document the extent to which RIP captures the true performance limits of CS, by comparison with approaches from high-dimensional geometry.
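The asymmetric RIP constants described above can be probed empirically. The sketch below (not from the paper; all dimensions, the i.i.d. $N(0, 1/n)$ normalization, and the trial count are illustrative choices) samples random size-$k$ supports of a Gaussian matrix and tracks the extreme eigenvalues of the Gram matrices $A_S^T A_S$. Since exhaustive search over all $\binom{N}{k}$ supports is combinatorial, random sampling yields only lower bounds on the true lower and upper RIP constants.

```python
# Monte Carlo probe of the asymmetric RIP constants of a Gaussian matrix.
# For an n x N matrix A with i.i.d. N(0, 1/n) entries and sparsity k, the
# lower/upper RIP constants bound the extreme eigenvalues of A_S^T A_S over
# all supports S of size k.  Sampling supports at random gives lower bounds
# on both constants.  Parameter choices here are purely illustrative.
import numpy as np

def sample_rip_constants(n, N, k, trials=2000, seed=0):
    rng = np.random.default_rng(seed)
    A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, N))
    lo, hi = np.inf, -np.inf
    for _ in range(trials):
        S = rng.choice(N, size=k, replace=False)        # random support
        s = np.linalg.svd(A[:, S], compute_uv=False)    # singular values
        lo = min(lo, s[-1] ** 2)   # smallest eigenvalue of A_S^T A_S
        hi = max(hi, s[0] ** 2)    # largest eigenvalue of A_S^T A_S
    # Over the sampled supports: (1 - d_min) <= eigs <= (1 + d_max),
    # so these are Monte Carlo lower bounds on the two RIP constants.
    return 1.0 - lo, hi - 1.0

d_min, d_max = sample_rip_constants(n=200, N=400, k=10)
print(f"lower RIP constant >= {d_min:.3f}, upper RIP constant >= {d_max:.3f}")
```

The two estimates are typically unequal, which is the motivation for the asymmetric formulation: the largest eigenvalue of a Wishart-type matrix deviates from 1 more than the smallest one does, so a single symmetric constant is needlessly loose on one side.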