Stability and Instance Optimality for Gaussian Measurements in Compressed Sensing
Foundations of Computational Mathematics
Compressed sensing is a new scheme which shows the ability to recover sparse signals from few measurements using $\ell_1$ minimization. Recently, Chartrand and Staneva (Inverse Problems 24:1--14, 2009) showed that $\ell_p$ minimization with $0<p<1$ requires fewer measurements than $\ell_1$ minimization. They proved that $\ell_p$ minimization with $0<p<1$ recovers $S$-sparse signals $x\in\mathbb{R}^N$ from fewer Gaussian random measurements for some smaller $p$ with probability exceeding $$ 1 - 1 \Bigg/ {N\choose S}. $$ The first aim of this paper is to show that the above result remains true for Gaussian random measurements with probability exceeding $1-2e^{-c(p)M}$, where $M$ is the number of rows of the Gaussian random measurement matrix and $c(p)$ is a positive constant that guarantees $1-2e^{-c(p)M} > 1 - 1/{N\choose S}$ for $p$ small enough. The second purpose of the paper is to show that, under certain weaker conditions, the decoders $\Delta_p$ are stable in the sense that they are $(2,p)$ instance optimal for a large class of encoders for $0<p<1$.
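For concreteness, the decoder $\Delta_p$ and the $(2,p)$ instance-optimality property mentioned above can be written in the standard form used in the compressed-sensing literature (the constant $C$ and the best $S$-term approximation error $\sigma_S(x)_p$ below are generic quantities of that formulation, not values taken from this paper):

```latex
% The l_p decoder associated with an encoder (measurement matrix) \Phi:
\Delta_p(y) \;=\; \operatorname*{arg\,min}_{z\in\mathbb{R}^N}\ \|z\|_p^p
\quad\text{subject to}\quad \Phi z = y .

% (2,p) instance optimality of the pair (\Phi, \Delta_p):
\|x - \Delta_p(\Phi x)\|_2 \;\le\; C\,\frac{\sigma_S(x)_p}{S^{1/p-1/2}},
\qquad
\sigma_S(x)_p \;=\; \min_{\#\operatorname{supp}(z)\le S}\ \|x - z\|_p .
```

In words: the reconstruction error in $\ell_2$ is controlled, for every input $x$, by how well $x$ can be approximated by $S$-sparse vectors in the $\ell_p$ quasi-norm.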
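As a sketch of what $\ell_p$ minimization with Gaussian measurements looks like in practice, the following is an iteratively reweighted least squares (IRLS) approximation of the decoder $\Delta_p$, in the spirit of the well-known Chartrand--Yin scheme. This is not the paper's own algorithm; the problem sizes, $p$, and the smoothing schedule `eps` are illustrative choices.

```python
# IRLS sketch for  Delta_p(y) = argmin ||z||_p^p  s.t.  A z = y,
# with a Gaussian random measurement matrix A. Illustrative only:
# sizes, p, and the eps schedule are assumptions, not from the paper.
import numpy as np

def irls_lp(A, y, p=0.5, n_outer=9, n_inner=30):
    """Approximate the l_p decoder via iteratively reweighted least squares."""
    M, N = A.shape
    # start from the minimum-l2-norm solution of A z = y
    x = A.T @ np.linalg.solve(A @ A.T, y)
    eps = 1.0
    for _ in range(n_outer):            # gradually tighten the smoothing eps
        for _ in range(n_inner):
            # weights w_i = (x_i^2 + eps)^(p/2 - 1); d_i = 1 / w_i
            d = (x**2 + eps) ** (1.0 - p / 2.0)
            # weighted least-norm step: x = D A^T (A D A^T)^{-1} y
            x = d * (A.T @ np.linalg.solve(A @ (d[:, None] * A.T), y))
        eps /= 10.0
    return x

rng = np.random.default_rng(0)
N, M, S = 64, 32, 4                           # ambient dim, measurements, sparsity
A = rng.standard_normal((M, N)) / np.sqrt(M)  # Gaussian measurement matrix
x0 = np.zeros(N)
support = rng.choice(N, S, replace=False)
x0[support] = rng.standard_normal(S)          # S-sparse ground truth
y = A @ x0

x_hat = irls_lp(A, y, p=0.5)
print(np.linalg.norm(x_hat - x0))             # small when recovery succeeds
```

Each IRLS iterate satisfies $Az = y$ exactly by construction; only the weights change between steps, pushing mass onto a sparse support as `eps` shrinks.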