SIAM Journal on Computing
Fast algorithms for discrete polynomial transforms
Mathematics of Computation
Atomic Decomposition by Basis Pursuit
SIAM Journal on Scientific Computing
Fast algorithms for spherical harmonic expansions, II
Journal of Computational Physics
New bounds for restricted isometry constants
IEEE Transactions on Information Theory
On the complexity of Mumford-Shah-type regularization, viewed as a relaxed sparsity constraint
IEEE Transactions on Image Processing
The Gelfand widths of ℓp-balls for 0 &lt; p ≤ 1
Journal of Complexity
A non-adapted sparse approximation of PDEs with stochastic inputs
Journal of Computational Physics
Dictionary Preconditioning for Greedy Algorithms
IEEE Transactions on Signal Processing
IEEE Transactions on Information Theory
IEEE Transactions on Information Theory
Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
IEEE Transactions on Information Theory
Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
IEEE Transactions on Information Theory
Compressed Sensing and Redundant Dictionaries
IEEE Transactions on Information Theory
Representation of sparse Legendre expansions
Journal of Symbolic Computation
We consider the problem of recovering polynomials that are sparse with respect to the basis of Legendre polynomials from a small number of random samples. In particular, we show that a Legendre s-sparse polynomial of maximal degree N can be recovered from m ≳ s log^4(N) random samples that are chosen independently according to the Chebyshev probability measure dν(x) = π^(-1) (1 - x^2)^(-1/2) dx. As an efficient recovery method, ℓ1-minimization can be used. We establish these results by verifying the restricted isometry property of a preconditioned random Legendre matrix. We then extend these results to a large class of orthogonal polynomial systems, including the Jacobi polynomials, of which the Legendre polynomials are a special case. Finally, we transpose these results into the setting of approximate recovery for functions in certain infinite-dimensional function spaces.
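The recovery scheme the abstract describes can be sketched numerically: draw sample points from the Chebyshev measure, form the random Legendre matrix, precondition its rows with the weight w(x) = (π/2)^(1/2) (1 - x^2)^(1/4), and solve the ℓ1-minimization (basis pursuit) problem. The sketch below is an illustration under assumptions, not the authors' code; the specific dimensions (N = 60, s = 5, m = 40), the random seed, and the linear-programming formulation of basis pursuit via SciPy's `linprog` are all choices made here for demonstration.

```python
import numpy as np
from numpy.polynomial import legendre
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, s, m = 60, 5, 40  # max degree + 1, sparsity, number of samples (illustrative)

# A Legendre s-sparse coefficient vector with random support and signs.
c_true = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
c_true[support] = rng.standard_normal(s)

# Sample points from the Chebyshev measure dν(x) = π^(-1)(1 - x^2)^(-1/2) dx:
# if u ~ Uniform(0, π), then x = cos(u) has exactly this distribution.
x = np.cos(rng.uniform(0.0, np.pi, size=m))

# Random Legendre matrix A[j, n] = L_n(x_j), with L_n normalized to be
# orthonormal with respect to the uniform probability measure dx/2 on [-1, 1].
A = legendre.legvander(x, N - 1) * np.sqrt(2 * np.arange(N) + 1)
y = A @ c_true  # the m observed samples of the sparse polynomial

# Row preconditioner w(x) = (π/2)^(1/2) (1 - x^2)^(1/4); the preconditioned
# matrix w(x_j) * A is the one whose restricted isometry property is verified.
w = np.sqrt(np.pi / 2) * (1.0 - x**2) ** 0.25
WA = w[:, None] * A
Wy = w * y

# Basis pursuit: min ||c||_1 subject to WA c = Wy, written as a linear
# program by splitting c = c_plus - c_minus with c_plus, c_minus >= 0.
res = linprog(
    c=np.ones(2 * N),
    A_eq=np.hstack([WA, -WA]),
    b_eq=Wy,
    bounds=[(0, None)] * (2 * N),
)
c_rec = res.x[:N] - res.x[N:]

print(np.max(np.abs(c_rec - c_true)))  # recovery error
```

With m well above the sparsity s (here m = 8s), exact recovery is the typical outcome; the printed error is then at the level of the LP solver's numerical precision. The diagonal preconditioning step is what allows the Chebyshev-sampled Legendre system to satisfy the restricted isometry property, since the raw Legendre polynomials grow near the endpoints ±1.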