Learning decision trees using the Fourier spectrum
SIAM Journal on Computing
Randomized Interpolation and Approximation of Sparse Polynomials
SIAM Journal on Computing
Selective families, superimposed codes, and broadcasting on unknown radio networks
SODA '01 Proceedings of the twelfth annual ACM-SIAM symposium on Discrete algorithms
Near-optimal sparse Fourier representations via sampling
STOC '02 Proceedings of the thirty-fourth annual ACM symposium on Theory of computing
Fast, small-space algorithms for approximate histogram maintenance
STOC '02 Proceedings of the thirty-fourth annual ACM symposium on Theory of computing
Explicit constructions of selectors and related combinatorial structures, with applications
SODA '02 Proceedings of the thirteenth annual ACM-SIAM symposium on Discrete algorithms
What's hot and what's not: tracking most frequent items dynamically
Proceedings of the twenty-second ACM SIGMOD-SIGACT-SIGART symposium on Principles of database systems
Proving Hard-Core Predicates Using List Decoding
FOCS '03 Proceedings of the 44th Annual IEEE Symposium on Foundations of Computer Science
Extensions of compressed sensing
Signal Processing - Sparse approximations in signal and image processing
One sketch for all: fast algorithms for compressed sensing
Proceedings of the thirty-ninth annual ACM symposium on Theory of computing
Sketching in adversarial environments
STOC '08 Proceedings of the fortieth annual ACM symposium on Theory of computing
Information Processing Letters
On the reconstruction of block-sparse signals with an optimal number of measurements
IEEE Transactions on Signal Processing
Bayesian compressive sensing via belief propagation
IEEE Transactions on Signal Processing
Superselectors: efficient constructions and applications
ESA'10 Proceedings of the 18th annual European conference on Algorithms: Part I
K-median clustering, model-based compressive sensing, and sparse recovery for earth mover distance
Proceedings of the forty-third annual ACM symposium on Theory of computing
Approximate Sparse Recovery: Optimizing Time and Measurements
SIAM Journal on Computing
On the Design of Deterministic Matrices for Fast Recovery of Fourier Compressible Functions
SIAM Journal on Matrix Analysis and Applications
Sketching in Adversarial Environments
SIAM Journal on Computing
Strengthening hash families and compressive sensing
Journal of Discrete Algorithms
A Nonparametric Approach to Modeling Choice with Limited Data
Management Science
Sketching via hashing: from heavy hitters to compressed sensing to sparse Fourier transform
Proceedings of the 32nd symposium on Principles of database systems
In sparse approximation theory, the fundamental problem is to reconstruct a signal A ∈ ℝ^n from linear measurements ⟨A, ψ_i⟩ with respect to a dictionary of ψ_i's. Recently, there has been focus on the novel direction of compressed sensing [9], where the reconstruction can be done with very few (O(k log n)) linear measurements over a modified dictionary if the signal is compressible, that is, if its information is concentrated in k coefficients with the original dictionary. In particular, these results [9, 4, 23] prove that there exists a single O(k log n) × n measurement matrix such that any such signal can be reconstructed from these measurements, with error at most O(1) times the worst-case error for the class of such signals. Compressed sensing has generated tremendous excitement both because of the sophisticated underlying mathematics and because of its potential applications.

In this paper, we address outstanding open problems in compressed sensing. Our main result is an explicit construction of a non-adaptive measurement matrix and the corresponding reconstruction algorithm so that, with a number of measurements polynomial in k, log n, and 1/ε, we can reconstruct compressible signals. This is the first known polynomial-time explicit construction of any such measurement matrix. In addition, our result improves the error guarantee from O(1) to 1 + ε and improves the reconstruction time from poly(n) to poly(k log n).

Our second result is a randomized construction of O(k polylog(n)) measurements that works for each signal with high probability and gives per-instance approximation guarantees rather than guarantees over the class of all signals. Previous work on compressed sensing does not provide such per-instance approximation guarantees; for this case, our result improves the best known number of measurements from prior work in other areas, including learning theory [20, 21], streaming algorithms [11, 12, 6], and complexity theory [1].

Our approach is combinatorial. In particular, we use two parallel sets of group tests, one to filter and the other to certify and estimate; the resulting algorithms are quite simple to implement.
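The abstract does not spell out the construction, but the "hash into groups, then certify and estimate by median" flavor of combinatorial sparse recovery can be illustrated with a minimal Count-Sketch-style example. This is a generic sketch, not the paper's measurement matrix; all names and parameter choices below are ours:

```python
import random
import statistics

def build_sketch(n, buckets, reps, seed=0):
    """Random hash and sign functions: reps independent groupings of the n
    coordinates into buckets. Each rep is one parallel set of group tests."""
    rng = random.Random(seed)
    h = [[rng.randrange(buckets) for _ in range(n)] for _ in range(reps)]
    s = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(reps)]
    return h, s

def measure(signal, h, s, buckets):
    """Linear measurements: each bucket stores a signed sum of the
    coordinates hashed into it (O(buckets * reps) measurements total)."""
    counts = [[0.0] * buckets for _ in range(len(h))]
    for r in range(len(h)):
        for i, x in enumerate(signal):
            counts[r][h[r][i]] += s[r][i] * x
    return counts

def estimate(i, counts, h, s):
    """Certify/estimate coordinate i: take the median over repetitions,
    which filters out the reps where i collided with a large coordinate."""
    return statistics.median(s[r][i] * counts[r][h[r][i]]
                             for r in range(len(h)))

# Usage: recover a 2-sparse signal of length 64.
n, k = 64, 2
signal = [0.0] * n
signal[5], signal[40] = 10.0, -7.0

buckets, reps = 32, 15          # O(k) buckets, O(log n) repetitions
h, s = build_sketch(n, buckets, reps)
counts = measure(signal, h, s, buckets)
est = [estimate(i, counts, h, s) for i in range(n)]
top = sorted(range(n), key=lambda i: abs(est[i]), reverse=True)[:k]
# With high probability, top recovers {5, 40} and the estimates are exact
# for an exactly k-sparse signal.
```

The two roles mirror the abstract's description: the hashing rounds filter the k large coordinates into mostly-isolated buckets, and the median across repetitions certifies and estimates each candidate value.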