The Johnson-Lindenstrauss Lemma and the sphericity of some graphs
Journal of Combinatorial Theory, Series A
Database-friendly random projections: Johnson-Lindenstrauss with binary coins
Journal of Computer and System Sciences - Special issue on PODS 2001
Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform
Proceedings of the thirty-eighth annual ACM symposium on Theory of computing
Improved Approximation Algorithms for Large Matrices via Random Projections
FOCS '06 Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science
On variants of the Johnson–Lindenstrauss lemma
Random Structures & Algorithms
Dense Fast Random Projections and Lean Walsh Transforms
APPROX '08 / RANDOM '08: Proceedings of the 11th International Workshop, APPROX 2008, and 12th International Workshop, RANDOM 2008, on Approximation, Randomization and Combinatorial Optimization: Algorithms and Techniques
Communications of the ACM
Fast Dimension Reduction Using Rademacher Series on Dual BCH Codes
Discrete & Computational Geometry
A sparse Johnson-Lindenstrauss transform
Proceedings of the forty-second ACM symposium on Theory of computing
IEEE Transactions on Information Theory
Subspace embeddings for the L1-norm with applications
Proceedings of the forty-third annual ACM symposium on Theory of computing
Sparser Johnson-Lindenstrauss transforms
Proceedings of the twenty-third annual ACM-SIAM symposium on Discrete Algorithms
Randomized Algorithms for Matrices and Data
Foundations and Trends® in Machine Learning
Fast matrix rank algorithms and applications
STOC '12 Proceedings of the forty-fourth annual ACM symposium on Theory of computing
Low-rank Matrix Recovery via Iteratively Reweighted Least Squares Minimization
SIAM Journal on Optimization
Sketching via hashing: from heavy hitters to compressed sensing to sparse Fourier transform
Proceedings of the 32nd symposium on Principles of database systems
Sparsity lower bounds for dimensionality reducing maps
Proceedings of the forty-fifth annual ACM symposium on Theory of computing
Fast matrix rank algorithms and applications
Journal of the ACM (JACM)
The problems of random projection and sparse reconstruction have much in common and have individually received much attention. Surprisingly, until now they progressed in parallel and remained mostly separate. Here, we employ new tools from probability in Banach spaces, used successfully in the context of sparse reconstruction, to advance on an open problem in random projection. In particular, we generalize and use an intricate result by Rudelson and Vershynin for sparse reconstruction, which applies Dudley's theorem for bounding Gaussian processes. Our main result states that any set of N = exp(Õ(n)) real vectors in n-dimensional space can be linearly mapped to a space of dimension k = O(log N polylog(n)), while (1) preserving the pairwise distances among the vectors to within any constant distortion and (2) allowing the transformation to be applied in time O(n log n) per vector. This improves on the best previously known bounds of N = exp(Õ(n^{1/2})), achieved by Ailon and Liberty, and N = exp(Õ(n^{1/3})), by Ailon and Chazelle. The dependence on the distortion constant, however, is believed to be suboptimal and is subject to further investigation. For constant distortion, this settles the open question posed by these authors up to a polylog(n) factor while considerably simplifying their constructions.
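The distance-preservation guarantee above can be illustrated with a minimal sketch. The example below uses the classic dense Gaussian projection with target dimension k = O(log N / eps^2) — not the paper's fast O(n log n) construction, just the baseline it improves on — and checks that all pairwise squared distances survive to within a modest distortion. The point count, dimensions, and eps value are illustrative choices, not taken from the paper.

```python
import numpy as np

# Dense Gaussian Johnson-Lindenstrauss sketch (baseline, O(nk) per vector;
# the paper's fast transform achieves O(n log n) per vector instead).
rng = np.random.default_rng(0)

n, N = 1000, 50                              # ambient dimension, number of points
eps = 0.5                                    # target distortion (illustrative)
k = int(np.ceil(8 * np.log(N) / eps ** 2))   # k = O(log N / eps^2), independent of n

X = rng.standard_normal((N, n))              # arbitrary point set

# Projection matrix with i.i.d. N(0, 1/k) entries.
A = rng.standard_normal((k, n)) / np.sqrt(k)
Y = X @ A.T

def pairwise_sq(Z):
    """Squared Euclidean distances between all rows of Z."""
    g = Z @ Z.T
    d = np.diag(g)
    return d[:, None] + d[None, :] - 2.0 * g

# Ratio of projected to original squared distances, over all pairs.
iu = np.triu_indices(N, k=1)
ratios = pairwise_sq(Y)[iu] / pairwise_sq(X)[iu]
print(ratios.min(), ratios.max())  # concentrated around 1
```

Note that k depends only on N and eps, not on the ambient dimension n; that independence is what makes the lemma useful for dimension reduction.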