The Johnson-Lindenstrauss Lemma shows that any n points in Euclidean space (with distances measured by the \ell_2 norm) may be mapped down to O((\log n)/\varepsilon^2) dimensions such that no pairwise distance is distorted by more than a (1+\varepsilon) factor. Determining whether such dimension reduction is possible in \ell_1 has been an intriguing open question. We show strong lower bounds for general dimension reduction in \ell_1. We give an explicit family of n points in \ell_1 such that any embedding with distortion \delta requires n^{\Omega(1/\delta^2)} dimensions. This proves that there is no analog of the Johnson-Lindenstrauss Lemma for \ell_1; in fact, embedding with any constant distortion requires n^{\Omega(1)} dimensions. Further, embedding the points into \ell_1 with (1+\varepsilon) distortion requires n^{\frac{1}{2} - O(\varepsilon \log(1/\varepsilon))} dimensions. Our proof establishes this lower bound for the shortest-path metrics of series-parallel graphs. We make extensive use of linear programming and duality in devising our bounds. We expect that the tools and techniques we develop will be useful for future investigations of embeddings into \ell_1.
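The \ell_2 guarantee that the abstract contrasts against can be sketched with a plain Gaussian random projection; this is only an illustration of the Johnson-Lindenstrauss phenomenon, not the paper's construction, and the constant 8 in the target dimension and all sizes below are arbitrary illustrative choices:

```python
import numpy as np

# Illustrative Johnson-Lindenstrauss projection for l2.
# Target dimension k = O((log n) / eps^2), as quoted in the abstract;
# the constant 8 is an illustrative choice, not from the paper.
rng = np.random.default_rng(0)
n, d, eps = 50, 1000, 0.5
k = int(np.ceil(8 * np.log(n) / eps**2))

X = rng.standard_normal((n, d))               # n points in R^d
R = rng.standard_normal((d, k)) / np.sqrt(k)  # scaled Gaussian projection
Y = X @ R                                     # images in R^k

def pairwise_dists(A):
    """Euclidean (l2) distances between all rows of A."""
    sq = (A * A).sum(axis=1)
    G = sq[:, None] + sq[None, :] - 2 * (A @ A.T)
    return np.sqrt(np.maximum(G, 0.0))

D, Dk = pairwise_dists(X), pairwise_dists(Y)
off = ~np.eye(n, dtype=bool)
ratios = Dk[off] / D[off]
# With high probability every ratio lies inside (1 - eps, 1 + eps),
# i.e. no pairwise distance is distorted by more than a (1 + eps) factor.
print(k, ratios.min(), ratios.max())
```

The paper's lower bound says precisely that no analogous map exists for \ell_1: replacing the \ell_2 distances above with \ell_1 distances, any embedding of a suitably chosen point set with constant distortion must keep n^{\Omega(1)} dimensions.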