On distance scales, embeddings, and efficient relaxations of the cut cone

  • Authors: James R. Lee
  • Affiliations: University of California, Berkeley
  • Venue: SODA '05: Proceedings of the sixteenth annual ACM-SIAM symposium on Discrete algorithms
  • Year: 2005

Abstract

A central open problem in the field of finite metric spaces is to find an efficient relaxation of the cut cone: the collection of positive linear combinations of cut pseudo-metrics on a finite set. In particular, it has been asked how well squared-Euclidean metrics (the so-called metrics of "negative type") embed into L1, and it is known that the answer to this question coincides with the integrality gap of a folklore semi-definite relaxation for computing the Sparsest Cut of a graph.

Bourgain's classical embedding theorem implies that any n-point metric space embeds into L2 with O(log n) distortion. We give the first embeddings for metrics of negative type which beat Bourgain's bound. Specifically, we show that for every ε > 0, there exists a δ > 0 such that every n-point metric of negative type embeds into L_{2+ε} with distortion O((log n)^(1-δ)). We also exhibit the first o(log n) bounds on the Euclidean distortion of finite subsets of Lp for 1 < p < 2. These spaces interpolate between L1 and L2, and thus provide a necessary first step in resolving the long-standing open question on the Euclidean distortion of finite subsets of L1.

In proving these results, we introduce a number of new techniques for the construction of low-distortion embeddings. These include a generic Gluing Lemma which avoids the overhead that typically arises from the naïve concatenation of different scales, and which provides new insights into the cut structure of finite graphs. We also demonstrate the utility of Lipschitz extension theorems from Functional Analysis in the embedding of finite metric spaces. Finally, we prove the "Big Core" Theorem, a significantly improved and quantitatively optimal version of the main structural theorem in [ARV04] about random projections. The latter result yields a simplified hyperplane rounding algorithm for computing an O(√log n)-approximation to the Sparsest Cut problem with uniform demands.
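
For reference, here is a minimal sketch of the standard definitions the abstract relies on; these are textbook formulations, not statements quoted from the paper.

```latex
% Cut pseudo-metric of a subset S of the point set V, and the cut cone on V:
\[
  \delta_S(x,y) = | \mathbf{1}_S(x) - \mathbf{1}_S(y) | , \qquad
  \mathrm{CUT}(V) = \Big\{ \textstyle\sum_{S \subseteq V} \lambda_S \, \delta_S \; : \; \lambda_S \ge 0 \Big\} .
\]
% A metric d on V is of negative type when its square root is Euclidean:
\[
  d(x,y) = \| f(x) - f(y) \|_2^2 \quad \text{for some } f : V \to L_2 .
\]
% A map f from (X,d) to (Y,rho) has distortion at most D when
\[
  d(x,y) \;\le\; \rho\big(f(x), f(y)\big) \;\le\; D \cdot d(x,y) \qquad \text{for all } x, y \in X .
\]
```

The cut cone is exactly the cone of L1-embeddable pseudo-metrics, which is why the question of embedding negative-type metrics into L1 can be phrased as a question about relaxations of the cut cone.

The last sentence of the abstract refers to a hyperplane rounding algorithm for Sparsest Cut with uniform demands. The sketch below only illustrates the generic idea of rounding an SDP vector solution by random projection followed by threshold (sweep) cuts; the names `adj`, `vectors`, and `cut_sparsity` are hypothetical, and the O(√log n) guarantee in the paper depends on the structural "Big Core" theorem, which this sketch does not implement.

```python
import numpy as np

def cut_sparsity(adj, S):
    """Sparsity of the cut between S and its complement, with uniform demands:
    (total weight of edges crossing the cut) / (|S| * |complement of S|)."""
    n = adj.shape[0]
    S = np.asarray(sorted(S))
    T = np.setdiff1d(np.arange(n), S)
    if S.size == 0 or T.size == 0:
        return np.inf
    crossing = adj[np.ix_(S, T)].sum()
    return crossing / (S.size * T.size)

def round_by_random_projection(adj, vectors, trials=50, seed=None):
    """Project the SDP vectors onto random Gaussian directions and sweep all
    threshold cuts along each projected line, keeping the sparsest cut found."""
    rng = np.random.default_rng(seed)
    n, d = vectors.shape
    best_cut, best_val = None, np.inf
    for _ in range(trials):
        g = rng.standard_normal(d)            # random direction drawn from N(0, I_d)
        order = np.argsort(vectors @ g)       # vertices ordered along that direction
        for k in range(1, n):                 # every prefix of the ordering is a candidate cut
            S = order[:k]
            val = cut_sparsity(adj, S)
            if val < best_val:
                best_cut, best_val = set(S.tolist()), val
    return best_cut, best_val
```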