OSNAP: Faster Numerical Linear Algebra Algorithms via Sparser Subspace Embeddings

  • Authors:
  • Jelani Nelson; Huy L. Nguyên

  • Venue:
  • FOCS '13 Proceedings of the 2013 IEEE 54th Annual Symposium on Foundations of Computer Science
  • Year:
  • 2013

Abstract

An oblivious subspace embedding (OSE), given parameters ε, d, is a distribution D over matrices Π ∈ ℝ^{m×n} such that for any linear subspace W ⊆ ℝ^n with dim(W) = d, Pr_{Π∼D}(∀x ∈ W: ‖Πx‖_2 ∈ (1 ± ε)‖x‖_2) > 2/3. We show that a certain class of distributions, Oblivious Sparse Norm-Approximating Projections (OSNAPs), provides OSEs with m = O(d^{1+γ}/ε²), where every matrix Π in the support of the OSE has only s = O_γ(1/ε) non-zero entries per column, for any desired constant γ > 0. Plugging OSNAPs into known algorithms for approximate least squares regression, ℓ_p regression, low-rank approximation, and approximating leverage scores implies faster algorithms for all these problems. Our main result is essentially a Bai-Yin type theorem in random matrix theory and is likely to be of independent interest: we show that for any fixed U ∈ ℝ^{n×d} with orthonormal columns and random sparse Π, all singular values of ΠU lie in [1−ε, 1+ε] with good probability. This can be seen as a generalization of the sparse Johnson-Lindenstrauss lemma, which was concerned with d = 1. Our methods also recover a slightly sharper version of a main result of [Clarkson-Woodruff, STOC 2013], with a much simpler proof: we show that OSNAPs give an OSE with m = O(d²/ε²), s = 1.
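
As a rough illustration of the statement (not the paper's construction or analysis), the sketch below builds a simple OSNAP-style sparse matrix Π with s non-zero entries of value ±1/√s per column, applies it to a matrix U with orthonormal columns, and checks that the singular values of ΠU land near 1. All parameter choices (n, d, m, s) and the helper name osnap are hypothetical and chosen for demonstration only.

```python
# Minimal sketch, assuming an OSNAP-style construction: each column of Pi has
# s non-zero entries equal to +/- 1/sqrt(s), placed in s distinct random rows.
# This is an illustration of the embedding property, not the paper's algorithm.
import numpy as np

def osnap(n, m, s, rng):
    """Return an m x n matrix with s non-zeros per column, values +/- 1/sqrt(s)."""
    Pi = np.zeros((m, n))
    for j in range(n):
        rows = rng.choice(m, size=s, replace=False)   # s distinct target rows
        signs = rng.choice([-1.0, 1.0], size=s)       # independent random signs
        Pi[rows, j] = signs / np.sqrt(s)
    return Pi

rng = np.random.default_rng(0)
n, d = 2000, 20          # ambient dimension and subspace dimension (hypothetical)
m, s = 400, 4            # sketch size and per-column sparsity (hypothetical)

# Random n x d matrix with orthonormal columns, i.e. a basis for a d-dim subspace.
U, _ = np.linalg.qr(rng.standard_normal((n, d)))

Pi = osnap(n, m, s, rng)
sv = np.linalg.svd(Pi @ U, compute_uv=False)
print("singular values of Pi U lie in [%.3f, %.3f]" % (sv.min(), sv.max()))
```

If the embedding works as the theorem predicts, the printed interval should be contained in [1−ε, 1+ε] for a modest ε; the s = 1 case corresponds to a CountSketch-like matrix and matches the m = O(d²/ε²) regime recovered from [Clarkson-Woodruff, STOC 2013].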