A Fast Randomized Algorithm for Orthogonal Projection

  • Authors:
  • E. S. Coakley; V. Rokhlin; M. Tygert

  • Affiliations:
  • edouard.coakley@yale.edu and vladimir.rokhlin@yale.edu; -; tygert@aya.yale.edu

  • Venue:
  • SIAM Journal on Scientific Computing
  • Year:
  • 2011

Abstract

We describe an algorithm that, given any full-rank matrix $A$ having fewer rows than columns, can rapidly compute the orthogonal projection of any vector onto the null space of $A$, as well as the orthogonal projection onto the row space of $A$, provided that both $A$ and its adjoint $A^*$ can be applied rapidly to arbitrary vectors. As an intermediate step, the algorithm solves the overdetermined linear least-squares regression involving $A^*$ and may therefore be used for this purpose as well. In many circumstances, the technique can accelerate interior-point methods for convex optimization, including linear programming (see, for example, Chapter 11 of [S. J. Wright, Primal-Dual Interior-Point Methods, SIAM, Philadelphia, 1997]). The basis of the algorithm is an obvious but numerically unstable scheme (typically known as the method of normal equations); suitable use of a preconditioner yields numerical stability. We generate the preconditioner rapidly via a randomized procedure that succeeds with extremely high probability. We provide numerical examples demonstrating the superior accuracy of the randomized method over direct use of the normal equations.
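To make the abstract's description concrete, the sketch below (Python/NumPy) contrasts the direct normal-equations projection with a generically preconditioned variant. The function names, the Gaussian sketching matrix, and the oversampling parameter are illustrative assumptions, not the paper's exact construction; they only indicate how a randomized preconditioner can stabilize the otherwise unstable scheme.

```python
# Minimal sketch (illustrative assumptions, not the authors' exact algorithm):
# A is a full-rank m x n matrix with m < n; we project a vector b onto the
# row space and null space of A via the normal equations, with an optional
# randomized preconditioner (Gaussian sketch + QR) for numerical stability.
import numpy as np

def project_normal_equations(A, b):
    """P_row(b) = A* (A A*)^{-1} A b and P_null(b) = b - P_row(b).

    This is the "obvious but numerically unstable" scheme: accuracy degrades
    when A is ill-conditioned, since A A* squares the condition number.
    """
    G = A @ A.conj().T                      # m x m Gram matrix (normal equations)
    y = np.linalg.solve(G, A @ b)
    p_row = A.conj().T @ y                  # projection onto the row space of A
    return p_row, b - p_row                 # second entry: projection onto the null space

def project_preconditioned(A, b, oversample=10, seed=None):
    """Same projections, computed through a randomized preconditioner.

    Sketch A* with a Gaussian test matrix, QR-factor the sketch to obtain R,
    and work with A_pre = R^{-*} A, which is well conditioned with high
    probability; the exact projections are unchanged by this substitution.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape                          # assumes m < n and full row rank
    k = m + oversample
    Omega = rng.standard_normal((k, n))     # random sketching matrix (assumption)
    Y = Omega @ A.conj().T                  # k x m sketch of A*
    _, R = np.linalg.qr(Y, mode="reduced")  # R captures the conditioning of A*
    A_pre = np.linalg.solve(R.conj().T, A)  # A_pre = R^{-*} A
    G = A_pre @ A_pre.conj().T              # now close to the identity
    y = np.linalg.solve(G, A_pre @ b)
    p_row = A_pre.conj().T @ y              # equals A* (A A*)^{-1} A b exactly
    return p_row, b - p_row

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A mildly ill-conditioned test matrix with 50 rows and 400 columns.
    A = rng.standard_normal((50, 400)) @ np.diag(np.logspace(0, -6, 400))
    b = rng.standard_normal(400)
    p_row, p_null = project_preconditioned(A, b, seed=1)
    print(np.linalg.norm(A @ p_null))       # small: p_null lies (nearly) in the null space of A
    print(np.linalg.norm(p_row + p_null - b))  # ~0: the two projections sum to b
```

The key identity behind the preconditioned variant is that replacing A by R^{-*} A leaves A* (A A*)^{-1} A unchanged, so the projections are exact in exact arithmetic while the Gram matrix actually inverted is nearly the identity.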