A note on sparse least-squares regression

  • Authors:
  • Christos Boutsidis; Malik Magdon-Ismail

  • Affiliations:
  • Mathematical Sciences Department, IBM T.J. Watson Research Center, United States; Computer Science Department, Rensselaer Polytechnic Institute, United States

  • Venue:
  • Information Processing Letters
  • Year:
  • 2014

Abstract

We compute a sparse solution to the classical least-squares problem min_x ||Ax - b||_2, where A is an arbitrary matrix. We describe a novel algorithm for this sparse least-squares problem. The algorithm operates as follows: first, it selects columns of A, and then it solves a least-squares problem restricted to the selected columns. The column selection algorithm we use is known to perform well for the well-studied column subset selection problem; the contribution of this article is to show that it also gives favorable results for sparse least-squares. Specifically, we prove that the solution vector obtained by our algorithm is close to the solution vector obtained via what is known as the ''SVD-truncated regularization approach''.
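
The sketch below illustrates the two-step scheme described in the abstract under simplifying assumptions: it uses column-pivoted QR as a stand-in column selection rule (the paper's actual selection algorithm is not reproduced here) and compares the resulting sparse solution against the SVD-truncated regularized solution. Function names and parameters are illustrative, not from the paper.

```python
import numpy as np
from scipy.linalg import qr


def svd_truncated_solution(A, b, k):
    """Regularized solution via the rank-k truncated SVD of A."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k, :]
    return Vtk.T @ ((Uk.T @ b) / sk)


def sparse_ls_by_column_selection(A, b, r):
    """Select r columns of A, solve least-squares on them, embed back into R^n."""
    # Stand-in selection rule: column-pivoted QR (an assumption for this sketch,
    # not the column selection algorithm analyzed in the paper).
    _, _, piv = qr(A, pivoting=True)
    S = np.sort(piv[:r])
    x_S, *_ = np.linalg.lstsq(A[:, S], b, rcond=None)
    x = np.zeros(A.shape[1])
    x[S] = x_S  # sparse solution: at most r nonzero entries
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n, k = 200, 50, 10
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)

    x_svd = svd_truncated_solution(A, b, k)
    x_sparse = sparse_ls_by_column_selection(A, b, r=2 * k)

    print("||A x_svd - b||_2    =", np.linalg.norm(A @ x_svd - b))
    print("||A x_sparse - b||_2 =", np.linalg.norm(A @ x_sparse - b))
    print("nonzeros in x_sparse =", np.count_nonzero(x_sparse))
```

The sparse solution is supported on the selected columns only, so its sparsity is controlled directly by the number of columns kept, whereas the truncated-SVD solution is generally dense.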