Robust regression and Lasso

  • Authors:
  • Huan Xu; Constantine Caramanis; Shie Mannor

  • Affiliations:
  • Department of Electrical and Computer Engineering, The University of Texas, Austin, TX, and Department of Electrical and Computer Engineering, McGill University, Montréal, Canada; Department of Electrical and Computer Engineering, The University of Texas, Austin, TX; Department of Electrical Engineering, Technion-Israel Institute of Technology, Technion City, Haifa, Israel, and Department of Electrical and Computer Engineering, McGill University, Montréal, ...

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2010

Abstract

Lasso, or ℓ1-regularized least squares, has been explored extensively for its remarkable sparsity properties. In this paper it is shown that the solution to Lasso, in addition to being sparse, has robustness properties: it is the solution to a robust optimization problem. This has two important consequences. First, robustness provides a connection between the regularizer and a physical property, namely protection from noise. This allows a principled selection of the regularizer; in particular, generalizations of Lasso that also yield convex optimization problems are obtained by considering different uncertainty sets. Second, robustness can itself be used as an avenue for exploring different properties of the solution. In particular, it is shown that the robustness of the solution explains why the solution is sparse. The analysis, as well as the specific results obtained, differs from standard sparsity results, providing different geometric intuition. Furthermore, it is shown that the robust optimization formulation is related to kernel density estimation, and based on this approach a proof that Lasso is consistent is given, using robustness directly. Finally, a theorem is proved showing that sparsity and algorithmic stability contradict each other, and hence Lasso is not stable.
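The central equivalence behind these claims can be sketched as follows. The formulation below is a minimal illustration in assumed notation (design matrix A with m feature columns, response y, bounds c_i, uncertainty set U); it conveys the idea rather than reproducing the paper's statement verbatim: least squares made robust to feature-wise disturbances is exactly an ℓ1-penalized problem.

```latex
% Minimal sketch of the robust-regression / Lasso equivalence.
% Notation (A, y, c_i, \mathcal{U}) is assumed for illustration.
% Each column (feature) of A is perturbed independently:
\[
  \mathcal{U} \;\triangleq\; \bigl\{ (\delta_1,\dots,\delta_m) \;:\;
    \|\delta_i\|_2 \le c_i,\ i = 1,\dots,m \bigr\}.
\]
% Robust least squares over \mathcal{U} equals an \ell_1-regularized
% problem (note the unsquared \ell_2 loss):
\[
  \min_{\beta}\; \max_{\Delta A \in \mathcal{U}}\;
    \bigl\| y - (A + \Delta A)\,\beta \bigr\|_2
  \;=\;
  \min_{\beta}\; \Bigl\{ \| y - A\beta \|_2
    + \textstyle\sum_{i=1}^{m} c_i \, |\beta_i| \Bigr\}.
\]
```

Taking c_i equal to a common constant c recovers the standard Lasso penalty c‖β‖1, while choosing a different uncertainty set U yields the convex generalizations of Lasso mentioned above.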