Distributed sparse linear regression

  • Authors:
  • Gonzalo Mateos; Juan Andrés Bazerque; Georgios B. Giannakis

  • Affiliations:
  • Department of Electrical and Computer Engineering, University of Minnesota, Minneapolis, MN (all authors)

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2010

Abstract

The Lasso is a popular technique for joint estimation and continuous variable selection, especially well suited for sparse and possibly under-determined linear regression problems. This paper develops algorithms to estimate the regression coefficients via Lasso when the training data are distributed across different agents and their communication to a central processing unit is prohibited due to, e.g., communication cost or privacy reasons. A motivating application is explored in the context of wireless communications, whereby sensing cognitive radios collaborate to estimate the radio-frequency power spectral density. Attaining different tradeoffs between complexity and convergence speed, three novel algorithms are obtained by reformulating the Lasso into a separable form, which is iteratively minimized using the alternating direction method of multipliers (ADMM) so as to gain the desired degree of parallelization. Interestingly, the per-agent estimate updates are given by simple soft-thresholding operations, and the inter-agent communication overhead remains at an affordable level. Without exchanging elements of the different training sets, the local estimates converge to the global Lasso solution, i.e., the fit that would be obtained if the entire data set were centrally available. Numerical experiments with both simulated and real data demonstrate the merits of the proposed distributed schemes, corroborating their convergence and global optimality. The ideas in this paper can be readily extended to fit related models in a distributed fashion, including the adaptive Lasso, elastic net, fused Lasso, and nonnegative garrote.
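
To illustrate the algorithmic structure described above (closed-form local updates followed by a soft-thresholding consensus step), the sketch below solves a consensus-form Lasso with ADMM. This is a minimal illustration, not the paper's algorithms: the function names (`soft_threshold`, `consensus_lasso_admm`), the penalty parameter `rho`, and the centrally averaged z-update are assumptions for exposition, whereas the paper's distributed schemes replace the averaging with in-network exchanges among neighboring agents.

```python
import numpy as np

def soft_threshold(v, tau):
    """Elementwise soft-thresholding: the proximal operator of tau * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def consensus_lasso_admm(A_list, b_list, lam, rho=1.0, iters=200):
    """Toy consensus-form Lasso via ADMM (illustrative, not the paper's method).

    Agent j holds its own data (A_list[j], b_list[j]); all agents must agree
    on a common coefficient vector z.  The z-update averages the local
    variables, standing in for an in-network consensus step.
    """
    J = len(A_list)
    p = A_list[0].shape[1]
    x = [np.zeros(p) for _ in range(J)]   # local estimates
    u = [np.zeros(p) for _ in range(J)]   # scaled dual variables
    z = np.zeros(p)                       # consensus (global) estimate

    # Pre-factor the per-agent regularized normal equations.
    chol = [np.linalg.cholesky(A.T @ A + rho * np.eye(p)) for A in A_list]
    Atb = [A.T @ b for A, b in zip(A_list, b_list)]

    for _ in range(iters):
        # Local least-squares updates, solved in closed form per agent.
        for j in range(J):
            rhs = Atb[j] + rho * (z - u[j])
            y = np.linalg.solve(chol[j], rhs)       # forward solve  L y = rhs
            x[j] = np.linalg.solve(chol[j].T, y)    # backward solve L^T x = y
        # Consensus update: average of local variables, then soft-thresholding.
        z = soft_threshold(
            np.mean([x[j] + u[j] for j in range(J)], axis=0), lam / (rho * J)
        )
        # Dual ascent on the consensus constraints x_j = z.
        for j in range(J):
            u[j] = u[j] + x[j] - z
    return z
```

The soft-thresholding step is where sparsity enters: it is the proximal operator of the ℓ1 penalty, which is why the per-agent updates reported in the abstract reduce to similarly simple thresholding operations.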