The predictive Lasso

  • Authors:
  • Minh-Ngoc Tran; David J. Nott; Chenlei Leng

  • Affiliations:
  • Department of Statistics and Applied Probability, National University of Singapore, Singapore 117546 and Australian School of Business, University of New South Wales, Sydney, Australia ...; Department of Statistics and Applied Probability, National University of Singapore, Singapore 117546; Department of Statistics and Applied Probability, National University of Singapore, Singapore 117546

  • Venue:
  • Statistics and Computing
  • Year:
  • 2012


Abstract

We propose a shrinkage procedure for simultaneous variable selection and estimation in generalized linear models (GLMs) with an explicit predictive motivation. The procedure estimates the coefficients by minimizing the Kullback-Leibler divergence of a set of predictive distributions to the corresponding predictive distributions for the full model, subject to an ℓ1 constraint on the coefficient vector. This results in the selection of a parsimonious model with predictive performance similar to that of the full model. Because it takes a form similar to the original Lasso problem for GLMs, our procedure can benefit from existing ℓ1-regularization path algorithms. Simulation studies and real data examples confirm the efficiency of our method in terms of predictive performance on future observations.
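To make the idea concrete, here is a minimal, hypothetical sketch for the special case of a Gaussian linear model with known variance (not the authors' general GLM implementation). In that case the Kullback-Leibler divergence between two Gaussian predictive densities with the same variance reduces to the squared difference of their linear predictors, so the predictive-Lasso criterion becomes an ordinary Lasso problem whose response is the full model's fitted values rather than the raw data. The coordinate-descent solver `lasso_cd` below is a generic implementation written for illustration:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator used in coordinate-descent Lasso."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=500):
    """Coordinate descent for: min_b (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature
    r = y - X @ b                       # current residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]         # remove coordinate j from the fit
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]         # add updated coordinate back
    return b

# Simulated data (illustrative values, not from the paper)
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta_true = np.array([3.0, 1.5, 0, 0, 2.0, 0, 0, 0, 0, 0])
y = X @ beta_true + rng.standard_normal(n)

# Step 1: fit the full model (OLS here, since the model is Gaussian).
beta_full = np.linalg.lstsq(X, y, rcond=None)[0]
y_hat_full = X @ beta_full

# Step 2: predictive Lasso = Lasso against the full model's fitted
# values, i.e. minimize the summed KL divergences subject to the
# l1 constraint (Lagrangian form with penalty lam).
beta_plasso = lasso_cd(X, y_hat_full, lam=0.2)
```

The key design point is step 2: the sparse model is chosen to track the full model's predictive distribution, not the noisy observations directly, which is what gives the method its predictive motivation.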