Generalized linear mixed models are a widely used tool for modeling longitudinal data. However, their use is typically restricted to a few covariates, because the presence of many predictors yields unstable estimates. The approach presented here fits generalized linear mixed models with an L1-penalty term that enforces variable selection and shrinkage simultaneously. A gradient ascent algorithm is proposed that maximizes the penalized log-likelihood, yielding models of reduced complexity. In contrast to common procedures, it can be used in high-dimensional settings where a large number of potentially influential explanatory variables is available. The method is investigated in simulation studies and illustrated on real data sets.