Practical variable selection for generalized additive models
Computational Statistics & Data Analysis
A new method for function estimation and variable selection, specifically designed for additive models fitted by cubic splines, is proposed. This method regularizes additive models with an ℓ1-norm penalty, which generalizes the lasso to the nonparametric setting. As in the linear case, it shrinks coefficients and sets some of them exactly to zero; it thus yields parsimonious models, selects significant variables, and reveals nonlinearities in the effects of predictors. Two strategies for computing a parsimonious additive model are proposed. Both algorithms are based on a fixed-point iteration, combined with a singular value decomposition that considerably reduces computation. The empirical behavior of parsimonious additive models is compared to that of the adaptive backfitting BRUTO algorithm. The results characterize the domains in which our approach is effective: it performs significantly better than BRUTO when model estimation is challenging. The method is illustrated on real data from the Cophar 1 ANRS 102 trial, where parsimonious additive models are used to predict the indinavir plasma concentration in HIV patients. The results suggest that this new method is a promising tool for both research and applications.
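The ℓ1-penalized additive model described above can be sketched with an adaptive-ridge fixed-point iteration, the reweighting device that is equivalent to lasso-type shrinkage. The basis construction, function names, penalty weighting, and tuning values below are illustrative assumptions, not the authors' implementation, and the SVD speedup mentioned in the abstract is omitted for brevity:

```python
import numpy as np

def cubic_spline_basis(x, knots):
    """Truncated-power basis for a cubic spline (an illustrative choice)."""
    cols = [x, x**2, x**3] + [np.maximum(x - k, 0.0)**3 for k in knots]
    return np.column_stack(cols)

def fit_sparse_additive(X, y, lam=1.0, n_knots=5, n_iter=200, eps=1e-8):
    """Adaptive-ridge sketch of an l1-penalized additive model.

    Each predictor gets its own centered cubic-spline basis; the
    reweighted ridge penalty shrinks the coefficient group of
    irrelevant predictors toward zero, mimicking lasso selection.
    """
    n, p = X.shape
    bases, groups, col = [], [], 0
    for j in range(p):
        knots = np.quantile(X[:, j], np.linspace(0.1, 0.9, n_knots))
        B = cubic_spline_basis(X[:, j], knots)
        B = B - B.mean(axis=0)              # center each additive component
        bases.append(B)
        groups.append(np.arange(col, col + B.shape[1]))
        col += B.shape[1]
    Z = np.hstack(bases)
    intercept = y.mean()
    r = y - intercept
    # plain ridge fit as the starting point of the fixed-point iteration
    beta = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ r)
    for _ in range(n_iter):
        # adaptive-ridge weights: coefficients in a group with a small
        # norm receive a heavy penalty, driving the whole group to zero
        w = np.concatenate([
            np.full(len(g), max(np.linalg.norm(beta[g]), eps))
            for g in groups])
        beta_new = np.linalg.solve(Z.T @ Z + lam * np.diag(1.0 / w),
                                   Z.T @ r)
        if np.max(np.abs(beta_new - beta)) < 1e-10:
            beta = beta_new
            break
        beta = beta_new
    return intercept, beta, groups, Z
```

On synthetic data where only the first predictor matters, the group of coefficients attached to that predictor retains a large norm while the others are shrunk toward zero, which is the variable-selection behavior the abstract describes.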