In many situations the number of data points is fixed, so the asymptotic convergence guarantees of popular model selection tools may not apply. A new model selection algorithm, RIVAL (removing irrelevant variables amidst Lasso iterations), is presented and shown to be particularly effective when the number of data points is large but fixed. The algorithm is motivated by an application to nuclear material detection, in which all unknown parameters must be non-negative. Positive Lasso and its variants are therefore analyzed first. RIVAL is then proposed and shown to have desirable properties; in particular, the number of data points needed for convergence is smaller than that required by existing methods.
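The non-negativity constraint central to this setting can be illustrated with a positive Lasso fit. The sketch below uses scikit-learn's `Lasso` with `positive=True` on synthetic data with a sparse non-negative ground truth; it is an illustration of the constrained Lasso only, not of the RIVAL algorithm itself, whose iteration and removal steps are not specified in this abstract. The problem sizes and regularization strength are arbitrary choices for the example.

```python
# Sketch of positive (nonnegative) Lasso, the building block analyzed in the
# paper. This is NOT the RIVAL algorithm; it only demonstrates enforcing
# beta >= 0 in the Lasso fit, as required when all unknown parameters are
# non-negative (e.g., the nuclear material detection application).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 10                       # fixed number of data points, many regressors
X = rng.normal(size=(n, p))

beta_true = np.zeros(p)
beta_true[:3] = [2.0, 1.5, 0.5]      # sparse, non-negative ground truth
y = X @ beta_true + 0.1 * rng.normal(size=n)

# positive=True restricts all estimated coefficients to be >= 0
model = Lasso(alpha=0.05, positive=True)
model.fit(X, y)
coef = model.coef_

print("estimated coefficients:", np.round(coef, 3))
print("selected variables:", np.nonzero(coef)[0])
```

The constraint never produces negative estimates, so irrelevant variables can only be driven exactly to zero rather than to small negative values, which is what makes the positive Lasso a natural starting point for a variable-removal scheme.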