This paper develops a fast algorithm for selecting relevant predictors for the response process with panel count data. Starting from a lasso-penalized pseudo-objective function derived from an estimating equation, a coordinate ascent algorithm accelerates estimation of the regression coefficients. The algorithm can select relevant predictors even in underdetermined problems, where the number of predictors far exceeds the number of cases, and it depends on a single tuning constant that can be chosen by generalized cross-validation. Tests on simulated and real data demonstrate the value of penalized regression for model building and prediction with panel count data in ultrahigh-dimensional settings.
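The abstract does not give the paper's pseudo-objective, but the core computational idea, cyclic coordinatewise updates of a lasso-penalized objective via soft-thresholding, can be illustrated on a generic least-squares stand-in. The function names and the least-squares loss below are assumptions for illustration, not the authors' estimating-equation-based objective:

```python
import numpy as np

def soft_threshold(z, gamma):
    # Soft-thresholding operator: closed-form solution of the
    # one-dimensional lasso subproblem.
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=100):
    # Cyclic coordinate descent for a lasso-penalized least-squares
    # loss (a stand-in for the paper's penalized pseudo-objective).
    # Each coefficient is updated in turn while the others are held
    # fixed, which scales to p >> n problems.
    n, p = X.shape
    beta = np.zeros(p)
    residual = y.astype(float).copy()
    col_norms = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Form the partial residual that excludes predictor j,
            # then solve the univariate lasso problem for beta[j].
            residual += X[:, j] * beta[j]
            rho = X[:, j] @ residual
            beta[j] = soft_threshold(rho, n * lam) / col_norms[j]
            residual -= X[:, j] * beta[j]
    return beta
```

In practice the updates would be run over a grid of values of the tuning constant `lam`, with the final value selected by generalized cross-validation as the abstract describes.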