Estimation in linear regression models with measurement errors subject to single-indexed distortion
Computational Statistics & Data Analysis
As promising alternatives to the LASSO, non-convex penalized methods such as the SCAD and the minimax concave penalty (MCP) produce asymptotically unbiased shrinkage estimates. Adopting non-convex penalties, in this paper we investigate uniform variable selection and shrinkage estimation for several parametric and semi-parametric models with a single-index structure. The new method does not require estimating the nonparametric transformation or link function involved. The resulting estimators enjoy the oracle property even in the "large p, small n" scenario. The theoretical results for linear models are extended in parallel to general single-index models with no distributional constraint on the error, at the cost of mild conditions on the predictors. Simulation studies examine the performance of the proposed method, and a real data analysis is presented for illustration.
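To illustrate the "asymptotically unbiased" claim for the non-convex penalties mentioned above, the following is a minimal sketch (not the paper's own implementation) of the standard SCAD penalty of Fan and Li and its univariate thresholding rule. The function names and the default concavity parameter a = 3.7 are illustrative conventions, not taken from this paper. The key point visible in the code: for large inputs (|z| > a*lam) the SCAD rule returns z unshrunk, whereas the lasso's soft thresholding always shrinks by lam, which is the source of the lasso's bias.

```python
import math

def scad_penalty(t, lam, a=3.7):
    """SCAD penalty evaluated at |t|: linear near zero, quadratic
    transition, then constant (so large coefficients are not penalized
    further)."""
    t = abs(t)
    if t <= lam:
        return lam * t
    if t <= a * lam:
        return (2 * a * lam * t - t ** 2 - lam ** 2) / (2 * (a - 1))
    return lam ** 2 * (a + 1) / 2

def scad_threshold(z, lam, a=3.7):
    """Solution of the univariate SCAD-penalized least-squares problem
    min_b 0.5*(z - b)^2 + scad_penalty(b, lam, a)."""
    s = math.copysign(1.0, z)
    az = abs(z)
    if az <= 2 * lam:
        # soft-threshold region: behaves like the lasso
        return s * max(az - lam, 0.0)
    if az <= a * lam:
        # transition region: shrinkage tapers off linearly
        return ((a - 1) * z - s * a * lam) / (a - 2)
    # beyond a*lam: no shrinkage at all, hence asymptotic unbiasedness
    return z
```

For example, with lam = 1 a large signal z = 5 passes through unchanged (`scad_threshold(5.0, 1.0)` returns 5.0), while soft thresholding would return 4.0; a small input z = 0.5 is set exactly to zero, which is what drives variable selection.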