We consider the estimation of regression coefficients in a high-dimensional linear model. For regression coefficients in ℓ_r balls, we provide lower bounds for the minimax ℓ_q risk and minimax quantiles of the ℓ_q loss for all design matrices. Under an ℓ_0 sparsity condition on a target coefficient vector, we sharpen and unify existing oracle inequalities for the Lasso and Dantzig selector. We derive oracle inequalities for target coefficient vectors with many small elements and with threshold levels smaller than the universal threshold. These oracle inequalities provide sufficient conditions on the design matrix for the rate minimaxity of the Lasso and Dantzig selector for the ℓ_q risk and loss in ℓ_r balls, 0 ≤ r ≤ 1 ≤ q ≤ ∞. By allowing q = ∞, our risk bounds imply the variable selection consistency of the thresholded Lasso and Dantzig selectors.
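For context, the two estimators discussed above admit the standard formulations below. This is a generic sketch with assumed notation (y the n-vector of responses, X the n×p design matrix, λ > 0 the threshold level), not the paper's own display:

$$\hat{\beta}^{\mathrm{Lasso}} \;=\; \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p} \left\{ \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2 \;+\; \lambda\,\lVert \beta \rVert_1 \right\},$$

$$\hat{\beta}^{\mathrm{Dantzig}} \;=\; \operatorname*{arg\,min} \left\{ \lVert \beta \rVert_1 \;:\; \frac{1}{n}\,\lVert X^{\top}(y - X\beta) \rVert_\infty \;\le\; \lambda \right\}.$$

Under this normalization, the universal threshold referred to above takes the familiar form λ = σ√(2(log p)/n) for noise level σ, and the thresholded variants select exactly the variables j whose fitted coefficients |β̂_j| exceed a given level.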