In the context of linear regression, the least absolute shrinkage and selection operator (LASSO) is probably the most popular supervised-learning technique for recovering sparse signals from high-dimensional measurements. Prior literature has mainly concerned itself with independent, identically distributed noise of moderate variance. In many real applications, however, the measurement errors may follow heavy-tailed distributions or contain severe outliers, in which case the LASSO estimates the coefficients poorly because of its sensitivity to large error variance. To address this concern, a robust version of the LASSO is proposed, and the limiting distribution of its estimator is derived. Model selection consistency is established for the proposed robust LASSO when the penalty weight is chosen adaptively. A parallel asymptotic analysis is derived for the Huberized LASSO, a previously proposed robust variant, and it is shown that the Huberized LASSO estimator retains similar asymptotics even under a Cauchy error distribution. We show that the asymptotic variances of the two robust LASSO estimators remain stable in the presence of large-variance noise, in contrast with the unbounded asymptotic variance of the ordinary LASSO estimator. The asymptotic analysis for nonstochastic designs is extended to the random-design case. Simulations further confirm the theoretical results.
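To make the contrast between the estimators concrete, the sketch below fits an ordinary LASSO, a robust (least-absolute-deviations) LASSO, and a Huberized LASSO on simulated data with Cauchy noise. This is a minimal illustration under stated assumptions, not the paper's implementation: it assumes the `cvxpy` modeling library is available, and the penalty weight `lam`, Huber threshold `M`, and the helper `fit` are illustrative choices rather than the adaptively tuned quantities analyzed in the abstract.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

# Illustrative sparse linear model with heavy-tailed (Cauchy) errors,
# which have no finite variance.
n, p = 200, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]            # sparse true coefficients
y = X @ beta_true + rng.standard_cauchy(n)

lam = 5.0    # illustrative penalty weight (not the paper's adaptive weight)
M = 1.345    # illustrative Huber threshold

def fit(loss_expr, beta):
    """Solve min_beta loss(residual) + lam * ||beta||_1 for a given loss."""
    cp.Problem(cp.Minimize(loss_expr + lam * cp.norm1(beta))).solve()
    return beta.value

# Ordinary LASSO: squared-error loss, sensitive to heavy-tailed noise.
b = cp.Variable(p)
beta_lasso = fit(0.5 * cp.sum_squares(X @ b - y), b)

# Robust (LAD) LASSO: absolute-deviation loss bounds the influence of outliers.
b = cp.Variable(p)
beta_lad = fit(cp.norm1(X @ b - y), b)

# Huberized LASSO: quadratic near zero, linear in the tails.
b = cp.Variable(p)
beta_huber = fit(cp.sum(cp.huber(X @ b - y, M)), b)

for name, est in [("lasso", beta_lasso), ("lad", beta_lad), ("huber", beta_huber)]:
    print(f"{name:>5}: {np.round(est, 2)}")
```

Under Cauchy noise the squared-error objective tends to be dominated by a few extreme residuals, while the LAD and Huber losses grow only linearly in the tails, so the two robust fits typically stay much closer to `beta_true`, in line with the stabilized asymptotic variances described above.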