Within the framework of statistical learning theory we analyze in detail the elastic-net regularization scheme proposed by Zou and Hastie [H. Zou, T. Hastie, Regularization and variable selection via the elastic net, J. R. Stat. Soc. Ser. B, 67(2) (2005) 301-320] for the selection of groups of correlated variables. To investigate the statistical properties of this scheme, and in particular its consistency properties, we set up a suitable mathematical framework. Our setting is random-design regression in which the response variable may be vector-valued and the prediction functions are linear combinations of elements (features) of an infinite-dimensional dictionary. Under the assumption that the regression function admits a sparse representation on the dictionary, we prove that there exists a particular "elastic-net representation" of the regression function such that, as the number of data points increases, the elastic-net estimator is consistent not only for prediction but also for variable/feature selection. Our results include finite-sample bounds and an adaptive scheme for selecting the regularization parameter. Moreover, using tools from convex analysis, we derive an iterative thresholding algorithm for computing the elastic-net solution that differs from the optimization procedure originally proposed in the above-cited work.
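
To illustrate the kind of iterative thresholding scheme mentioned in the abstract, the following is a minimal Python/NumPy sketch of proximal gradient (iterative soft-thresholding) descent on an elastic-net objective. The particular normalization of the objective, the parameter names lam1 and lam2, the step-size choice, and the synthetic data are illustrative assumptions, not the authors' exact formulation or constants.

    import numpy as np

    def soft_threshold(v, t):
        # Component-wise soft-thresholding: the proximal operator of t * ||.||_1
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def elastic_net_ista(X, y, lam1, lam2, n_iter=1000, tol=1e-8):
        # Minimize (1/n) ||y - X beta||^2 + lam1 ||beta||_1 + lam2 ||beta||_2^2.
        # The squared-l2 penalty is kept in the smooth part of the objective;
        # the l1 penalty is handled by the soft-thresholding (proximal) step.
        n, p = X.shape
        beta = np.zeros(p)
        # Lipschitz constant of the gradient of the smooth part
        L = 2.0 * np.linalg.norm(X, 2) ** 2 / n + 2.0 * lam2
        for _ in range(n_iter):
            grad = 2.0 * X.T @ (X @ beta - y) / n + 2.0 * lam2 * beta
            beta_new = soft_threshold(beta - grad / L, lam1 / L)
            if np.linalg.norm(beta_new - beta) <= tol * max(np.linalg.norm(beta), 1.0):
                beta = beta_new
                break
            beta = beta_new
        return beta

    # Small synthetic check: sparse ground truth with a pair of correlated columns
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))
    X[:, 1] = X[:, 0] + 0.01 * rng.standard_normal(200)      # strongly correlated pair
    beta_true = np.zeros(50)
    beta_true[:3] = [2.0, 2.0, -1.5]
    y = X @ beta_true + 0.1 * rng.standard_normal(200)
    beta_hat = elastic_net_ista(X, y, lam1=0.05, lam2=0.01)
    print("non-zero coefficients:", np.flatnonzero(np.abs(beta_hat) > 1e-3))

In this sketch the squared-l2 term both strictly convexifies the problem and, as in the grouping effect discussed in the abstract's setting, tends to assign similar coefficients to the two correlated columns rather than arbitrarily selecting one of them.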