In this work we are interested in the problems of supervised learning and variable selection when the input-output dependence is described by a nonlinear function depending on a few variables. Our goal is to consider a sparse nonparametric model, hence avoiding linear or additive models. The key idea is to measure the importance of each variable in the model by means of its partial derivatives. Based on this intuition, we propose a new notion of nonparametric sparsity and a corresponding least squares regularization scheme. Using concepts and results from the theory of reproducing kernel Hilbert spaces and proximal methods, we show that the proposed learning algorithm corresponds to a minimization problem that can be provably solved by an iterative procedure. The consistency properties of the resulting estimator are studied both in terms of prediction and selection performance. An extensive empirical analysis shows that the proposed method compares favorably with state-of-the-art methods.
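The iterative procedure referred to above belongs to the family of proximal (forward-backward splitting) methods. As a minimal illustration of how such an iteration works, the sketch below implements plain ISTA (iterative soft-thresholding) for an ℓ1-regularized least squares problem; this is a generic stand-in, not the paper's derivative-based regularizer, and all names and parameter values (`lam`, `n_iter`, the toy data) are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """ISTA for min_w 0.5 * ||A w - y||^2 + lam * ||w||_1.

    Each step is a gradient step on the smooth least-squares term
    followed by the proximal step for the nonsmooth penalty.
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ w - y)           # gradient of the quadratic term
        w = soft_threshold(w - grad / L, lam / L)
    return w

# Toy problem: only the first two of ten coefficients are nonzero.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
w_true = np.zeros(10)
w_true[:2] = [3.0, -2.0]
y = A @ w_true
w_hat = ista(A, y, lam=0.1)
```

On this noiseless toy problem the iteration recovers the two active coefficients and drives the remaining eight essentially to zero, which is the sparsity-inducing behavior the regularization scheme relies on.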