We present a data-dependent generalization bound for a large class of regularized algorithms that implement structured sparsity constraints. The bound applies to standard squared-norm regularization, the Lasso, the group Lasso, some versions of the group Lasso with overlapping groups, multiple kernel learning, and other regularization schemes. In all these cases it is competitive with existing results. A novel feature of our bound is that it holds in infinite-dimensional settings, such as the Lasso in a separable Hilbert space or multiple kernel learning with a countable number of kernels.
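To fix notation, all of the schemes above can be read as instances of norm-constrained empirical risk minimization. The following is a minimal illustrative sketch, not the paper's exact framework: it assumes a Lipschitz loss $\ell$, a generic regularizer $\Omega$ with budget $B$, and uses $C$ as a placeholder for the data-dependent complexity term that the analysis makes precise for each choice of $\Omega$:

\[
\hat w \in \operatorname*{arg\,min}_{\Omega(w)\le B}\ \frac{1}{n}\sum_{i=1}^{n}\ell\big(\langle w,x_i\rangle,y_i\big),
\qquad
\Omega(w)=
\begin{cases}
\|w\|_2^2 & \text{(squared-norm regularization)}\\[2pt]
\|w\|_1 & \text{(Lasso)}\\[2pt]
\sum_{g\in\mathcal G}\|w_g\|_2 & \text{(group Lasso; overlapping groups allowed)}
\end{cases}
\]

A standard Rademacher-complexity argument then gives, with probability at least $1-\delta$ over the sample, a bound of the generic shape

\[
\mathbb E\,\ell\big(\langle \hat w,x\rangle,y\big)\ \le\ \frac{1}{n}\sum_{i=1}^{n}\ell\big(\langle \hat w,x_i\rangle,y_i\big)
\;+\;\frac{C(\Omega,B,x_1,\dots,x_n)}{\sqrt n}
\;+\;3\sqrt{\frac{\log(2/\delta)}{2n}},
\]

where the factor $C$ absorbs the empirical Rademacher complexity of the constrained class. Deriving an explicit, data-dependent $C$ for the structured-sparsity norms above is the content of the bound; because the complexity is controlled through the norm rather than the ambient dimension, the same shape survives the infinite-dimensional settings mentioned, such as the Lasso in a separable Hilbert space or countably many kernels.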