We consider the least-squares regression problem with regularization by a block l1-norm, that is, a sum of Euclidean norms over spaces of dimension larger than one. This problem, referred to as the group Lasso, extends the usual regularization by the l1-norm, where all spaces have dimension one and the problem is commonly referred to as the Lasso. In this paper, we study the asymptotic group selection consistency of the group Lasso. We derive necessary and sufficient conditions for the consistency of the group Lasso under practical assumptions, such as model misspecification. When the linear predictors and Euclidean norms are replaced by functions and reproducing kernel Hilbert norms, the problem is usually referred to as multiple kernel learning and is commonly used for learning from heterogeneous data sources and for nonlinear variable selection. Using tools from functional analysis, and in particular covariance operators, we extend the consistency results to this infinite-dimensional case and also propose an adaptive scheme to obtain a consistent model estimate, even when the necessary condition required for the non-adaptive scheme is not satisfied.
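To make the block l1-norm penalty concrete, the following is a minimal sketch of the group Lasso objective solved by proximal gradient descent with block soft-thresholding. This is an illustrative solver only, not the paper's estimator or analysis; the function name, step-size choice, and group partition are assumptions made for the example.

```python
import numpy as np

def group_lasso_prox_grad(X, y, groups, lam, n_iter=500):
    """Illustrative proximal gradient method for the group Lasso:

        min_w  (1/(2n)) ||y - X w||^2  +  lam * sum_g ||w_g||_2

    `groups` is a list of index arrays partitioning the features.
    """
    n, d = X.shape
    # step size 1/L, where L is the Lipschitz constant of the smooth part
    lr = n / (np.linalg.norm(X, 2) ** 2)
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n
        v = w - lr * grad
        # block soft-thresholding: shrink each group's Euclidean norm,
        # zeroing the whole group when it falls below the threshold
        for g in groups:
            norm = np.linalg.norm(v[g])
            if norm <= lr * lam:
                v[g] = 0.0
            else:
                v[g] *= 1.0 - lr * lam / norm
        w = v
    return w
```

The proximal step acts on each group's Euclidean norm rather than on individual coordinates, so entire groups are set to zero at once; this all-or-nothing behavior is exactly the group selection whose consistency the abstract discusses.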