The Group-Lasso for generalized linear models: uniqueness of solutions and efficient algorithms
Proceedings of the 25th international conference on Machine learning
The l1,∞ norm and the l1,2 norm are well-known tools for joint regularization in Group-Lasso methods. While the l1,2 version has been studied in detail, open questions remain regarding the uniqueness of solutions and the efficiency of algorithms for the l1,∞ variant. For the latter, we characterize the conditions for uniqueness of solutions, present a simple test for uniqueness, and derive a highly efficient active-set algorithm that can handle input dimensions in the millions. We compare both variants of the Group-Lasso in its two most common application scenarios: the first is obtaining sparsity at the level of groups in "standard" prediction problems, the second is multi-task learning, where the aim is to solve many learning problems in parallel that are coupled via the Group-Lasso constraint. We show that both versions perform very similarly in "standard" applications. However, a clear distinction between the variants emerges in multi-task settings, where the l1,2 version consistently outperforms its l1,∞ counterpart in terms of prediction accuracy.
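The two penalties compared in the abstract can be illustrated with a minimal sketch: given a coefficient matrix whose rows are the groups, the l1,2 penalty sums the Euclidean norm of each group, while the l1,∞ penalty sums the maximum absolute coefficient of each group. The weight matrix below is a hypothetical example, not data from the paper.

```python
import numpy as np

# Hypothetical coefficient matrix: each row is one group
# (or one feature shared across tasks in the multi-task setting).
W = np.array([[1.0, -2.0, 0.5],
              [0.0,  3.0, 1.0]])

# l1,2 penalty: sum over groups of each group's Euclidean (l2) norm.
l1_2 = np.sum(np.linalg.norm(W, ord=2, axis=1))

# l1,inf penalty: sum over groups of each group's largest absolute entry.
l1_inf = np.sum(np.max(np.abs(W), axis=1))
```

Both penalties drive entire rows of `W` to zero at once, which is what produces group-level sparsity; they differ in how coefficients within a surviving group are shaped, which is where the behavioral differences discussed in the abstract arise.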