Joint sparsity is widely acknowledged as a powerful structural cue for performing feature selection in settings where variables are expected to exhibit "grouped" behavior. Such grouped behavior is commonly modeled by Group-Lasso or Multitask-Lasso-type problems, where feature selection is effected via l1,q mixed-norms. Several particular formulations for modeling groupwise sparsity have received substantial attention in the literature, and in some cases efficient algorithms are also available. Surprisingly, for constrained formulations of fundamental importance (e.g., regression with an l1,∞-norm constraint), highly scalable methods seem to be missing. We address this deficiency by presenting a method based on spectral projected gradient (SPG) that can tackle l1,q-constrained convex regression problems. The most crucial component of our method is an algorithm for projecting onto l1,q-norm balls. We present several numerical results showing that our methods attain up to 30× speedups on large l1,∞ multitask lasso problems. Even more dramatic are the gains for the l1,∞ projection subproblem alone: we observe almost three orders of magnitude speedups compared with the currently standard method.
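To illustrate the kind of norm-ball projection subroutine the abstract refers to, here is a minimal NumPy sketch of Euclidean projection onto an l1-ball, the simplest member of the l1,q family. This is the standard O(n log n) sort-based scheme built on projection onto the canonical simplex; it is not the paper's l1,∞ algorithm, and the function name `project_l1_ball` is our own choice for this sketch.

```python
import numpy as np

def project_l1_ball(v, radius=1.0):
    """Euclidean projection of v onto the l1-ball {x : ||x||_1 <= radius}.

    Classic sort-based scheme: find the soft-threshold level theta so that
    shrinking |v| by theta lands exactly on the ball boundary.
    """
    if np.abs(v).sum() <= radius:
        return v.copy()                          # already inside the ball
    u = np.sort(np.abs(v))[::-1]                 # magnitudes, descending
    css = np.cumsum(u)                           # cumulative sums of u
    k = np.arange(1, len(v) + 1)
    rho = np.nonzero(u * k > css - radius)[0][-1]  # last active index
    theta = (css[rho] - radius) / (rho + 1.0)    # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)
```

For example, projecting the point (2, -1, 0.5) onto the unit l1-ball soft-thresholds all coordinates by theta = 1, yielding (1, 0, 0), which lies exactly on the ball's boundary. The projection step of an SPG iteration would call such a routine once per iteration, which is why making it fast matters for the overall method.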