We develop a cyclical blockwise coordinate descent algorithm for the multi-task Lasso that efficiently solves problems with thousands of features and tasks. The main result shows that a closed-form Winsorization operator can be obtained for sup-norm penalized least squares regression. This allows the algorithm to find solutions to very large-scale problems far more efficiently than existing methods. This result complements the pioneering work of Friedman et al. (2007) for the single-task Lasso. As a case study, we use the multi-task Lasso as a variable selector to discover a semantic basis for predicting human neural activation. The learned solution outperforms the standard basis for this task on the majority of test participants, while requiring far fewer assumptions about cognitive neuroscience. We demonstrate how this learned basis can yield insights into how the brain represents the meanings of words.
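To make the abstract's key ingredients concrete, the sketch below illustrates the two ideas in NumPy: a closed-form Winsorization (clipping) operator for the sup-norm proximal subproblem, and a cyclical blockwise coordinate descent loop that applies it one feature row at a time. This is a minimal illustration under stated assumptions, not the authors' implementation; the names `prox_sup_norm`, `multitask_lasso_bcd`, and all parameters are ours.

```python
import numpy as np

def prox_sup_norm(c, lam):
    """Closed-form Winsorization operator: solves
        argmin_w  0.5 * ||w - c||_2^2 + lam * ||w||_inf.
    By Moreau decomposition this equals c minus its projection onto the
    lam-radius l1 ball, i.e. entries of c are clipped at a threshold t."""
    a = np.abs(c)
    if a.sum() <= lam:          # whole block shrinks to zero
        return np.zeros_like(c)
    u = np.sort(a)[::-1]        # magnitudes in decreasing order
    css = np.cumsum(u)
    k = np.arange(1, u.size + 1)
    rho = np.max(np.nonzero(u > (css - lam) / k)[0])
    t = (css[rho] - lam) / (rho + 1)
    return np.sign(c) * np.minimum(a, t)   # Winsorize: clip at t

def multitask_lasso_bcd(X, Y, lam, n_iter=100):
    """Cyclical blockwise coordinate descent for the multi-task Lasso
        min_W  0.5 * ||Y - X W||_F^2 + lam * sum_j ||W[j, :]||_inf,
    where each block is one row of W (one feature's coefficients
    across all tasks)."""
    n, d = X.shape
    T = Y.shape[1]
    W = np.zeros((d, T))
    R = Y - X @ W                       # current residual
    sq = (X ** 2).sum(axis=0)           # per-feature squared norms
    for _ in range(n_iter):
        for j in range(d):
            if sq[j] == 0.0:
                continue
            R += np.outer(X[:, j], W[j])   # remove feature j's contribution
            c = X[:, j] @ R                # per-task correlations
            W[j] = prox_sup_norm(c, lam) / sq[j]
            R -= np.outer(X[:, j], W[j])   # restore residual
    return W
```

Because the row update has this closed form, each sweep costs only one pass over the features, which is what makes the cyclical scheme practical at the scale (thousands of features and tasks) the abstract describes.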