We address the problem of joint feature selection in multiple related classification or regression tasks. When selecting features for multiple tasks, one can often "borrow strength" across the tasks to obtain a more sensitive criterion for deciding which features to include. We propose a novel method, the Multiple Inclusion Criterion (MIC), which modifies stepwise feature selection so that features helpful across multiple tasks are easier to select. Our approach allows each feature to be added to none, some, or all of the tasks. MIC is most beneficial when selecting a small set of predictive features from a large pool of candidates, as is common in genomic and biological datasets. Experimental results on such datasets show that MIC usually outperforms competing multi-task learning methods, not only in accuracy but also by building simpler and more interpretable models.
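The abstract describes MIC only at a high level. The sketch below is a minimal, illustrative version of greedy forward stepwise feature selection shared across tasks, in which each candidate feature may be added to none, some, or all tasks; it is not the authors' exact criterion. The function name stepwise_multitask_selection and the flat per-(feature, task) penalty are assumptions standing in for MIC's actual coding-cost penalty.

```python
import numpy as np

def stepwise_multitask_selection(X, Y, max_features=10, penalty=2.0):
    """Greedy forward stepwise feature selection shared across tasks.

    X: (n_samples, n_features) design matrix shared by all tasks.
    Y: (n_samples, n_tasks) response matrix, one column per task.
    penalty: flat cost charged whenever a feature is added to one
             task's model (a crude stand-in for MIC's coding cost).
    Returns a boolean (n_features, n_tasks) inclusion matrix.
    """
    n, d = X.shape
    _, k = Y.shape
    included = np.zeros((d, k), dtype=bool)

    def task_rss(j):
        # Residual sum of squares for task j using its selected features.
        cols = np.where(included[:, j])[0]
        if cols.size == 0:
            return np.sum((Y[:, j] - Y[:, j].mean()) ** 2)
        A = X[:, cols]
        beta, *_ = np.linalg.lstsq(A, Y[:, j], rcond=None)
        return np.sum((Y[:, j] - A @ beta) ** 2)

    for _ in range(max_features):
        base = np.array([task_rss(j) for j in range(k)])
        best = None  # (total gain, feature index, tasks to add it to)
        for f in range(d):
            gains = []
            for j in range(k):
                if included[f, j]:
                    gains.append(0.0)
                    continue
                included[f, j] = True          # tentatively add f to task j
                gains.append(base[j] - task_rss(j) - penalty)
                included[f, j] = False
            gains = np.array(gains)
            tasks = gains > 0                  # keep f only where it pays for itself
            score = gains[tasks].sum()
            if tasks.any() and (best is None or score > best[0]):
                best = (score, f, tasks)
        if best is None:                       # no feature improves the penalized fit
            break
        _, f, tasks = best
        included[f, tasks] = True
    return included
```

In this sketch the penalty is a fixed constant per (feature, task) inclusion; in MIC it is derived from a description-length argument, so the cost of a feature can be shared across the tasks that use it, which is one way the method "borrows strength" across tasks as described above.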