Learning explanatory features across multiple related tasks, or Multi-Task Feature Selection (MTFS), is an important problem in data mining, machine learning, and bioinformatics applications. Previous MTFS methods rely on batch-mode training, which makes them inefficient when data arrive sequentially or when the training set is too large to fit in memory. To address these problems, we propose a novel online learning framework for MTFS. A key advantage of the online algorithms is their efficiency in both time and memory. At each iteration, the weights of the MTFS models are updated in closed form from the average of all previous subgradients. This yields worst-case per-iteration time and memory costs of O(d × Q), where d is the number of feature dimensions and Q is the number of tasks. Moreover, we analyze the average regret of the online algorithms, which also guarantees their convergence rate. Finally, we conduct detailed experiments to demonstrate the characteristics and merits of the online algorithms on several MTFS problems.
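The closed-form update from averaged subgradients described above has the structure of a dual-averaging-style step with a mixed ℓ1/ℓ2 norm coupling the tasks. The following is a minimal illustrative sketch of such an update in Python; the logistic loss, the γ/√t step-size rule, and the names `online_mtfs` and `group_shrink` are assumptions made for illustration, not the paper's exact formulation.

```python
import numpy as np

def group_shrink(g_bar, t, lam, gamma):
    """Closed-form dual-averaging-style update (illustrative).

    Each row of g_bar holds one feature's averaged subgradients across the
    Q tasks. Rows whose group norm falls below lam are set exactly to zero,
    which is what produces joint (multi-task) feature selection."""
    d, Q = g_bar.shape
    W = np.zeros((d, Q))
    norms = np.linalg.norm(g_bar, axis=1)          # per-feature group norm
    active = norms > lam                           # features that survive
    scale = -(np.sqrt(t) / gamma) * (1.0 - lam / norms[active])
    W[active] = scale[:, None] * g_bar[active]
    return W

def online_mtfs(stream, d, Q, lam=0.1, gamma=1.0, T=1000):
    """Online multi-task feature selection sketch based on averaged subgradients.

    stream yields (x, y, q): a feature vector x in R^d, a label y in {-1, +1},
    and a task index q. Per-iteration work is one gradient, one running
    average, and one closed-form shrinkage over all d x Q weights."""
    W = np.zeros((d, Q))
    g_bar = np.zeros((d, Q))
    for t, (x, y, q) in enumerate(stream, start=1):
        # Subgradient of the logistic loss for task q at the current W.
        margin = y * (W[:, q] @ x)
        g = np.zeros((d, Q))
        g[:, q] = -y * x / (1.0 + np.exp(margin))
        # Running average of all subgradients seen so far.
        g_bar += (g - g_bar) / t
        # Closed-form update from the averaged subgradient.
        W = group_shrink(g_bar, t, lam, gamma)
        if t >= T:
            break
    return W
```

In this sketch the O(d × Q) per-iteration time and memory bounds follow directly from the update: only the d × Q weight matrix and the d × Q averaged subgradient are stored, and features whose averaged-subgradient row norm stays below λ are zeroed out jointly across all Q tasks.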