Theoretical models of learning to learn
Learning to learn
Learning to learn
Rademacher and Gaussian complexities: risk bounds and structural results
The Journal of Machine Learning Research
Regularized multi-task learning
Proceedings of the Tenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Learning Multiple Tasks with Kernel Methods
The Journal of Machine Learning Research
A Framework for Learning Predictive Structures from Multiple Tasks and Unlabeled Data
The Journal of Machine Learning Research
Bounds for Linear Multi-Task Learning
The Journal of Machine Learning Research
A model of inductive bias learning
Journal of Artificial Intelligence Research
When Is There a Representer Theorem? Vector Versus Matrix Regularizers
The Journal of Machine Learning Research
Frustratingly easy semi-supervised domain adaptation
Proceedings of the 2010 Workshop on Domain Adaptation for Natural Language Processing (DANLP 2010)
Bounds are given for the empirical and expected Rademacher complexity of classes of linear transformations from a Hilbert space H to a finite-dimensional space. The results imply generalization guarantees for graph regularization and multi-task subspace learning.
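To make the abstract's central quantity concrete, the following is a minimal sketch (not taken from the paper) that estimates the empirical Rademacher complexity of the simplest such class, linear functionals of norm at most B on n sample points. It uses two standard facts: the supremum over the norm ball has the closed form (B/n)·‖Σᵢ σᵢ xᵢ‖, and Jensen's inequality gives the upper bound B·sqrt(Σᵢ ‖xᵢ‖²)/n. The sample sizes and the Monte Carlo setup are illustrative assumptions, not from the source.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, B = 200, 10, 1.0          # illustrative sample size, dimension, norm bound
X = rng.normal(size=(n, d))      # synthetic sample points in R^d

def empirical_rademacher(X, B, trials=2000, rng=rng):
    """Monte Carlo estimate of E_sigma sup_{||w||<=B} (1/n) sum_i sigma_i <w, x_i>.

    For the norm ball, the supremum has the closed form (B/n) * ||sum_i sigma_i x_i||.
    """
    n = X.shape[0]
    vals = []
    for _ in range(trials):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher signs
        vals.append(B / n * np.linalg.norm(sigma @ X))
    return float(np.mean(vals))

est = empirical_rademacher(X, B)
# Standard upper bound via Jensen's inequality (cross terms vanish in expectation):
bound = B * np.sqrt((X ** 2).sum()) / n
```

The estimate `est` stays below `bound`; the paper's contribution is analogous bounds for classes of linear maps into a finite-dimensional space rather than this scalar-valued special case.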