We show that data compression applied to independently obtained hypotheses from several tasks can algorithmically provide guarantees that the tasks are sufficiently related to benefit from multitask learning. We give uniform bounds on the true average error of the n hypotheses, stated in terms of their empirical average error, for deterministic learning algorithms that draw independent samples from a set of n unknown computable task distributions over finite sets.
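As an illustrative sketch only, not the paper's construction: a common computable stand-in for compression-based relatedness between hypotheses is the normalized compression distance (NCD), computed below with zlib over serialized hypotheses. The serialized rule strings, the choice of zlib, and any decision threshold are assumptions for the example.

```python
import zlib


def compressed_len(data: bytes) -> int:
    """Length of data after zlib compression, a computable proxy
    for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))


def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance between two byte strings.

    Values near 0 suggest shared structure (related tasks);
    values near 1 suggest little shared structure.
    """
    cx, cy, cxy = compressed_len(x), compressed_len(y), compressed_len(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)


# Hypothetical inputs: serialized hypotheses learned independently
# on different tasks (e.g., pickled parameters or rule strings).
h1 = b"if x[0] > 0.5 then 1 else 0" * 10
h2 = b"if x[0] > 0.6 then 1 else 0" * 10
h3 = b"lookup table: 0->1, 1->0, 2->1, 3->1" * 10

print(f"NCD(h1, h2) = {ncd(h1, h2):.3f}  # similar rules, expected lower")
print(f"NCD(h1, h3) = {ncd(h1, h3):.3f}  # dissimilar rules, expected higher")
```

The design choice here is that relatedness is read off the compressed lengths alone: if concatenating two hypotheses compresses nearly as well as the larger one by itself, the compressor has found shared regularity, which is the intuition behind using compression to certify that tasks are related.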