Machine Learning - Special issue on inductive transfer
A recent variant of multi-task learning uses other tasks to help in learning a task of interest for which there is too little training data. The task can be classification, prediction, or density estimation. The difficulty is that only some of the data from the other tasks are relevant to, or representative of, the task of interest. It has been demonstrated experimentally that a generative model performs well in this relevant subtask learning problem. In this paper we analyze the generalization error of the model, showing that it is smaller than that of standard alternatives, and we point out connections to semi-supervised learning, multi-task learning, and active learning or covariate shift.
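The setup can be made concrete with a toy density-estimation version. Below is a minimal sketch, assuming a one-dimensional Gaussian generative model: each auxiliary-task sample is treated as drawn either from the task-of-interest distribution (with shared parameters) or from a task-specific irrelevant component, and EM infers which, so that only the relevant auxiliary data sharpen the shared estimate. The function name `fit_relevant_subtask`, the model form, and all numbers are illustrative assumptions, not the paper's actual model.

```python
# A minimal sketch (not the paper's actual model) of relevant subtask
# learning as a generative mixture: auxiliary-task data is modeled as a
# mixture of the task-of-interest distribution (shared parameters) and a
# task-specific "irrelevant" distribution, fitted with EM.
import numpy as np

def log_gauss(x, mu, var):
    # Log-density of a univariate Gaussian, vectorized over x.
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def fit_relevant_subtask(x_target, x_aux, n_iter=100):
    """EM for: target ~ N(mu, var); aux ~ pi*N(mu, var) + (1-pi)*N(mu_a, var_a)."""
    mu, var = x_target.mean(), x_target.var() + 1e-6
    mu_a, var_a = x_aux.mean(), x_aux.var() + 1e-6
    pi = 0.5  # prior probability that an auxiliary sample is relevant
    for _ in range(n_iter):
        # E-step: responsibility of the shared (relevant) component
        # for each auxiliary sample.
        log_r = np.log(pi) + log_gauss(x_aux, mu, var)
        log_i = np.log(1 - pi) + log_gauss(x_aux, mu_a, var_a)
        r = 1.0 / (1.0 + np.exp(log_i - log_r))
        # M-step: shared parameters pool the target data with the
        # auxiliary data, weighted by relevance.
        w = np.concatenate([np.ones_like(x_target), r])
        x = np.concatenate([x_target, x_aux])
        mu = np.average(x, weights=w)
        var = np.average((x - mu) ** 2, weights=w) + 1e-6
        w_i = 1.0 - r
        if w_i.sum() > 1e-12:
            mu_a = np.average(x_aux, weights=w_i)
            var_a = np.average((x_aux - mu_a) ** 2, weights=w_i) + 1e-6
        pi = r.mean()
    return mu, var, pi

rng = np.random.default_rng(0)
x_target = rng.normal(0.0, 1.0, size=20)            # scarce task-of-interest data
x_aux = np.concatenate([rng.normal(0.0, 1.0, 150),  # relevant auxiliary samples
                        rng.normal(4.0, 1.0, 150)]) # irrelevant auxiliary samples
print(fit_relevant_subtask(x_target, x_aux))
```

The intuition behind the smaller generalization error is visible here: once the irrelevant component absorbs the off-distribution auxiliary samples, the shared mean and variance are effectively estimated from the 20 target points plus roughly 150 relevant auxiliary points, rather than from the 20 target points alone or from all 320 points indiscriminately.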