We introduce hierarchical kFOIL as a simple extension of the multitask kFOIL learning algorithm. The algorithm first learns a core logic representation common to all tasks, and then refines it by specialization on a per-task basis. The approach is easily generalized to a deeper hierarchy of tasks. A task clustering algorithm is also proposed in order to automatically generate the task hierarchy. The approach is validated on problems of drug-resistance mutation prediction and protein structural classification. Experimental results show the advantage of the hierarchical version over both single-task and multi-task alternatives, and its potential usefulness in providing explanatory features for the domain. Task clustering further improves performance when a deeper hierarchy is considered.
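The shared-then-specialized scheme described above can be illustrated with a minimal sketch. This is not the kFOIL algorithm itself (which learns relational clauses as kernel features); it is a hypothetical toy analogue in which the "core representation" is a constant model fit on data pooled across tasks, and per-task specialization fits an offset on each task's residuals:

```python
# Hedged sketch of the two-level scheme from the abstract: fit a model shared
# by all tasks, then refine it per task. All names here are illustrative
# assumptions, not part of the actual kFOIL implementation.

def fit_mean(values):
    """Least-squares constant predictor: the mean of the targets."""
    return sum(values) / len(values)

def fit_hierarchical(tasks):
    """Two-level model: a shared mean plus a per-task offset on residuals.

    tasks maps a task name to its list of target values.
    """
    pooled = [y for ys in tasks.values() for y in ys]
    shared = fit_mean(pooled)  # core common to all tasks
    offsets = {
        name: fit_mean([y - shared for y in ys])  # per-task specialization
        for name, ys in tasks.items()
    }
    return shared, offsets

def predict(shared, offsets, task):
    """Predict for a task by combining the shared core and its refinement."""
    return shared + offsets[task]

tasks = {"taskA": [1.0, 2.0, 3.0], "taskB": [7.0, 8.0, 9.0]}
shared, offsets = fit_hierarchical(tasks)
print(predict(shared, offsets, "taskA"))  # 2.0
print(predict(shared, offsets, "taskB"))  # 8.0
```

Extending the hierarchy to more levels amounts to repeating the refinement step within clusters of tasks, which is where the proposed task clustering would supply the grouping.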