Attribute-Value Learning Versus Inductive Logic Programming: The Missing Links (Extended Abstract)
ILP '98 Proceedings of the 8th International Workshop on Inductive Logic Programming
Traditionally, inductive learning algorithms such as decision tree learners have employed attribute-value representations, which are essentially propositional. While learning in first-order logic has been studied for almost 20 years, this research has mostly produced completely new learning algorithms rather than first-order upgrades of propositional learning algorithms. To re-establish the link between propositional and first-order learning, we have to focus on individual-centered representations. This short paper is devoted to the nature of first-order individual-centered representations for inductive learning. I discuss three possible perspectives: representing individuals as Herbrand interpretations, representing datasets as an individual-centered database, and representing individuals as terms.
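To make the three perspectives concrete, the following sketch encodes one and the same toy individual (a hypothetical train with two cars, in the spirit of the East-West trains domain) in each of the three representations; all predicate names, keys, and data are invented for illustration and are not taken from the paper.

```python
# One individual -- a toy "train" t1 with two cars -- in three representations.
# All names and attributes here are hypothetical illustrations.

# 1. The individual as a Herbrand interpretation: a set of ground facts
#    that are true of this individual.
herbrand = {
    ("car", "c1"), ("car", "c2"),
    ("shape", "c1", "rectangle"), ("shape", "c2", "oval"),
    ("load", "c1", "circle"), ("load", "c2", "triangle"),
}

# 2. The dataset as an individual-centered database: relations in which
#    every fact is (transitively) linked to an individual identifier, here "t1".
database = {
    "train":   [("t1",)],
    "has_car": [("t1", "c1"), ("t1", "c2")],
    "shape":   [("c1", "rectangle"), ("c2", "oval")],
    "load":    [("c1", "circle"), ("c2", "triangle")],
}

# 3. The individual as a term: a single nested ground structure (a list of
#    car records), where identifiers like "c1" become unnecessary.
term = [
    {"shape": "rectangle", "load": "circle"},
    {"shape": "oval", "load": "triangle"},
]

# The same query -- "does some car carry a circle?" -- in each representation:
def circle_in_herbrand(facts):
    return any(f[0] == "load" and f[2] == "circle" for f in facts)

def circle_in_db(db):
    return any(load == "circle" for _car, load in db["load"])

def circle_in_term(cars):
    return any(car["load"] == "circle" for car in cars)

print(circle_in_herbrand(herbrand), circle_in_db(database), circle_in_term(term))
```

The point of the sketch is that all three carry the same information about the individual; they differ in whether facts are grouped into a local interpretation, linked through keys in a database, or nested inside one ground term.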