Empirical and theoretical analysis of relational concept learning using a graph-based representation
We compare our graph-based relational concept learning system, SubdueCL, with the ILP systems FOIL and Progol. To keep the comparison fair, we use the conceptual graphs representation, which has a standard translation into logic; this introduces less bias during the translation process. We experiment with several types of domains. First, we use an artificial domain to show how SubdueCL performs with the conceptual graphs representation. Second, we experiment with several flat and relational domains. The results of the comparison show that SubdueCL is competitive with the ILP systems on both flat and relational domains.
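The translation mentioned above can be illustrated with a minimal sketch: each concept node maps to a unary atom over a fresh variable, and each relation maps to an atom over the variables of the concepts it links. The function name, data layout, and labels below are hypothetical illustrations, not SubdueCL's actual interface.

```python
def cg_to_logic(concepts, relations):
    """Translate a tiny conceptual graph into a logic formula (sketch).

    concepts:  {node_id: type_label}
    relations: [(relation_label, [node_ids in argument order])]
    """
    # Assign one fresh variable per concept node.
    var = {node: f"X{i}" for i, node in enumerate(sorted(concepts))}
    # Concept types become unary atoms; relations become n-ary atoms.
    atoms = [f"{label.lower()}({var[node]})"
             for node, label in sorted(concepts.items())]
    atoms += [f"{rel.lower()}({', '.join(var[n] for n in args)})"
              for rel, args in relations]
    return " & ".join(atoms)

# A tiny graph: [Cat] -> (on) -> [Mat]
print(cg_to_logic({1: "Cat", 2: "Mat"}, [("on", [1, 2])]))
# → cat(X0) & mat(X1) & on(X0, X1)
```

The point is that the mapping is direct and mechanical, which is why translating through conceptual graphs adds little representational bias on either the graph or the logic side.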