On the problem of reversing relational inductive knowledge representation
ECSQARU'13 Proceedings of the 12th European conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty
The principle of maximum entropy inductively completes the knowledge given by a knowledge base $\mathcal R$, and it has been suggested to view learning as the operation inverse to this inductive completion. While a corresponding learning approach has been developed for the case where $\mathcal R$ is based on propositional logic, in this paper we describe an extension to a relational setting. From a given probability distribution, it learns relational FO-PCL knowledge bases containing both generic conditionals and specific conditionals that refer to exceptional individuals.
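To make the notion of inductive completion concrete, the following is a minimal sketch of the propositional case the abstract builds on: given a single probabilistic conditional $(B|A)[0.9]$, the maximum-entropy distribution over the four possible worlds is the one that satisfies $P(B|A) = 0.9$ while being otherwise maximally unbiased. The sketch uses a generic constrained optimizer from SciPy rather than the paper's FO-PCL machinery; the variable names and the 0.9 probability are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Possible worlds over two propositional atoms A, B, in the order
# (A,B), (A,not B), (not A,B), (not A,not B).
# p[i] is the probability of world i.

def neg_entropy(p):
    # Negated Shannon entropy; minimizing this maximizes entropy.
    q = np.clip(p, 1e-12, 1.0)
    return float(np.sum(q * np.log(q)))

def conditional_constraint(p):
    # Conditional (B|A)[0.9] as a linear constraint:
    # P(A and B) - 0.9 * P(A) = 0.
    p_ab = p[0]
    p_a = p[0] + p[1]
    return p_ab - 0.9 * p_a

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},  # normalization
    {"type": "eq", "fun": conditional_constraint},
]

result = minimize(
    neg_entropy,
    x0=np.full(4, 0.25),          # start from the uniform distribution
    bounds=[(0.0, 1.0)] * 4,
    constraints=constraints,
)
p_me = result.x                   # maximum-entropy completion of {(B|A)[0.9]}
```

The resulting distribution satisfies the stated conditional exactly and, because the knowledge base says nothing about $B$ when $A$ is false, treats the two $\lnot A$-worlds symmetrically. The learning direction discussed in the paper reverses this step: starting from a distribution like `p_me`, recover a compact conditional knowledge base that induces it.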