The uncertain reasoner's companion: a mathematical perspective
Representation and extraction of information by probabilistic logic
Information Systems
Term rewriting and all that
Reasoning about Uncertainty
Introduction to Statistical Relational Learning (Adaptive Computation and Machine Learning)
First-order probabilistic inference
IJCAI'03 Proceedings of the 18th international joint conference on Artificial intelligence
Conditionals in nonmonotonic reasoning and belief revision: considering conditionals as agents
Relational probabilistic conditional reasoning at maximum entropy
ECSQARU'11 Proceedings of the 11th European conference on Symbolic and quantitative approaches to reasoning with uncertainty
On lifted inference for a relational probabilistic conditional logic with maximum entropy semantics
FoIKS'12 Proceedings of the 7th international conference on Foundations of Information and Knowledge Systems
A major challenge in knowledge representation is expressing uncertain knowledge. One possibility is to combine logic and probability. In this paper, we investigate the logic FO-PCL, which uses first-order probabilistic conditionals to formulate uncertain knowledge. Reasoning in FO-PCL employs the principle of maximum entropy, applied in this context to the set of all ground instances of the conditionals in a knowledge base R. We formalize the syntactic criterion of FO-PCL interactions in R, which prevent computing the maximum entropy model at the level of conditionals rather than at the level of their ground instances. We develop a set of rules that transforms R into an equivalent knowledge base R′ without FO-PCL interactions.
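As a rough illustration of the grounding step the abstract refers to, the sketch below instantiates first-order conditionals over a finite domain of constants. The representation is hypothetical and much simpler than FO-PCL's actual syntax: a conditional (B | A)[p] is a tuple of antecedent and consequent strings with variable placeholders, and grounding is plain textual substitution.

```python
from itertools import product

def ground_conditionals(conditionals, domain):
    """Return all ground instances of each conditional over the given domain.

    `conditionals` is a list of (antecedent, consequent, probability, variables)
    tuples; each variable is substituted by every constant in `domain`.
    This is a toy model of grounding, not FO-PCL's real semantics.
    """
    grounded = []
    for antecedent, consequent, prob, variables in conditionals:
        # Enumerate every assignment of domain constants to the variables.
        for values in product(domain, repeat=len(variables)):
            g_ant, g_con = antecedent, consequent
            for var, val in zip(variables, values):
                g_ant = g_ant.replace(var, val)
                g_con = g_con.replace(var, val)
            grounded.append((g_ant, g_con, prob))
    return grounded

# Example: the conditional (flies(X) | bird(X))[0.9] over a two-element domain
kb = [("bird(X)", "flies(X)", 0.9, ["X"])]
print(ground_conditionals(kb, ["tweety", "sylvester"]))
```

The maximum entropy model is then computed subject to the constraints contributed by these ground instances; FO-PCL interactions are the syntactic situations in which this computation cannot instead be carried out once per conditional.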