The last few years have seen great interest in developing models that can describe large-scale structured real-world systems. A popular approach uses logic to describe the patterns or structure of a problem, and a calculus of probabilities to handle the uncertainty so often found in real-life situations. The CLP($\cal BN$) language is an extension of Prolog that supports the representation of, inference over, and learning of Bayesian networks. The language was inspired by Koller's Probabilistic Relational Models, and it is close to other probabilistic relational languages based on Prolog, such as Sato's PRISM. We present the implementation of CLP($\cal BN$), showing how Bayesian networks are represented in CLP($\cal BN$) and describing the implementation of three inference algorithms: Gibbs sampling, variable elimination, and junction trees. We show that these algorithms can be implemented effectively using a matrix library and a graph-manipulation library, and we study how the system performs on real-life applications.
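To make the inference step concrete, the following is a minimal, hypothetical Python sketch of variable elimination, one of the three algorithms mentioned above, on a toy two-node network A → B. The factor representation, variable names, and probability values are illustrative assumptions and do not reflect the actual CLP($\cal BN$) data structures.

```python
from itertools import product

# A factor is a pair (variables_tuple, table), where table maps tuples of
# Boolean values (one per variable) to probabilities. The numbers below are
# made up for illustration: P(A) and P(B | A).
pA = (("A",), {(0,): 0.6, (1,): 0.4})
pBgA = (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})

def multiply(f, g):
    """Pointwise product of two factors over the union of their variables."""
    fv, ft = f
    gv, gt = g
    out_vars = fv + tuple(v for v in gv if v not in fv)
    table = {}
    for vals in product((0, 1), repeat=len(out_vars)):
        asg = dict(zip(out_vars, vals))
        table[vals] = ft[tuple(asg[v] for v in fv)] * gt[tuple(asg[v] for v in gv)]
    return (out_vars, table)

def sum_out(f, var):
    """Marginalize (sum out) one variable from a factor."""
    fv, ft = f
    keep = tuple(v for v in fv if v != var)
    table = {}
    for vals, p in ft.items():
        key = tuple(v for v, name in zip(vals, fv) if name != var)
        table[key] = table.get(key, 0.0) + p
    return (keep, table)

# Eliminate A to obtain the marginal P(B):
# P(B=0) = 0.6*0.9 + 0.4*0.2 = 0.62, P(B=1) = 0.6*0.1 + 0.4*0.8 = 0.38.
pB = sum_out(multiply(pA, pBgA), "A")
print(pB)
```

A real implementation, as the abstract notes, would represent factor tables with a matrix library and choose an elimination order over a graph structure rather than hard-coding it.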