Compiling Bayesian networks (BNs) is one of the most effective approaches to exact inference because the underlying logical representation enables the exploitation of local structure in BNs (i.e., determinism and context-specific independence). In this paper, a new parameter learning method based on BN compilation is proposed. First, a target BN together with multiple evidence sets is compiled into a single shared binary decision diagram (SBDD), which shares common sub-graphs among the multiple BDDs. Second, all conditional expectations required by the EM algorithm are computed simultaneously on the SBDD, with common local probabilities and expectations shared among them. Owing to these two levels of sharing, the proposed method is computationally more efficient than an EM algorithm that naively uses an existing BN compiler for exact inference.
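The two-pass computation described above can be illustrated with a minimal sketch. The node layout, function names, and parameterization below are hypothetical (independent Bernoulli parameters per Boolean variable, one BDD root per evidence set), not the paper's actual implementation; the point is how a single identity-keyed cache lets the bottom-up probabilities be shared across all roots of the SBDD.

```python
# Hypothetical sketch, not the paper's implementation: internal nodes carry
# a Boolean variable and lo/hi children, terminals are the strings "T"/"F".

TRUE, FALSE = "T", "F"

class Node:
    __slots__ = ("var", "lo", "hi")
    def __init__(self, var, lo, hi):
        self.var, self.lo, self.hi = var, lo, hi

def backward(node, theta, cache):
    """Bottom-up pass: probability of reaching the 1-terminal.
    The cache is keyed by node identity, so it is shared across all
    roots of the SBDD -- each shared sub-graph is evaluated only once."""
    if node is TRUE:
        return 1.0
    if node is FALSE:
        return 0.0
    if id(node) not in cache:
        p = theta[node.var]
        cache[id(node)] = ((1.0 - p) * backward(node.lo, theta, cache)
                           + p * backward(node.hi, theta, cache))
    return cache[id(node)]

def _reverse_postorder(root):
    """Topological order (parents before children) of internal nodes."""
    order, seen = [], set()
    def dfs(n):
        if n in (TRUE, FALSE) or id(n) in seen:
            return
        seen.add(id(n))
        dfs(n.lo)
        dfs(n.hi)
        order.append(n)
    dfs(root)
    return list(reversed(order))

def e_step(roots, theta):
    """Accumulate expected 0/1 counts per variable over all evidence sets
    (one BDD root per evidence set).  For brevity this sketch only counts
    variables explicitly tested on a path; the fractional counts for
    variables skipped by BDD reduction are omitted."""
    bwd = {}                                # shared across all roots
    counts = {v: [0.0, 0.0] for v in theta}
    for root in roots:
        z = backward(root, theta, bwd)      # P(evidence) for this root
        fwd = {id(root): 1.0}               # top-down path probabilities
        for n in _reverse_postorder(root):
            f = fwd.get(id(n), 0.0)
            p = theta[n.var]
            counts[n.var][0] += f * (1.0 - p) * backward(n.lo, theta, bwd) / z
            counts[n.var][1] += f * p * backward(n.hi, theta, bwd) / z
            for child, w in ((n.lo, f * (1.0 - p)), (n.hi, f * p)):
                if child not in (TRUE, FALSE):
                    fwd[id(child)] = fwd.get(id(child), 0.0) + w
    return counts
```

For example, with `theta = {"x": 0.3, "y": 0.5}` and the BDD for the evidence `x OR y` (root `Node("x", Node("y", FALSE, TRUE), TRUE)`), `backward` yields P(evidence) = 0.65 and `e_step` gives the expected count of x = 1 as 0.3/0.65, i.e., the conditional expectation given the evidence; an M-step would then renormalize these accumulated counts into new parameters.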