Efficient Approximations for the Marginal Likelihood of Bayesian Networks with Hidden Variables
Machine Learning - Special issue on learning with probabilistic representations
Learning Bayesian networks from data: an information-theory based approach
Artificial Intelligence
Bayesian Networks for Data Mining
Data Mining and Knowledge Discovery
PAKDD '01 Proceedings of the 5th Pacific-Asia Conference on Knowledge Discovery and Data Mining
Learning Bayesian Networks from Incomplete Data Based on EMI Method
ICDM '03 Proceedings of the Third IEEE International Conference on Data Mining
Learning Bayesian networks from incomplete data with stochastic search algorithms
UAI'99 Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence
The Bayesian structural EM algorithm
UAI'98 Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence
Robust independence testing for constraint-based learning of causal structure
UAI'03 Proceedings of the Nineteenth Conference on Uncertainty in Artificial Intelligence
Review: Learning Bayesian networks: approaches and issues
The Knowledge Engineering Review
At present, most algorithms for learning Bayesian networks (BNs) use the EM algorithm to handle incomplete data. These algorithms are inefficient because EM must perform an iterative process of probabilistic inference to complete the incomplete data. In this paper we present an efficient BN learning algorithm that combines the EMI method with a scoring function based on mutual information theory. The algorithm first uses the EMI method to estimate, from incomplete data, probability distributions over local structures of BNs, and then evaluates BN structures with the scoring function and searches for the best one. The detailed procedure of the algorithm is described in the paper. Experimental results on the Asia and Alarm networks show that, while achieving high accuracy, the algorithm is much more efficient than two EM-based algorithms, SEM and EM-EA.
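The core idea of the abstract — scoring candidate structures by the mutual information between nodes and their parents — can be illustrated with a minimal sketch. This is a hypothetical single-parent illustration, not the paper's actual algorithm: the paper's EMI method would estimate these local distributions from incomplete data, whereas here we assume complete data for simplicity, and the function names (`mutual_information`, `score_structure`) are invented for this example.

```python
import math
from collections import Counter


def mutual_information(data, x, y):
    """Empirical mutual information I(X;Y) between columns x and y of data."""
    n = len(data)
    pxy = Counter((row[x], row[y]) for row in data)  # joint counts
    px = Counter(row[x] for row in data)             # marginal counts of X
    py = Counter(row[y] for row in data)             # marginal counts of Y
    mi = 0.0
    for (a, b), c in pxy.items():
        # (c/n) * log( p(a,b) / (p(a) p(b)) ), in nats
        mi += (c / n) * math.log((c * n) / (px[a] * py[b]))
    return mi


def score_structure(data, parents):
    """Score a structure as the sum of I(child; parent) over all edges.

    `parents` maps a child column index to a list of parent column indices
    (single-parent sketch; the paper's scoring function is more general).
    """
    return sum(mutual_information(data, child, p)
               for child, ps in parents.items() for p in ps)


# Tiny example: column 0 (X) determines column 1 (Y); column 2 (Z) is noise.
data = [(0, 0, 0), (0, 0, 1), (1, 1, 0), (1, 1, 1), (0, 0, 0), (1, 1, 1)]
s_good = score_structure(data, {1: [0]})  # candidate edge X -> Y
s_bad = score_structure(data, {1: [2]})   # candidate edge Z -> Y
```

A search procedure would then prefer the structure with the higher score; in this example the informative edge X → Y scores above the spurious edge Z → Y.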