Dependency analysis is a typical approach to Bayesian network learning: it infers the structure of a Bayesian network from the results of a series of conditional independence (CI) tests. In practice, testing independence conditioned on large sets hampers dependency analysis algorithms in both accuracy and running time, for two reasons. First, with limited samples, independence tests conditioned on large sets of variables are unstable. Second, for most dependency analysis algorithms, the number of CI tests grows exponentially with the size of the conditioning sets, and the running time grows at the same rate. Reducing the number of CI tests and the sizes of the conditioning sets is therefore a critical step in dependency analysis algorithms. In this article, we propose a two-phase algorithm based on the observation that the structure of a Markov random field is similar to that of a Bayesian network. The first phase constructs a Markov random field from the data, which closely approximates the structure of the true Bayesian network; the second phase removes redundant edges according to CI tests to recover the true Bayesian network. Both phases use Markov blanket information to reduce the sizes of the conditioning sets and the number of CI tests without sacrificing accuracy. An empirical study shows that the two-phase algorithm performs well in terms of accuracy and efficiency.
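The two-phase idea can be illustrated with a minimal sketch. The code below is not the paper's algorithm; it is a simplified stand-in that assumes linear-Gaussian data and uses a partial-correlation CI test (Fisher's z). Phase 1 builds an undirected approximation from marginal independence tests; phase 2 tries to remove each remaining edge by conditioning only on small subsets drawn from the endpoints' neighbor sets (a rough Markov blanket proxy), keeping the conditioning sets small as the abstract describes.

```python
import itertools
import numpy as np

def partial_corr_indep(data, i, j, cond, z_crit=1.96):
    """CI test: is X_i independent of X_j given X_cond?

    Uses the partial correlation computed from the inverse correlation
    matrix and Fisher's z-transform (valid under a linear-Gaussian
    assumption). z_crit = 1.96 corresponds to alpha = 0.05.
    """
    idx = [i, j] + list(cond)
    corr = np.corrcoef(data[:, idx], rowvar=False)
    prec = np.linalg.pinv(corr)
    r = -prec[0, 1] / np.sqrt(prec[0, 0] * prec[1, 1])
    r = np.clip(r, -0.999999, 0.999999)
    n = data.shape[0]
    z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - len(cond) - 3)
    return abs(z) < z_crit  # fail to reject independence

def two_phase_skeleton(data, max_cond=2):
    """Toy two-phase structure learner (skeleton only, no orientation)."""
    n_vars = data.shape[1]

    # Phase 1: approximate Markov-random-field structure by keeping an
    # edge wherever the pair is marginally dependent.
    edges = {(i, j) for i in range(n_vars) for j in range(i + 1, n_vars)
             if not partial_corr_indep(data, i, j, [])}
    neighbors = {v: set() for v in range(n_vars)}
    for i, j in edges:
        neighbors[i].add(j)
        neighbors[j].add(i)

    # Phase 2: remove redundant edges. Conditioning sets are restricted
    # to neighbors of the two endpoints (a Markov blanket surrogate),
    # which keeps both the number and the size of CI tests small.
    for i, j in sorted(edges):
        blanket = (neighbors[i] | neighbors[j]) - {i, j}
        removed = False
        for k in range(1, max_cond + 1):
            for cond in itertools.combinations(sorted(blanket), k):
                if partial_corr_indep(data, i, j, cond):
                    edges.discard((i, j))
                    neighbors[i].discard(j)
                    neighbors[j].discard(i)
                    removed = True
                    break
            if removed:
                break
    return edges
```

For example, on data generated from the chain X0 -> X1 -> X2, phase 1 connects all three pairs (X0 and X2 are marginally dependent), and phase 2 deletes the X0-X2 edge once the test conditions on X1. Real dependency analysis algorithms add edge orientation and more careful blanket discovery on top of this skeleton step.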