Understanding the scalability of Bayesian network inference using clique tree growth curves
Artificial Intelligence
This article presents and analyzes algorithms that systematically generate random Bayesian networks of varying difficulty, where difficulty is measured with respect to inference using tree clustering. The results are relevant to research on efficient Bayesian network inference, such as computing a most probable explanation or belief updating, because they enable controlled experiments that measure the impact of improvements to inference algorithms. They are also relevant to research on machine learning of Bayesian networks, since they support the controlled generation of large numbers of data sets at a given difficulty level. Our generation algorithms, called BPART and MPART, support the controlled but random construction of bipartite and multipartite Bayesian networks, respectively. The parameters we vary are the total number of nodes, the degree of connectivity, the ratio of non-root to root nodes, the regularity of the underlying graph, and the characteristics of the conditional probability tables. The main dependent parameter is the size of the maximal clique produced by tree clustering. The article presents extensive empirical analysis using the Hugin tree clustering approach, as well as theoretical analysis of the random generation of Bayesian networks using BPART and MPART.
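To make the setup concrete, the following Python sketch illustrates the general idea, not the article's exact BPART algorithm: a bipartite network is drawn by giving each non-root node a fixed number of randomly chosen root parents, and the resulting difficulty is then gauged by the maximal clique size of the moralized, triangulated graph, here estimated with the standard min-fill elimination heuristic as a stand-in for full tree clustering. The function names (`bpart_like`, `max_clique_size`) and the specific parameter choices are illustrative assumptions, not names from the article.

```python
import random

def bpart_like(n_roots, n_leaves, k_parents, seed=None):
    """Illustrative bipartite generator (not the published BPART):
    each of n_leaves non-root nodes gets k_parents distinct root
    parents, chosen uniformly at random. Returns a dict mapping
    each leaf node id to its sorted list of parent ids."""
    rng = random.Random(seed)
    roots = list(range(n_roots))
    return {leaf: sorted(rng.sample(roots, k_parents))
            for leaf in range(n_roots, n_roots + n_leaves)}

def max_clique_size(structure, n_roots):
    """Estimate the maximal clique size of the moralized and
    triangulated graph via min-fill elimination, a common proxy
    for the cost of tree-clustering inference."""
    # Moralize: link each leaf to its parents and marry co-parents.
    adj = {v: set() for v in range(n_roots)}
    for leaf, parents in structure.items():
        adj[leaf] = set(parents)
        for p in parents:
            adj[p].add(leaf)
        for i, p in enumerate(parents):
            for q in parents[i + 1:]:
                adj[p].add(q)
                adj[q].add(p)
    best = 1
    remaining = set(adj)
    while remaining:
        # Eliminate the vertex whose removal adds the fewest fill edges.
        def fill(v):
            nb = adj[v] & remaining
            return sum(1 for a in nb for b in nb
                       if a < b and b not in adj[a])
        v = min(remaining, key=fill)
        nb = adj[v] & remaining
        best = max(best, len(nb) + 1)   # clique = v plus its neighbors
        for a in nb:                    # triangulate: clique-ify neighbors
            for b in nb:
                if a != b:
                    adj[a].add(b)
        remaining.remove(v)
    return best
```

In this sketch, sweeping `k_parents` (connectivity) or the leaf-to-root ratio while holding the total node count fixed mimics the kind of controlled parameter variation the article studies, with `max_clique_size` as the dependent quantity.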