This paper analyzes the problem of learning the structure of a Bayes net (BN) in the theoretical framework of Gold's learning paradigm. Bayes nets are one of the most prominent formalisms for knowledge representation and for probabilistic and causal reasoning. We follow constraint-based approaches to learning Bayes net structure, where learning is based on observed conditional dependencies between variables of interest (e.g., "X is dependent on Y given any assignment to variable Z"). Applying the learning criteria in this model leads to the following results. (1) The mind change complexity of identifying a Bayes net graph over a variable set V from dependency data is (|V| choose 2), the maximum number of edges. (2) There is a unique fastest mind-change optimal Bayes net learner, where convergence speed is evaluated using Gold's dominance notion of "uniformly faster convergence". This learner conjectures a graph if it is the unique Bayes net pattern that satisfies the observed dependencies with a minimum number of edges, and outputs "no guess" otherwise. Thus standard learning criteria give rise to a natural and novel Bayes net learning algorithm. We investigate the complexity of computing the output of the fastest mind-change optimal learner, and show that this problem is NP-hard (assuming P ≠ RP). To our knowledge this is the first NP-hardness result concerning the existence of a uniquely optimal Bayes net structure.
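The behavior of the fastest mind-change optimal learner can be illustrated with a toy sketch. The paper's actual learner works with Bayes net patterns, conditional dependencies, and d-separation; the simplified version below works with unconditional dependencies over an undirected skeleton, using path connectivity as the satisfaction criterion. The function name and the representation of dependencies (a set of unordered pairs) are invented here for illustration only.

```python
from itertools import combinations

def connected(edges, x, y, nodes):
    """Return True iff x and y are joined by a path in the undirected graph."""
    adj = {v: set() for v in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, stack = {x}, [x]
    while stack:
        v = stack.pop()
        if v == y:
            return True
        for w in adj[v] - seen:
            seen.add(w)
            stack.append(w)
    return False

def fastest_mind_change_optimal_learner(nodes, dependencies):
    """Toy analogue of the fastest mind-change optimal learner.

    dependencies: set of frozensets {x, y} marking the variable pairs
    observed to be dependent (conditioning sets are omitted in this toy).
    Conjecture the graph only if it is the unique minimum-edge graph whose
    connectivity matches the observed dependencies; else output "no guess".
    """
    all_pairs = list(combinations(sorted(nodes), 2))
    for k in range(len(all_pairs) + 1):  # try the smallest edge count first
        candidates = []
        for edges in combinations(all_pairs, k):
            if all(connected(edges, x, y, nodes) == (frozenset((x, y)) in dependencies)
                   for x, y in all_pairs):
                candidates.append(set(edges))
        if candidates:  # minimum-edge level reached; demand uniqueness
            return candidates[0] if len(candidates) == 1 else "no guess"
    return "no guess"  # no graph is consistent with the observations
```

The uniqueness requirement shows why "no guess" is common early on: if all three pairs over {X, Y, Z} are observed dependent, any two-edge path fits with the minimum number of edges, so the learner withholds its conjecture rather than risk an extra mind change.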