This paper analyzes the problem of learning the structure of a Bayes net in the theoretical framework of Gold's learning paradigm. Bayes nets are one of the most prominent formalisms for knowledge representation and for probabilistic and causal reasoning. We follow constraint-based approaches to learning Bayes net structure, where learning is based on observed conditional dependencies and independencies between variables of interest (e.g., the data are of the form "X is dependent on Y given any assignment to the variable set S" or of the form "X is independent of Y given any assignment to the variable set S"). Applying learning criteria in this model leads to the following results. (1) The mind change complexity of identifying a Bayes net graph over a variable set V, from either dependency data or independency data, is $\binom{|V|}{2}$, the maximum number of edges. (2) There is a unique fastest mind-change optimal Bayes net learner for each data type, where convergence speed is evaluated using Gold's dominance notion of "uniformly faster convergence". For dependency data, the optimal learner conjectures a graph if it is the unique Bayes net pattern that satisfies the observed dependencies with a minimum number of edges, and outputs "no guess" otherwise. For independency data, the optimal learner conjectures a graph if it is the unique Bayes net pattern that satisfies the observed independencies with a maximum number of edges, and outputs "no guess" otherwise. We investigate the complexity of computing the output of the fastest mind-change optimal learner for either data type, and show that each of these two problems is NP-hard (assuming NP ≠ RP). To our knowledge, these are the first NP-hardness results concerning the existence of a uniquely optimal Bayes net structure.
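To make the dependency-data learner concrete, here is a minimal brute-force sketch in Python. It is our illustration, not the paper's algorithm: it assumes d-connection can be tested by the standard moralized-ancestral-graph criterion and that Markov equivalence classes ("patterns") are characterized by skeleton plus unshielded colliders (Verma-Pearl); the names all_dags, d_connected, and optimal_learner are ours, not the paper's.

from itertools import combinations, product

def is_acyclic(nodes, edges):
    """Kahn's algorithm: True iff the directed edge set has no cycle."""
    indeg = {n: 0 for n in nodes}
    for _, v in edges:
        indeg[v] += 1
    stack = [n for n in nodes if indeg[n] == 0]
    seen = 0
    while stack:
        u = stack.pop()
        seen += 1
        for a, b in edges:
            if a == u:
                indeg[b] -= 1
                if indeg[b] == 0:
                    stack.append(b)
    return seen == len(nodes)

def all_dags(nodes):
    """Yield every DAG over `nodes` as a frozenset of directed edges (u, v)."""
    pairs = list(combinations(nodes, 2))
    for choice in product((None, 0, 1), repeat=len(pairs)):
        edges = frozenset((u, v) if c == 0 else (v, u)
                          for (u, v), c in zip(pairs, choice) if c is not None)
        if is_acyclic(nodes, edges):
            yield edges

def d_connected(edges, x, y, s):
    """Test whether x and y are d-connected given s: moralize the ancestral
    subgraph of {x, y} and s, then look for an x-y path avoiding s."""
    anc = {x, y} | set(s)
    while True:                                   # ancestral closure
        grow = {u for (u, v) in edges if v in anc} - anc
        if not grow:
            break
        anc |= grow
    sub = [e for e in edges if e[0] in anc and e[1] in anc]
    und = {frozenset(e) for e in sub}             # drop edge directions
    for (a, c1), (b, c2) in product(sub, sub):    # marry co-parents
        if c1 == c2 and a != b:
            und.add(frozenset((a, b)))
    frontier, blocked = [x], {x} | set(s)         # BFS that never enters s
    while frontier:
        u = frontier.pop()
        for e in und:
            if u in e:
                (w,) = e - {u}
                if w == y:
                    return True
                if w not in blocked:
                    blocked.add(w)
                    frontier.append(w)
    return False

def pattern(edges):
    """Markov equivalence class: skeleton plus unshielded colliders."""
    skel = frozenset(frozenset(e) for e in edges)
    colliders = frozenset((frozenset((a, b)), c1)
                          for (a, c1), (b, c2) in product(edges, edges)
                          if c1 == c2 and a < b
                          and frozenset((a, b)) not in skel)
    return skel, colliders

def optimal_learner(nodes, dependencies):
    """Conjecture the unique minimum-edge pattern whose DAGs d-connect every
    observed dependency statement (x, y, s); output 'no guess' otherwise."""
    sizes = {}
    for dag in all_dags(nodes):
        if all(d_connected(dag, x, y, s) for x, y, s in dependencies):
            sizes[pattern(dag)] = len(dag)
    if not sizes:
        return "no guess"
    fewest = min(sizes.values())
    best = [p for p, k in sizes.items() if k == fewest]
    return best[0] if len(best) == 1 else "no guess"

V = ("X", "Y", "Z")
deps = [("X", "Y", ()), ("Y", "Z", ()), ("X", "Z", ("Y",)),
        ("Y", "Z", ("X",)), ("X", "Y", ("Z",))]
print(optimal_learner(V, deps))      # unique minimal pattern: X -> Y <- Z
print(optimal_learner(V, deps[:2]))  # "no guess": several minimal patterns fit

The exhaustive enumeration of DAGs is exponential in |V| and only feasible for toy variable sets; the point of the sketch is to make the "unique minimum-edge pattern, else no guess" decision rule executable on small examples such as the collider above, which is consistent with the paper's NP-hardness result for computing this learner's output.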