We study the computational and sample complexity of parameter and structure learning in graphical models. Our main result shows that the class of factor graphs with bounded degree can be learned in polynomial time and from a polynomial number of training examples, assuming that the data is generated by a network in this class. This result covers both parameter estimation for a known network structure and structure learning. It implies as a corollary that we can learn factor graphs for both Bayesian networks and Markov networks of bounded degree, in polynomial time and sample complexity. Importantly, unlike standard maximum likelihood estimation algorithms, our method does not require inference in the underlying network, and so applies to networks where inference is intractable. We also show that the error of our learned model degrades gracefully when the generating distribution is not a member of the target class of networks. In addition to our main result, we show that the sample complexity of parameter learning in graphical models has an O(1) dependence on the number of variables in the model when using the KL-divergence normalized by the number of variables as the performance criterion.
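For concreteness, the performance criterion in the last sentence can be written out explicitly. Under the standard definitions (the notation here is ours, not quoted from the paper), a factor graph over variables $X_1, \dots, X_n$ defines the distribution

\[
P(x) \;=\; \frac{1}{Z} \prod_{j} f_j\big(x_{C_j}\big),
\]

where each factor $f_j$ depends only on a subset $C_j$ of the variables and $Z$ is the partition function; bounded degree means, roughly, that each variable participates in a bounded number of factors of bounded scope. The criterion is then the KL divergence per variable between the generating distribution $P^*$ and the learned model $\hat{P}$:

\[
\frac{1}{n}\, D\big(P^* \,\|\, \hat{P}\big) \;=\; \frac{1}{n} \sum_{x} P^*(x) \log \frac{P^*(x)}{\hat{P}(x)},
\]

and the $O(1)$ claim says that the number of samples needed to drive this quantity below a fixed $\epsilon$ does not grow with $n$.

The following minimal sketch illustrates that claim in the simplest special case, a fully factored (product) distribution over independent binary variables, where the joint KL decomposes into a sum of per-coordinate terms. It illustrates the performance criterion only, not the paper's learning algorithm; the parameter ranges and sample size are arbitrary choices for the demonstration.

```python
import numpy as np

def normalized_kl_product(p_true, p_hat):
    """Per-variable KL divergence D(P* || P-hat) / n between two
    product-of-Bernoulli distributions. For product distributions the
    joint KL decomposes coordinate-wise, so no 2^n enumeration is needed."""
    p, q = np.asarray(p_true), np.asarray(p_hat)
    per_coord = p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))
    return per_coord.mean()  # (1/n) * sum of per-coordinate KLs

rng = np.random.default_rng(0)
m = 500                                   # fixed number of training samples
for n in (10, 100, 1000):                 # growing number of variables
    p_true = rng.uniform(0.2, 0.8, n)     # hypothetical generating parameters
    data = rng.random((m, n)) < p_true    # m i.i.d. samples from the product
    p_hat = (data.sum(0) + 1) / (m + 2)   # Laplace-smoothed frequency estimates
    print(n, normalized_kl_product(p_true, p_hat))
```

With the sample size held fixed, the printed per-variable divergence stays roughly constant as $n$ grows, which is the $O(1)$ dependence the abstract describes for this toy class; the paper establishes it for bounded-degree factor graphs in general.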