Elements of information theory
Nonlinear Markov networks for continuous variables
NIPS '97 Proceedings of the 1997 conference on Advances in neural information processing systems 10
Dependency networks for inference, collaborative filtering, and data visualization
The Journal of Machine Learning Research
Learning Factor Graphs in Polynomial Time and Sample Complexity
The Journal of Machine Learning Research
Using modified Lasso regression to learn large undirected graphs in a probabilistic framework
AAAI'05 Proceedings of the 20th national conference on Artificial intelligence - Volume 2
Learning graphical model structure using L1-regularization paths
AAAI'07 Proceedings of the 22nd national conference on Artificial intelligence - Volume 2
Learning Bayesian network structure from massive datasets: the "sparse candidate" algorithm
UAI'99 Proceedings of the Fifteenth conference on Uncertainty in artificial intelligence
In many real-world domains, undirected graphical models such as Markov random fields provide a more natural representation of the statistical dependency structure than directed graphical models. Unfortunately, structure learning of undirected graphs using likelihood-based scores remains difficult because computing the partition function is intractable. We describe a new Markov random field structure learning algorithm, motivated by the canonical parameterization of Abbeel et al. We improve the computational cost of their parameterization by learning per-variable canonical factors, which makes our algorithm suitable for domains with hundreds of nodes. We compare our algorithm against several algorithms for learning undirected and directed models on simulated and real datasets from biology. Our algorithm frequently outperforms existing algorithms, producing higher-quality structures; this suggests that enforcing consistency during structure learning is beneficial for learning undirected graphs.
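The per-variable idea in the abstract can be illustrated with a hedged sketch. This is not the authors' algorithm: it is a generic pseudo-likelihood-style structure learner in the same spirit, which fits an L1-regularized logistic regression for each variable given the rest and takes the symmetric union of the recovered neighborhoods. All function names, hyperparameters, and the toy chain data below are illustrative assumptions.

```python
import numpy as np

def l1_logistic(X, y, lam=0.05, lr=0.1, iters=500):
    """L1-regularized logistic regression fit by proximal gradient (ISTA).
    The soft-thresholding step drives irrelevant coefficients exactly to zero."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))      # predicted probabilities
        grad = X.T @ (p - y) / n                # gradient of the logistic loss
        w = w - lr * grad
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # prox of L1
    return w

def learn_structure(X, lam=0.05):
    """Estimate an undirected graph: regress each variable on all the others,
    then add an edge wherever either regression selects the other variable."""
    n, d = X.shape
    adj = np.zeros((d, d), dtype=bool)
    for j in range(d):
        others = [k for k in range(d) if k != j]
        w = l1_logistic(X[:, others], X[:, j], lam=lam)
        for idx, k in enumerate(others):
            if w[idx] != 0.0:
                adj[j, k] = True
    return adj | adj.T  # symmetrize the neighborhoods

# Toy binary data from a chain x0 - x1 - x2: each variable copies its
# predecessor with probability 0.9 (purely illustrative).
rng = np.random.default_rng(0)
n, d = 2000, 3
X = np.zeros((n, d))
X[:, 0] = rng.integers(0, 2, n)
for j in range(1, d):
    flip = rng.random(n) < 0.1
    X[:, j] = np.where(flip, 1 - X[:, j - 1], X[:, j - 1])

A = learn_structure(X)
print(A.astype(int))
```

The L1 penalty plays the same role as in the Lasso-based and L1-regularization-path papers cited above: it selects a sparse conditional neighborhood for each variable, so the whole procedure scales to domains with many nodes at the cost of heuristic consistency between the per-variable models.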