In this paper we address the problem of learning the structure of a Bayesian network in domains with continuous variables. This task requires a procedure for comparing candidate structures. In the Bayesian framework, this is done by evaluating the marginal likelihood of the data given a candidate structure. This term can be computed in closed form for standard parametric families (e.g., Gaussians) and can be approximated, at some computational cost, for some semiparametric families (e.g., mixtures of Gaussians). We present a new family of continuous-variable probabilistic networks based on Gaussian Process priors. These priors are semiparametric in nature and can capture almost arbitrary noisy functional relations. Using these priors, we can compute marginal likelihoods for structure learning directly. The resulting method can discover a wide range of functional dependencies in multivariate data. We develop the Bayesian score for Gaussian Process Networks and describe how to learn them from data. We present empirical results on artificial data as well as on real-life domains with non-linear dependencies.
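To make the scoring idea concrete, below is a minimal sketch of how such a Bayesian score decomposes into per-family Gaussian Process marginal likelihoods. It assumes a zero-mean GP prior with a squared-exponential (RBF) kernel and Gaussian observation noise, with hyperparameters held fixed; the function names, the fixed hyperparameters, and the treatment of parentless nodes are illustrative simplifications, not the paper's exact construction.

import numpy as np

def rbf_kernel(X, lengthscale=1.0, signal_var=1.0):
    """Squared-exponential kernel matrix for an (n, d) input matrix X."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return signal_var * np.exp(-0.5 * sq_dists / lengthscale ** 2)

def gp_log_marginal_likelihood(X, y, noise_var=0.1):
    """log N(y | 0, K + noise_var * I), via a Cholesky factorization."""
    n = len(y)
    K = rbf_kernel(X) + noise_var * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # (K + noise I)^-1 y
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))        # -0.5 * log|K + noise I|
            - 0.5 * n * np.log(2.0 * np.pi))

def network_score(data, structure, noise_var=0.1):
    """Decomposable score: one GP marginal likelihood per family.

    `structure` maps each variable index to a tuple of parent indices;
    it is assumed to describe an acyclic graph (not checked here).
    """
    score = 0.0
    for child, parents in structure.items():
        y = data[:, child]
        if parents:
            score += gp_log_marginal_likelihood(data[:, list(parents)], y, noise_var)
        else:
            # Parentless node: scored under N(0, (signal_var + noise_var) I),
            # a simplification standing in for the GP prior's marginal.
            var = 1.0 + noise_var
            score += -0.5 * (len(y) * np.log(2.0 * np.pi * var) + y @ y / var)
    return score

# Toy comparison on data with a non-linear dependency X0 -> X1.
rng = np.random.default_rng(0)
x0 = rng.normal(size=200)
x1 = np.sin(2.0 * x0) + 0.1 * rng.normal(size=200)
data = np.column_stack([x0, x1])

print(network_score(data, {0: (), 1: (0,)}))  # X0 -> X1: typically scores higher
print(network_score(data, {0: (), 1: ()}))    # empty graph

In practice this family score would drive a search over parent sets, and the kernel hyperparameters would be optimized or marginalized rather than fixed; they are hard-coded here only to keep the sketch short.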