Probabilistic Decision Graphs (PDGs) are probabilistic graphical models that represent a factorisation of a discrete joint probability distribution using a "decision graph"-like structure over local marginal parameters. The structure of a PDG enables the model to capture context-specific independence relations that are not representable in the structure of more commonly used graphical models such as Bayesian networks and Markov networks, which sometimes makes operations in PDGs more efficient than in alternative models. PDGs have previously been defined only in the discrete case, assuming a multinomial joint distribution over the variables in the model. We extend PDGs to incorporate continuous variables by assuming a Conditional Gaussian (CG) joint distribution, and we show how inference can be carried out efficiently in the resulting models.
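The Conditional Gaussian assumption mentioned above means that each continuous variable follows a Gaussian whose parameters depend on the configuration of its discrete ancestors. The following is a minimal, self-contained sketch of this idea for a single discrete parent and one continuous child; the variable names, states, and parameter values are illustrative assumptions, not taken from the paper.

```python
import random
import statistics

# Hypothetical CG fragment: discrete parent X with two states, and a
# continuous child Y whose Gaussian parameters depend on the state of X.

# P(X): marginal distribution of the discrete parent (illustrative values).
p_x = {"a": 0.3, "b": 0.7}

# P(Y | X = x): (mean, standard deviation) of Y for each state of X.
cg_params = {"a": (0.0, 1.0), "b": (5.0, 2.0)}

def sample_joint(rng):
    """Draw one sample (x, y) from the CG joint distribution P(X) P(Y | X)."""
    x = rng.choices(list(p_x), weights=list(p_x.values()))[0]
    mean, std = cg_params[x]
    return x, rng.gauss(mean, std)

rng = random.Random(42)
samples = [sample_joint(rng) for _ in range(20000)]

# The marginal of Y is a mixture of Gaussians, so its mean is the
# probability-weighted combination of the conditional means.
expected_mean = sum(p_x[x] * cg_params[x][0] for x in p_x)  # 0.3*0 + 0.7*5
empirical_mean = statistics.fmean(y for _, y in samples)
print(f"expected mean of Y: {expected_mean}")
print(f"empirical mean of Y: {empirical_mean:.2f}")
```

Note that marginally Y is a *mixture* of Gaussians rather than a single Gaussian; it is only Gaussian conditional on each discrete configuration, which is what makes exact inference in CG models possible while still allowing multimodal marginals.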