Causal independence modelling is a well-known method for reducing the size of probability tables, simplifying probabilistic inference, and explaining the underlying mechanisms in Bayesian networks. Recently, a generalization of the widely used noisy OR and noisy AND models was proposed: causal independence models based on symmetric Boolean functions. In this paper, we study the problem of learning the parameters of these models, referred to in the following as symmetric causal independence models. We present a computationally efficient EM algorithm for learning the parameters of symmetric causal independence models, in which the computational scheme of the Poisson binomial distribution is used to compute the conditional probabilities in the E-step. We analyse the computational complexity and convergence of the developed algorithm. The presented EM algorithm allows us to assess the practical usefulness of symmetric causal independence models. In this assessment, the models are applied to a classification task, where they perform competitively with state-of-the-art classifiers.
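To make the E-step idea concrete, the following is a minimal sketch (not the authors' implementation; the function name is illustrative) of the standard dynamic-programming scheme for the Poisson binomial distribution: the distribution of the number of independent Bernoulli causes, with possibly different success probabilities, that fire. In a symmetric causal independence model the effect depends on the hidden cause variables only through this count, which is why these probabilities suffice for computing the required conditional probabilities.

```python
def poisson_binomial_pmf(probs):
    """Return [P(K = 0), ..., P(K = n)] for the number K of successes
    among n independent Bernoulli trials with success probabilities `probs`."""
    pmf = [1.0]  # distribution of the count after considering zero causes
    for p in probs:
        new = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            new[k] += mass * (1.0 - p)   # this cause does not fire
            new[k + 1] += mass * p       # this cause fires
        pmf = new
    return pmf
```

Each pass over a cause variable updates the count distribution in O(n) time, so the full scheme runs in O(n^2) for n causes, avoiding the exponential enumeration of all 2^n cause configurations.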