I discuss an application of a family of Bayesian network models, known as models of independence of causal influence (ICI), to classification tasks with a large number of attributes. An example of such a task is the categorization of text documents, in which the attributes are the individual words occurring in the documents. The key property that makes ICI models applicable here is their compact representation using a hidden variable. Learning these classifiers with a computationally efficient implementation of the EM algorithm is addressed. Special attention is paid to the noisy-or model, probably the best-known example of an ICI model. Classification with the noisy-or model corresponds to a statistical method known as logistic discrimination, and this correspondence is described. Tests of the noisy-or classifier on the Reuters data set show that, despite its simplicity, it achieves competitive performance. © 2006 Wiley Periodicals, Inc. Int J Int Syst 21: 381–398, 2006.
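To make the central model concrete, the following is a minimal sketch of the noisy-or class posterior for binary attributes (e.g., word occurrences in a document). It implements only the standard noisy-or combination P(class = 1 | x) = 1 - (1 - leak) * prod_i (1 - p_i)^{x_i}; the parameter names (`p`, `leak`) and the interface are illustrative assumptions, not notation or code from the paper, and the EM-based learning of these parameters is not shown.

```python
import numpy as np

def noisy_or_posterior(x, p, leak):
    """P(class = 1 | x) under a noisy-or combination of the active attributes.

    x    : binary attribute vector (1 = word present in the document)
    p    : p[i] is the probability that attribute i alone triggers the class
    leak : background probability that the class is 1 with no attribute active
    """
    # Probability that neither the leak nor any active attribute triggers the class
    prob_not_triggered = (1.0 - leak) * np.prod((1.0 - p) ** x)
    return 1.0 - prob_not_triggered

# Toy example with three binary attributes
x = np.array([1, 0, 1])
p = np.array([0.3, 0.6, 0.2])
print(noisy_or_posterior(x, p, leak=0.05))
```

Note that the logarithm of the "not triggered" probability, log(1 - leak) + sum_i x_i * log(1 - p_i), is linear in the attributes; this log-linear structure is one way to see why the noisy-or classifier is related to logistic discrimination, though the precise correspondence is the one worked out in the paper.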