Using Bayesian networks (BNs) for classification has received significant attention because BNs can encode both domain-expert knowledge and data in their structures and conditional probability tables. Structure learning and hand-constructing the structure from an ensemble of domain-expert opinions are two common ways to build a BN, yet finding a structure that attains a high correct classification rate, especially for high-dimensional problems, remains a challenging task. In this paper we propose a framework, called Local Bayesian Network Experts Fusion (LoBNEF), in which, instead of building a single network, multiple Bayesian Network Classifiers (BNCs) are trained and their outputs are attentively fused. The attentive fusion process is learned interactively using a Bayesian reinforcement learning method. We demonstrate that first learning different BNCs and then fusing their decisions in an attentive, sequential manner yields an efficient and robust method in terms of correct classification rate.
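To make the decision-fusion idea concrete, the sketch below combines the class posteriors of several local experts into one prediction. This is only an illustration of the fusion step: the expert posteriors, the log-linear combination rule, and the fixed weights are all assumptions for the example, whereas in LoBNEF the experts are Bayesian network classifiers and the fusion weights are learned with Bayesian reinforcement learning.

```python
import math

def fuse_posteriors(expert_posteriors, weights):
    """Weighted log-linear fusion of per-expert class posteriors.

    expert_posteriors: list of dicts mapping class label -> P(class | x)
    weights: one non-negative weight per expert (assumed learned elsewhere)
    """
    classes = expert_posteriors[0].keys()
    scores = {}
    for c in classes:
        # Geometric (log-linear) combination of expert opinions:
        # sum of weighted log-probabilities for class c.
        scores[c] = sum(w * math.log(p[c])
                        for p, w in zip(expert_posteriors, weights))
    # Normalize back to a probability distribution over classes.
    z = sum(math.exp(s) for s in scores.values())
    return {c: math.exp(s) / z for c, s in scores.items()}

# Three hypothetical local experts disagree on a two-class problem.
experts = [
    {"A": 0.9, "B": 0.1},
    {"A": 0.4, "B": 0.6},
    {"A": 0.7, "B": 0.3},
]
fused = fuse_posteriors(experts, weights=[0.5, 0.2, 0.3])
prediction = max(fused, key=fused.get)  # "A": the confident first expert dominates
```

A product-of-experts (log-linear) rule is used here because it keeps each fused score a proper probability after normalization; a simple weighted arithmetic average of the posteriors would be an equally valid illustrative choice.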