The use of Bayesian networks for classification problems has received significant recent attention. Although computationally efficient, the standard maximum likelihood learning method tends to be suboptimal because of the mismatch between its optimization criterion (data likelihood) and the actual goal of classification (label prediction). Recent approaches that optimize classification performance during parameter or structure learning show promise, but lack the favorable computational properties of maximum likelihood learning. In this paper we present the Boosted Augmented Naive Bayes (BAN) classifier. We show that combining discriminative data-weighting with generative training of intermediate models yields a computationally efficient method for discriminative parameter learning and structure selection.
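To make the core idea concrete, below is a minimal sketch, not the authors' BAN algorithm, of how discriminative data-weighting can be wrapped around generatively trained models: each boosting round fits a naive Bayes learner by weighted maximum likelihood, and an AdaBoost-style update shifts weight toward misclassified examples. All class and function names are illustrative; the sketch assumes binary labels in {-1, +1} and integer-coded discrete features.

```python
import numpy as np

class WeightedNaiveBayes:
    """Naive Bayes over discrete features, fit by weighted maximum likelihood."""

    def fit(self, X, y, w):
        self.classes_ = np.unique(y)
        n_features = X.shape[1]
        n_values = [int(X[:, j].max()) + 1 for j in range(n_features)]
        self.log_prior_ = {}
        self.log_cond_ = {}
        for c in self.classes_:
            Xc, wc = X[y == c], w[y == c]
            self.log_prior_[c] = np.log(wc.sum() / w.sum())
            tables = []
            for j in range(n_features):
                counts = np.ones(n_values[j])        # Laplace smoothing
                np.add.at(counts, Xc[:, j], wc)      # weighted sufficient statistics
                tables.append(np.log(counts / counts.sum()))
            self.log_cond_[c] = tables
        return self

    def predict(self, X):
        scores = np.empty((X.shape[0], len(self.classes_)))
        for k, c in enumerate(self.classes_):
            scores[:, k] = self.log_prior_[c]
            for j, table in enumerate(self.log_cond_[c]):
                scores[:, k] += table[X[:, j]]
        return self.classes_[np.argmax(scores, axis=1)]


def boosted_naive_bayes(X, y, n_rounds=10):
    """Discriminative re-weighting (AdaBoost) around generatively trained learners."""
    n = X.shape[0]
    w = np.full(n, 1.0 / n)
    learners, alphas = [], []
    for _ in range(n_rounds):
        nb = WeightedNaiveBayes().fit(X, y, w)       # generative ML fit on weighted data
        mistakes = nb.predict(X) != y
        err = w[mistakes].sum()
        if err == 0.0:                               # perfect learner: keep it and stop
            learners.append(nb); alphas.append(1.0)
            break
        if err >= 0.5:                               # no better than chance: stop
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        w *= np.exp(np.where(mistakes, alpha, -alpha))   # up-weight misclassified examples
        w /= w.sum()
        learners.append(nb); alphas.append(alpha)
    return learners, alphas


def predict_ensemble(learners, alphas, X):
    """Weighted vote of the boosted learners (labels assumed to be in {-1, +1})."""
    votes = sum(a * np.where(nb.predict(X) == 1, 1.0, -1.0)
                for nb, a in zip(learners, alphas))
    return np.where(votes >= 0, 1, -1)
```

Per the abstract, the full BAN method uses augmented naive Bayes models as the intermediate learners and also performs structure selection during learning; the sketch fixes the structure to plain naive Bayes and omits structure selection to keep the example short.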