A Bayesian network classifier that combines a finite mixture model and a naïve Bayes model

  • Authors:
  • Stefano Monti; Gregory F. Cooper

  • Affiliations:
  • Intelligent Systems Program, University of Pittsburgh, Pittsburgh, PA; Intelligent Systems Program, University of Pittsburgh, Pittsburgh, PA, and Center for Biomedical Informatics, University of Pittsburgh, Pittsburgh, PA

  • Venue:
  • UAI'99: Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence
  • Year:
  • 1999

Abstract

In this paper we present a new Bayesian network model for classification that combines the naive Bayes (NB) classifier and the finite mixture (FM) classifier. The resulting classifier aims to relax the strong assumptions on which the two component models are based, in an attempt to improve their classification performance, both in terms of accuracy and in terms of calibration of the estimated probabilities. The proposed classifier is obtained by superimposing a finite mixture model on the set of feature variables of a naive Bayes model. We present experimental results on real datasets comparing the predictive performance of the new classifier with that of the NB classifier and the FM classifier.
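
The abstract does not spell out the exact factorization, but one natural reading of "superimposing a finite mixture model on the feature variables of a naive Bayes model" is that each discrete feature X_i depends on both the class C and a hidden mixture variable M, giving P(c | x) ∝ P(c) Σ_m P(m) Π_i P(x_i | c, m). The sketch below is a minimal illustration of classification under that assumed structure; the function name, parameter arrays, and shapes are hypothetical placeholders, not the authors' implementation.

```python
import numpy as np

def predict_proba(x, class_prior, mixture_prior, feature_cpt):
    """Posterior P(C | x) for an assumed NB/FM hybrid structure.

    x             : feature values, shape (n_features,)
    class_prior   : P(C), shape (n_classes,)
    mixture_prior : P(M), shape (n_components,)
    feature_cpt   : P(X_i = v | C = c, M = m),
                    shape (n_features, n_classes, n_components, n_values)
    """
    n_features, n_classes, n_components, _ = feature_cpt.shape
    # likelihood[c, m] = prod_i P(x_i | c, m)
    likelihood = np.ones((n_classes, n_components))
    for i, v in enumerate(x):
        likelihood *= feature_cpt[i, :, :, v]
    # Marginalize out the mixture component, then apply Bayes' rule over classes.
    joint = class_prior * (likelihood @ mixture_prior)
    return joint / joint.sum()

# Example with random (hypothetical) parameters:
# 4 ternary features, 2 classes, 2 mixture components.
rng = np.random.default_rng(0)
cpt = rng.dirichlet(np.ones(3), size=(4, 2, 2))
print(predict_proba(np.array([0, 2, 1, 0]),
                    class_prior=np.array([0.6, 0.4]),
                    mixture_prior=np.array([0.5, 0.5]),
                    feature_cpt=cpt))
```

Note that with a single mixture component (n_components = 1) the rule reduces to the ordinary naive Bayes posterior, which is consistent with the idea of relaxing the NB independence assumption through the hidden mixture variable.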