Effective Boosting of Naïve Bayesian Classifiers by Local Accuracy Estimation

  • Authors: Zhipeng Xie
  • Affiliations: School of Computer Science, Fudan University, Shanghai, China 200433
  • Venue: PAKDD '09 Proceedings of the 13th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining
  • Year: 2009

Abstract

This paper investigates an effective boosting method for naïve Bayesian classifiers. Existing work has shown that boosted naïve Bayesian classifiers reduce error rates less effectively than boosted decision trees (or boosted decision stumps). This phenomenon may be caused by a combination of factors. To address the problem, the local accuracies of a naïve Bayesian base classifier should replace the global accuracy (or global error rate) used in traditional boosting methods. Based on this analysis, we propose an effective boosted naïve Bayesian method that uses a C4.5 decision tree as the local-accuracy evaluator for each base classifier. At each round, two classifiers are constructed: one is the naïve Bayesian base classifier, and the other is its C4.5 evaluator. The estimated local accuracy plays an important role not only in updating the weights of the training examples but also in determining the vote weights of the base classifiers. Finally, experimental comparison shows that our method achieves a much lower average error rate across a set of domains than AdaBoost.M1 with naïve Bayesian base classifiers.
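The scheme described above can be sketched in code. This is a minimal, hedged illustration only: the abstract does not give the paper's exact weight-update or vote-weight formulas, so the updates below are assumptions modeled on AdaBoost.M1 with local accuracy substituted for global accuracy, a CART tree (scikit-learn's `DecisionTreeClassifier`) stands in for the C4.5 evaluator, and all function names are hypothetical.

```python
# Hedged sketch: boosting naive Bayes with per-example ("local") accuracy
# estimated by a decision tree. NOT the paper's exact algorithm: the
# weight-update and vote-weight rules are illustrative assumptions, and a
# CART tree substitutes for the C4.5 evaluator.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier


def _local_accuracy(evaluator, X):
    """Estimated probability that the paired base classifier is correct at X."""
    if len(evaluator.classes_) == 2:
        idx = list(evaluator.classes_).index(1)
        acc = evaluator.predict_proba(X)[:, idx]
    else:  # evaluator saw only one outcome (base classifier all right/all wrong)
        acc = np.full(X.shape[0], float(evaluator.classes_[0]))
    return np.clip(acc, 1e-6, 1 - 1e-6)


def fit_local_boost(X, y, rounds=5):
    n = X.shape[0]
    w = np.full(n, 1.0 / n)  # example weights, initialized uniformly
    ensemble = []
    for _ in range(rounds):
        # Base classifier trained on the current weighted sample.
        nb = GaussianNB().fit(X, y, sample_weight=w)
        correct = (nb.predict(X) == y).astype(int)
        # Evaluator learns *where* the base classifier is right or wrong.
        ev = DecisionTreeClassifier(max_depth=3).fit(X, correct, sample_weight=w)
        ensemble.append((nb, ev))
        acc = _local_accuracy(ev, X)
        # Local analogue of the AdaBoost.M1 update (an assumption): shrink the
        # weight of each correctly classified example by its local error odds.
        w = np.where(correct == 1, w * (1 - acc) / acc, w)
        w /= w.sum()
    return ensemble


def predict_local_boost(ensemble, X):
    classes = ensemble[0][0].classes_
    scores = np.zeros((X.shape[0], len(classes)))
    for nb, ev in ensemble:
        pred = nb.predict(X)
        # Vote weight varies with x: log-odds of the estimated local accuracy,
        # so a classifier votes strongly only in regions where it is reliable.
        acc = _local_accuracy(ev, X)
        vote = np.log(acc / (1 - acc))
        for k, c in enumerate(classes):
            scores[:, k] += vote * (pred == c)
    return classes[np.argmax(scores, axis=1)]
```

The design point this sketch captures is the one the abstract emphasizes: the evaluator's output enters in two places, once during training (example reweighting) and once during prediction (instance-dependent vote weights), instead of a single global error rate per round.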