Improving classification performance by combining multiple TAN classifiers

  • Authors:
  • Hongbo Shi;Zhihai Wang;Houkuan Huang

  • Affiliations:
School of Computer and Information Technology, Northern Jiaotong University, Beijing, China and School of Computer Science and Software Engineering, Monash University, Clayton, Victoria, Australia (all authors)

  • Venue:
  • RSFDGrC'03 Proceedings of the 9th international conference on Rough sets, fuzzy sets, data mining, and granular computing
  • Year:
  • 2003


Abstract

Boosting is an effective classifier combination method that can improve the classification performance of an unstable learning algorithm, but it yields little improvement for a stable one. TAN (Tree-Augmented Naive Bayes) is a tree-like Bayesian network. The standard TAN learning algorithm produces a stable TAN classifier, whose accuracy is therefore difficult to improve by boosting. In this paper, a new TAN learning algorithm called GTAN and a TAN classifier combination method called Boosting-MultiTAN are presented. Experimental comparisons with the standard TAN classifier show that Boosting-MultiTAN achieves higher classification accuracy on most data sets.
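The abstract does not spell out the GTAN or Boosting-MultiTAN algorithms themselves, but the boosting loop they build on follows the standard AdaBoost scheme: repeatedly train a base classifier on reweighted data, upweight the examples it misclassifies, and combine the rounds by a weighted vote. The sketch below is a minimal, self-contained illustration of that generic loop, with decision stumps standing in as hypothetical base learners (the paper's base learners are TAN classifiers, which are omitted here for brevity):

```python
import math

def train_stump(X, y, w):
    """Pick the single-feature threshold split with minimum weighted error."""
    best = None  # (error, feature, threshold, sign)
    for j in range(len(X[0])):
        for t in sorted({x[j] for x in X}):
            for sign in (1, -1):
                pred = [sign if x[j] >= t else -sign for x in X]
                err = sum(wi for wi, p, yi in zip(w, pred, y) if p != yi)
                if best is None or err < best[0]:
                    best = (err, j, t, sign)
    return best[1:], best[0]

def stump_predict(stump, x):
    j, t, sign = stump
    return sign if x[j] >= t else -sign

def boost(X, y, rounds=10):
    """AdaBoost-style loop: reweight examples, collect weighted voters."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        stump, err = train_stump(X, y, w)
        err = max(err, 1e-10)  # avoid division by zero on a perfect round
        if err >= 0.5:         # base learner no better than chance: stop
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, stump))
        # Upweight misclassified examples, downweight correct ones, renormalize.
        w = [wi * math.exp(-alpha * yi * stump_predict(stump, x))
             for wi, x, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted-majority vote of all rounds' base classifiers."""
    vote = sum(a * stump_predict(s, x) for a, s in ensemble)
    return 1 if vote >= 0 else -1
```

The paper's point is precisely that this loop only helps when the base learner is unstable (small changes in the reweighted training set produce different classifiers); a deterministic base learner like standard TAN would return nearly the same model every round, which is what motivates the randomized GTAN variant.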