Weightily averaged one-dependence estimators

  • Authors:
  • Liangxiao Jiang; Harry Zhang

  • Affiliations:
  • Faculty of Computer Science, China University of Geosciences, Wuhan, Hubei, P.R. China; Faculty of Computer Science, University of New Brunswick, Fredericton, NB, Canada

  • Venue:
  • PRICAI'06: Proceedings of the 9th Pacific Rim International Conference on Artificial Intelligence
  • Year:
  • 2006

Abstract

NB (naive Bayes) is a probabilistic classification model based on the attribute independence assumption. In many real-world data mining applications, however, this assumption is often violated. Responding to this fact, researchers have made substantial efforts to improve NB's accuracy by weakening its attribute independence assumption. As a recent example, Webb et al. [1] propose a model called Averaged One-Dependence Estimators (AODE), which weakens the attribute independence assumption by averaging all models from a restricted class of one-dependence classifiers. Motivated by their work, we believe that assigning different weights to these one-dependence classifiers can lead to significant improvement. Based on this belief, we present an improved algorithm called Weightily Averaged One-Dependence Estimators (WAODE). We experimentally tested our algorithm in the Weka system [2] on the 36 UCI data sets [3] selected by Weka, and compared it to NB, SBC [4], TAN [5], NBTree [6], and AODE [1]. The experimental results show that WAODE significantly outperforms all of the compared algorithms.
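
The abstract's central idea, averaging one-dependence (super-parent) estimators with a separate weight per estimator rather than uniformly, can be illustrated with a minimal sketch. The code below is not the authors' implementation: the function name, the probability-table layout, and the weights are placeholder assumptions. The paper's own weighting scheme is not reproduced here; any per-super-parent weight, for example the mutual information between the super-parent attribute and the class, would fit this interface.

```python
import numpy as np

def waode_scores(x, p_joint, p_cond, weights):
    """Unnormalized class scores for one test instance x (illustrative sketch).

    x        : int array, shape (n_attrs,) -- attribute-value index of each attribute
    p_joint  : array, shape (n_classes, n_attrs, n_vals)
               p_joint[c, i, v] estimates P(c, A_i = v)
    p_cond   : array, shape (n_classes, n_attrs, n_vals, n_attrs, n_vals)
               p_cond[c, i, v, j, u] estimates P(A_j = u | c, A_i = v)
    weights  : array, shape (n_attrs,) -- one weight per super-parent attribute A_i
               (placeholder; the paper assigns a non-uniform weight to each
                one-dependence estimator instead of plain averaging)
    """
    n_classes, n_attrs, _ = p_joint.shape
    scores = np.zeros(n_classes)
    for c in range(n_classes):
        total = 0.0
        for i in range(n_attrs):
            # One-dependence estimator with super-parent A_i:
            # P(c, A_i = x_i) * prod_{j != i} P(A_j = x_j | c, A_i = x_i)
            est = p_joint[c, i, x[i]]
            for j in range(n_attrs):
                if j != i:
                    est *= p_cond[c, i, x[i], j, x[j]]
            # Weighted average of the one-dependence estimators
            total += weights[i] * est
        scores[c] = total / weights.sum()
    return scores
```

Setting all weights equal reduces this sketch to plain AODE-style averaging; the improvement studied in the paper comes from choosing informative, non-uniform weights.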