Efficient lazy elimination for averaged one-dependence estimators

  • Authors:
  • Fei Zheng; Geoffrey I. Webb

  • Affiliations:
  • Monash University, VIC, Australia; Monash University, VIC, Australia

  • Venue:
  • ICML '06: Proceedings of the 23rd International Conference on Machine Learning
  • Year:
  • 2006

Abstract

Semi-naive Bayesian classifiers seek to retain the numerous strengths of naive Bayes while reducing error by relaxing the attribute independence assumption. Backwards Sequential Elimination (BSE) is a wrapper technique for attribute elimination that has proved effective at this task. We explore a new technique, Lazy Elimination (LE), which eliminates highly related attribute values at classification time without the computational overheads inherent in wrapper techniques. We analyze the effect of LE and BSE on a state-of-the-art semi-naive Bayesian algorithm, Averaged One-Dependence Estimators (AODE). Our experiments show that LE significantly reduces bias and error without undue computation, whereas BSE significantly reduces bias but not error and incurs high training time complexity. In the context of AODE, LE therefore has a significant advantage over BSE in both computational efficiency and error.
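
A minimal, self-contained Python sketch of how Lazy Elimination can sit in front of an AODE-style classifier is given below, assuming discrete attribute values. The class name AODEWithLE, the support threshold MIN_COUNT, and the add-one smoothing are illustrative assumptions for this example, not the authors' implementation.

```python
# Sketch (not the authors' code) of Lazy Elimination (LE) before AODE-style
# classification. An attribute value x_i is dropped for the current test
# instance if another observed value x_j entails it in the training data
# (P(x_i | x_j) = 1, with some minimum support), so x_i adds no information.
from collections import defaultdict
from itertools import permutations

MIN_COUNT = 30  # assumed minimum support before trusting P(x_i | x_j) = 1


class AODEWithLE:
    def fit(self, X, y):
        """X: rows of discrete attribute values, y: class labels."""
        self.n_ = len(y)
        self.classes_ = sorted(set(y))
        self.cnt_yi = defaultdict(int)   # count(y, x_i)
        self.cnt_yij = defaultdict(int)  # count(y, x_i, x_j)
        self.cnt_v = defaultdict(int)    # count(x_i)
        self.cnt_vv = defaultdict(int)   # count(x_i, x_j)
        for row, c in zip(X, y):
            vals = list(enumerate(row))  # attribute values as (index, value)
            for iv in vals:
                self.cnt_yi[(c, iv)] += 1
                self.cnt_v[iv] += 1
            for a, b in permutations(vals, 2):
                self.cnt_yij[(c, a, b)] += 1
                self.cnt_vv[(a, b)] += 1
        return self

    def _lazy_eliminate(self, vals):
        # Drop x_i when some other observed x_j always co-occurs with it in
        # training (with enough support) and x_j is strictly less frequent,
        # i.e. x_j is the more specific value and x_i is its generalization.
        drop = set()
        for a, b in permutations(vals, 2):  # a plays x_j, b plays x_i
            if (self.cnt_v[a] >= MIN_COUNT
                    and self.cnt_vv[(a, b)] == self.cnt_v[a]
                    and self.cnt_v[a] < self.cnt_v[b]):
                drop.add(b)
        return [v for v in vals if v not in drop]

    def predict(self, x):
        vals = self._lazy_eliminate(list(enumerate(x)))
        best_c, best_p = None, -1.0
        for c in self.classes_:
            # AODE: average one-dependence estimates over each remaining
            # attribute value used as the super-parent.
            total = 0.0
            for parent in vals:
                p = self.cnt_yi[(c, parent)] / self.n_  # ~ P(y, x_i)
                for child in vals:
                    if child[0] == parent[0]:
                        continue
                    # ~ P(x_j | y, x_i) with crude add-one smoothing (assumed)
                    p *= ((self.cnt_yij[(c, parent, child)] + 1.0)
                          / (self.cnt_yi[(c, parent)] + 2.0))
                total += p
            if total > best_p:
                best_c, best_p = c, total
        return best_c
```

Because the elimination uses counts already gathered for AODE and is applied per test instance, it adds no model retraining, which is the efficiency advantage over BSE's wrapper search that the abstract highlights.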