Improved boosting algorithms using confidence-rated predictions. COLT '98: Proceedings of the Eleventh Annual Conference on Computational Learning Theory.
Lazy Learning of Bayesian Rules. Machine Learning.
SNNB: A Selective Neighborhood Based Naïve Bayes for Lazy Learning. PAKDD '02: Proceedings of the 6th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining.
Naïve Bayesian tree pruning by local accuracy estimation. ADMA '06: Proceedings of the Second International Conference on Advanced Data Mining and Applications.
Enhancing SNNB with local accuracy estimation and ensemble techniques. DASFAA '05: Proceedings of the 10th International Conference on Database Systems for Advanced Applications.
Several classification algorithms based on local naïve Bayesian rules have recently been developed to provide high predictive accuracy. However, most of them rely on a classifier-selection strategy for decision making. To exploit a classifier-fusion strategy instead, this paper investigates a boosting algorithm for local naïve Bayesian rules. First, we develop an algorithmic framework as a forward stage-wise additive model. Then, a construction algorithm for lazy naïve Bayesian rules is designed to instantiate this framework: the algorithm starts from the most general rule and repeatedly grows the antecedent by greedy search, seeking a better rule at each step. Experimental results show that, compared with the boosted naïve Bayesian classifier and the lazy Bayesian rule algorithm, the proposed method reduces the overall error rate on a variety of domains.
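The two ingredients the abstract names — a forward stage-wise additive (boosting) model, and weak hypotheses built by greedily growing a conjunctive antecedent from the most general rule — can be illustrated with a minimal sketch. Note that this is not the paper's algorithm: for brevity it substitutes a weighted-majority consequent inside and outside each rule where the paper uses local naïve Bayesian predictions, and all function names and the toy dataset are illustrative.

```python
import numpy as np

def weighted_majority(y, w, mask):
    """Weighted-majority label (+1/-1) over the examples selected by mask."""
    return 1.0 if np.sum(w[mask] * y[mask]) >= 0 else -1.0

def rule_predict(X, conds, lab_in, lab_out):
    """A rule is a conjunction of (feature, threshold, direction) tests;
    it predicts lab_in where the antecedent fires and lab_out elsewhere."""
    mask = np.ones(len(X), dtype=bool)
    for j, t, d in conds:
        mask &= (X[:, j] <= t) if d == 'le' else (X[:, j] > t)
    return np.where(mask, lab_in, lab_out), mask

def eval_rule(X, y, w, conds):
    """Weighted error of a rule whose consequents are the weighted-majority
    labels of the covered and uncovered regions (a simplification of the
    paper's local naive Bayes consequent)."""
    _, mask = rule_predict(X, conds, 1.0, -1.0)
    lab_in = weighted_majority(y, w, mask) if mask.any() else 1.0
    lab_out = weighted_majority(y, w, ~mask) if (~mask).any() else -lab_in
    pred = np.where(mask, lab_in, lab_out)
    return np.sum(w * (pred != y)), lab_in, lab_out

def grow_rule(X, y, w, max_conds=3):
    """Start from the most general rule (empty antecedent) and greedily add
    the single condition that most reduces weighted error, as in the
    construction step described in the abstract."""
    conds = []
    best_err, lab_in, lab_out = eval_rule(X, y, w, conds)
    for _ in range(max_conds):
        best_cand = None
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for d in ('le', 'gt'):
                    err, li, lo = eval_rule(X, y, w, conds + [(j, t, d)])
                    if err < best_err - 1e-12:
                        best_err, best_cand, lab_in, lab_out = err, (j, t, d), li, lo
        if best_cand is None:
            break  # no condition strictly improves the rule
        conds.append(best_cand)
    return conds, lab_in, lab_out, best_err

def boost(X, y, rounds=5):
    """Forward stage-wise additive model: each round fits one rule to the
    current weights and adds it with an AdaBoost-style coefficient."""
    n = len(X)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        conds, li, lo, err = grow_rule(X, y, w)
        err = max(err, 1e-10)
        if err >= 0.5:
            break
        alpha = 0.5 * np.log((1 - err) / err)
        pred, _ = rule_predict(X, conds, li, lo)
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, conds, li, lo))
    return ensemble

def predict(ensemble, X):
    """Classifier fusion: sign of the weighted vote of all rules."""
    score = np.zeros(len(X))
    for alpha, conds, li, lo in ensemble:
        pred, _ = rule_predict(X, conds, li, lo)
        score += alpha * pred
    return np.where(score >= 0, 1.0, -1.0)

# Toy one-dimensional dataset (illustrative only).
X = np.array([[1.], [2.], [3.], [4.], [5.], [6.]])
y = np.array([-1., -1., -1., 1., 1., 1.])
ens = boost(X, y, rounds=3)
acc = np.mean(predict(ens, X) == y)
```

The fusion step in `predict` is what distinguishes this scheme from the classifier-selection strategy used by earlier local naïve Bayesian methods: every rule contributes to every decision, weighted by its coefficient, rather than a single locally chosen classifier deciding alone.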