NB+: An improved Naïve Bayesian algorithm
Knowledge-Based Systems
In this paper, we introduce a new method to improve the performance of combining boosting and naïve Bayesian learning. Instead of combining boosting and naïve Bayesian learning directly, which has been shown to be unsatisfactory for improving performance, we select the training samples dynamically by a bootstrap method for the construction of naïve Bayesian classifiers, and hence generate very different, or unstable, base classifiers for boosting. In addition, we modify the weight-adjusting step of the boosting algorithm so as to minimize the overlapping errors of its constituent classifiers. We conducted a series of experiments, which show that the new method not only performs much better than naïve Bayesian classifiers or directly boosted naïve Bayesian classifiers, but also reaches optimal performance much more quickly than boosting stumps or boosting decision trees incorporated with naïve Bayesian learning.
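The core idea described above — drawing a fresh bootstrap sample at each boosting round so that the otherwise stable naïve Bayesian learner produces diverse base classifiers — can be illustrated with a minimal sketch. This is not the paper's exact algorithm: the dataset is a hypothetical toy set, the smoothing and parameters are illustrative choices, and the paper's modified weight-adjustment rule (for reducing overlapping errors) is not reproduced; the sketch uses the standard AdaBoost reweighting instead.

```python
import math
import random
from collections import Counter

random.seed(0)

# Tiny binary-feature toy set (hypothetical data, for illustration only).
X = [(0, 0), (0, 1), (1, 0), (1, 1), (0, 0), (1, 1), (1, 0), (0, 1)]
y = [0, 0, 1, 1, 0, 1, 1, 0]

def train_nb(xs, ys):
    """Fit a naive Bayes model with Laplace smoothing; return a predictor."""
    labels = sorted(set(ys))
    n = len(ys)
    prior = {c: (ys.count(c) + 1) / (n + len(labels)) for c in labels}
    # cond[c][j][v] approximates P(feature j = v | class c), smoothed.
    cond = {}
    for c in labels:
        rows = [x for x, t in zip(xs, ys) if t == c]
        cond[c] = []
        for j in range(len(xs[0])):
            cnt = Counter(x[j] for x in rows)
            cond[c].append({v: (cnt[v] + 1) / (len(rows) + 2) for v in (0, 1)})
    def predict(x):
        return max(labels, key=lambda c: math.log(prior[c]) +
                   sum(math.log(cond[c][j][x[j]]) for j in range(len(x))))
    return predict

def boosted_nb(X, y, rounds=5):
    """AdaBoost-style loop where each base naive Bayes classifier is trained
    on a weighted bootstrap sample, so the base classifiers differ even
    though naive Bayes by itself is a stable learner."""
    n = len(y)
    w = [1.0 / n] * n
    ensemble = []  # list of (alpha, predictor) pairs
    for _ in range(rounds):
        # Dynamic sample selection: bootstrap indices in proportion to weights.
        idx = random.choices(range(n), weights=w, k=n)
        h = train_nb([X[i] for i in idx], [y[i] for i in idx])
        err = sum(w[i] for i in range(n) if h(X[i]) != y[i])
        err = min(max(err, 1e-10), 1 - 1e-10)  # avoid log(0) / division by 0
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
        # Standard AdaBoost reweighting; the paper modifies this step to
        # reduce overlapping errors among base classifiers.
        w = [wi * math.exp(-alpha if h(X[i]) == y[i] else alpha)
             for i, wi in enumerate(w)]
        s = sum(w)
        w = [wi / s for wi in w]
    def predict(x):
        score = sum(a * (1 if h(x) == 1 else -1) for a, h in ensemble)
        return 1 if score > 0 else 0
    return predict

clf = boosted_nb(X, y)
print([clf(x) for x in X])
```

Resampling by the current weight distribution, rather than feeding the weights into the learner directly, is what injects instability: each round sees a genuinely different training set, so the ensemble gains the diversity that boosting needs.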