Improving the Performance of Boosting for Naive Bayesian Classification
PAKDD '99 Proceedings of the Third Pacific-Asia Conference on Methodologies for Knowledge Discovery and Data Mining
This paper investigates an effective boosting method for naïve Bayesian classifiers. Existing work has shown that the boosted naïve Bayesian classifier does not reduce error rate as effectively as the boosted decision tree (or boosted decision stump). This phenomenon may be caused by a combination of factors. To address the problem, the local accuracies of a naïve Bayesian base classifier should replace the global accuracy (or global error rate) used in traditional boosting methods. Based on this analysis, we propose an effective boosted naïve Bayesian method that uses a C4.5 decision tree as the local-accuracy evaluator for each base classifier. At each round, two classifiers are constructed: one is the naïve Bayesian base classifier, and the other is the C4.5 evaluator. The estimated local accuracy plays an important role not only in updating the weights of training examples but also in determining the vote weights of base classifiers. Finally, experimental comparison shows that, averaged over a set of domains, our method achieves a much lower error rate than AdaBoost.M1 applied to naïve Bayesian classifiers.
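The abstract does not spell out the exact update and voting rules, so the following is a hypothetical sketch, not the paper's method: a weighted Gaussian naïve Bayes stands in for the naïve Bayesian base classifier, a depth-1 stump (`StumpEvaluator`) stands in for the C4.5 local-accuracy evaluator, and AdaBoost.M1's global quantities are rewritten pointwise with the estimated local accuracy a_t(x). All class and function names here are illustrative assumptions.

```python
import numpy as np

class WeightedGaussianNB:
    """Gaussian naive Bayes that honours per-example weights (base learner).
    Stand-in for the paper's naive Bayesian base classifier."""
    def fit(self, X, y, w):
        self.classes_ = np.unique(y)
        self.theta_, self.var_, self.logprior_ = [], [], []
        for c in self.classes_:
            Xc, wc = X[y == c], w[y == c]
            mean = np.average(Xc, axis=0, weights=wc)
            var = np.average((Xc - mean) ** 2, axis=0, weights=wc) + 1e-9
            self.theta_.append(mean)
            self.var_.append(var)
            self.logprior_.append(np.log(wc.sum() / w.sum()))
        return self

    def predict(self, X):
        scores = [lp - 0.5 * np.sum(np.log(2 * np.pi * v) + (X - m) ** 2 / v, axis=1)
                  for m, v, lp in zip(self.theta_, self.var_, self.logprior_)]
        return self.classes_[np.argmax(np.stack(scores), axis=0)]

class StumpEvaluator:
    """Depth-1 stump, a stand-in for the paper's C4.5 evaluator: trained to
    predict whether the base classifier was correct; each induced region
    carries the weighted local accuracy of the base classifier there."""
    def fit(self, X, correct, w):
        best = (np.inf, 0, -np.inf)  # (impurity, feature, threshold)
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                left = X[:, j] <= t
                cost = self._gini(correct[left], w[left]) + self._gini(correct[~left], w[~left])
                if cost < best[0]:
                    best = (cost, j, t)
        _, self.j_, self.t_ = best
        left = X[:, self.j_] <= self.t_
        self.acc_ = (self._acc(correct[left], w[left]), self._acc(correct[~left], w[~left]))
        return self

    @staticmethod
    def _gini(c, w):
        if w.sum() == 0:
            return 0.0
        p = np.average(c, weights=w)
        return w.sum() * p * (1.0 - p)  # weighted Gini impurity of the region

    @staticmethod
    def _acc(c, w):
        # Weighted fraction correct, clipped away from 0 and 1 for stability.
        a = np.average(c, weights=w) if w.sum() > 0 else 0.5
        return float(np.clip(a, 0.05, 0.95))

    def local_accuracy(self, X):
        return np.where(X[:, self.j_] <= self.t_, self.acc_[0], self.acc_[1])

def boost_nb_local(X, y, T=5):
    """Each round trains a weighted naive Bayes base classifier plus an
    evaluator of its local accuracy a_t(x); a_t(x) replaces AdaBoost.M1's
    global accuracy in both the weight update and the vote weight."""
    w = np.full(len(y), 1.0 / len(y))
    rounds = []
    for _ in range(T):
        nb = WeightedGaussianNB().fit(X, y, w)
        correct = (nb.predict(X) == y).astype(float)
        ev = StumpEvaluator().fit(X, correct, w)
        a = ev.local_accuracy(X)                    # estimated local accuracy
        beta = (1.0 - a) / a                        # local analogue of AdaBoost.M1's beta_t
        w = np.where(correct == 1.0, w * beta, w)   # down-weight locally well-handled examples
        w /= w.sum()
        rounds.append((nb, ev))
    return rounds

def predict_boosted(rounds, X):
    """Per-example vote weight log(a_t(x)/(1 - a_t(x))), the local analogue
    of AdaBoost.M1's global vote weight log(1/beta_t)."""
    agg = {}
    for nb, ev in rounds:
        pred = nb.predict(X)
        a = ev.local_accuracy(X)
        alpha = np.log(a / (1.0 - a))
        for c in nb.classes_:
            agg[c] = agg.get(c, 0.0) + alpha * (pred == c)
    classes = sorted(agg)
    return np.array(classes)[np.argmax(np.stack([agg[c] for c in classes]), axis=0)]
```

Under this reading, examples in regions where the base classifier is locally accurate are down-weighted (beta(x) < 1), concentrating the next round on poorly handled regions, and each base classifier's vote counts more exactly where its evaluator deems it reliable.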