In this paper we provide a comprehensive empirical evaluation of a variant of the Recursive Naïve Bayes Classifier (RNBC*) in comparison to simple Naïve Bayes and C4.5. We show that, under a zero-one loss cost function for classification accuracy, RNBC* outperformed Naïve Bayes and was comparable to C4.5 across the range of datasets tested. As Naïve Bayes has been shown to be a robust classifier in many domains, this is a significant result. We estimate the bias-variance decomposition of RNBC* and show that its bias-variance profile is more similar to that of decision trees than to that of Naïve Bayes. We demonstrate how variance-reducing ensemble techniques such as Bagging and Boosting can be effective in increasing the classification accuracy of RNBC*.
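The zero-one loss and the plain Naïve Bayes baseline referred to above can be sketched in a few lines. This is a minimal illustration under stated assumptions (categorical features, Laplace smoothing, a made-up toy dataset), not the paper's implementation of RNBC*:

```python
import math
from collections import Counter, defaultdict

def zero_one_loss(y_true, y_pred):
    """Fraction of misclassified instances: the cost function used for accuracy."""
    return sum(t != p for t, p in zip(y_true, y_pred)) / len(y_true)

class NaiveBayes:
    """Categorical Naive Bayes with Laplace smoothing (illustrative sketch only)."""

    def fit(self, X, y):
        self.n = len(y)
        self.classes = sorted(set(y))
        self.priors = Counter(y)                 # class -> count
        self.counts = defaultdict(Counter)       # (feature_idx, class) -> value counts
        self.values = [set(row[j] for row in X) for j in range(len(X[0]))]
        for xi, yi in zip(X, y):
            for j, v in enumerate(xi):
                self.counts[(j, yi)][v] += 1
        return self

    def predict_one(self, x):
        best, best_score = None, float("-inf")
        for c in self.classes:
            # log P(c) + sum_j log P(x_j | c), with add-one (Laplace) smoothing
            score = math.log(self.priors[c] / self.n)
            for j, v in enumerate(x):
                num = self.counts[(j, c)][v] + 1
                den = self.priors[c] + len(self.values[j])
                score += math.log(num / den)
            if score > best_score:
                best, best_score = c, score
        return best

    def predict(self, X):
        return [self.predict_one(x) for x in X]

# Toy usage on a hypothetical weather-style dataset
X = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
y = ["no", "no", "yes", "yes"]
nb = NaiveBayes().fit(X, y)
pred = nb.predict(X)
loss = zero_one_loss(y, pred)  # training-set loss; here the toy data is separable
```

A variance-reducing ensemble such as Bagging would wrap this base learner, training several copies on bootstrap resamples of `(X, y)` and combining their predictions by majority vote.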