C4.5: programs for machine learning. Machine Learning.
Lazy learning
Data mining: practical machine learning tools and techniques with Java implementations
Lazy Learning of Bayesian Rules. Machine Learning.
Semi-Naive Bayesian Classifier. EWSL '91: Proceedings of the European Working Session on Machine Learning.
Naive Bayesian Classifier Committees. ECML '98: Proceedings of the 10th European Conference on Machine Learning.
SNNB: A Selective Neighborhood Based Naïve Bayes for Lazy Learning. PAKDD '02: Proceedings of the 6th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining.
Building classifiers using Bayesian networks. AAAI'96: Proceedings of the Thirteenth National Conference on Artificial Intelligence - Volume 2.
Induction of selective Bayesian classifiers. UAI'94: Proceedings of the Tenth International Conference on Uncertainty in Artificial Intelligence.
UAI'03 Proceedings of the Nineteenth conference on Uncertainty in Artificial Intelligence
Boosting Local Naïve Bayesian Rules. ISNN 2009: Proceedings of the 6th International Symposium on Neural Networks: Advances in Neural Networks - Part II.
An evolutionary and attribute-oriented ensemble classifier. ICCSA'06: Proceedings of the 2006 International Conference on Computational Science and Its Applications - Volume Part II.
Naïve Bayesian tree pruning by local accuracy estimation. ADMA'06: Proceedings of the Second International Conference on Advanced Data Mining and Applications.
Naïve Bayes, the simplest Bayesian classifier, has shown excellent performance despite its unrealistic independence assumption. This paper studies the selective neighborhood-based naïve Bayes (SNNB) for lazy classification and develops three variant algorithms, SNNB-G, SNNB-L, and SNNB-LV, all with linear computational complexity. The SNNB algorithms use a local learning strategy to alleviate the independence assumption. The underlying idea is, for a given test example, first to construct multiple classifiers on its neighborhoods of different radii, and then to select the classifier with the highest estimated accuracy to make the decision. Empirical results show that both SNNB-L and SNNB-LV generate more accurate classifiers than naïve Bayes and several other state-of-the-art classification algorithms, including C4.5, Naïve Bayes Tree, and Lazy Bayesian Rule. The SNNB-L and SNNB-LV algorithms are also computationally more efficient than the Lazy Bayesian Rule algorithm, especially on domains with high dimensionality.
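The neighborhood-selection idea in the abstract can be sketched as below. This is a minimal illustration, not the paper's exact procedure: the Hamming-distance neighborhoods, the `min_size` threshold, and the resubstitution accuracy estimate are all assumptions for the sketch (the SNNB variants differ precisely in how candidate classifiers are generated and how their accuracy is estimated).

```python
import math
from collections import Counter, defaultdict

def hamming(x, y):
    """Number of attributes on which two discrete examples differ."""
    return sum(a != b for a, b in zip(x, y))

class NaiveBayes:
    """Naive Bayes for discrete attributes with Laplace smoothing."""
    def __init__(self, X, y):
        self.n = len(y)
        self.class_counts = Counter(y)
        self.value_counts = defaultdict(Counter)  # (class, attr) -> value counts
        self.n_values = [len({row[i] for row in X}) for i in range(len(X[0]))]
        for row, c in zip(X, y):
            for i, v in enumerate(row):
                self.value_counts[(c, i)][v] += 1

    def predict(self, x):
        best, best_lp = None, float("-inf")
        for c, nc in self.class_counts.items():
            lp = math.log(nc / self.n)  # log prior
            for i, v in enumerate(x):
                # Laplace-smoothed log P(attribute_i = v | class c)
                lp += math.log((self.value_counts[(c, i)][v] + 1)
                               / (nc + self.n_values[i]))
            if lp > best_lp:
                best, best_lp = c, lp
        return best

def accuracy(clf, X, y):
    return sum(clf.predict(row) == c for row, c in zip(X, y)) / len(y)

def snnb_predict(X, y, x, min_size=5):
    """Lazily classify x: among naive Bayes classifiers trained on nested
    neighborhoods of x, use the one with the highest estimated accuracy."""
    global_clf = NaiveBayes(X, y)
    best_clf, best_acc = global_clf, accuracy(global_clf, X, y)
    dists = [hamming(row, x) for row in X]
    for r in sorted(set(dists)):  # grow the neighborhood radius
        idx = [i for i, d in enumerate(dists) if d <= r]
        if len(idx) < min_size or len(idx) == len(y):
            continue  # too small to trust, or identical to the global model
        sub_X, sub_y = [X[i] for i in idx], [y[i] for i in idx]
        clf = NaiveBayes(sub_X, sub_y)
        acc = accuracy(clf, sub_X, sub_y)  # crude stand-in accuracy estimate
        if acc > best_acc:
            best_clf, best_acc = clf, acc
    return best_clf.predict(x)
```

On XOR-like data, where the global independence assumption fails badly, the small-radius local models are selected and classify correctly, which illustrates why local learning can alleviate the independence assumption.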