C4.5: programs for machine learning
Lazy learning
On the Optimality of the Simple Bayesian Classifier under Zero-One Loss
Machine Learning - Special issue on learning with probabilistic representations
Lazy Learning of Bayesian Rules
Machine Learning
Naive Bayesian Classifier Committees
ECML '98 Proceedings of the 10th European Conference on Machine Learning
Building classifiers using Bayesian networks
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 2
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 1
Comparing Bayesian network classifiers
UAI'99 Proceedings of the Fifteenth conference on Uncertainty in artificial intelligence
Nearest neighbor pattern classification
IEEE Transactions on Information Theory
Not so naive Bayes: aggregating one-dependence estimators
Machine Learning
Learning Instance Greedily Cloning Naive Bayes for Ranking
ICDM '05 Proceedings of the Fifth IEEE International Conference on Data Mining
IEEE Transactions on Knowledge and Data Engineering
Survey of Improving Naive Bayes for Classification
ADMA '07 Proceedings of the 3rd international conference on Advanced Data Mining and Applications
Boosting Local Naïve Bayesian Rules
ISNN 2009 Proceedings of the 6th International Symposium on Neural Networks: Advances in Neural Networks - Part II
Anytime learning and classification for online applications
Proceedings of the 2006 conference on Advances in Intelligent IT: Active Media Technology 2006
Supervised Machine Learning: A Review of Classification Techniques
Proceedings of the 2007 conference on Emerging Artificial Intelligence Applications in Computer Engineering: Real Word AI Systems with Applications in eHealth, HCI, Information Retrieval and Pervasive Technologies
Proceedings of the 1st ACM SIGKDD Workshop on Knowledge Discovery from Uncertain Data
Nearest neighbour group-based classification
Pattern Recognition
To select or to weigh: a comparative study of model selection and model weighing for SPODE ensembles
ECML'06 Proceedings of the 17th European conference on Machine Learning
Dynamic k-nearest-neighbor naive bayes with attribute weighted
FSKD'06 Proceedings of the Third international conference on Fuzzy Systems and Knowledge Discovery
Naïve bayesian tree pruning by local accuracy estimation
ADMA'06 Proceedings of the Second international conference on Advanced Data Mining and Applications
Learning k-nearest neighbor naive bayes for ranking
ADMA'05 Proceedings of the First international conference on Advanced Data Mining and Applications
Instance cloning local naive bayes
AI'05 Proceedings of the 18th Canadian Society conference on Advances in Artificial Intelligence
Enhancing SNNB with local accuracy estimation and ensemble techniques
DASFAA'05 Proceedings of the 10th international conference on Database Systems for Advanced Applications
Lazy averaged one-dependence estimators
AI'06 Proceedings of the 19th international conference on Advances in Artificial Intelligence: Canadian Society for Computational Studies of Intelligence
A network intrusion detection system based on a Hidden Naïve Bayes multiclass classifier
Expert Systems with Applications: An International Journal
Boosting for superparent-one-dependence estimators
International Journal of Computing Science and Mathematics
Learning attribute weighted AODE for ROC area ranking
International Journal of Information and Communication Technology
Naïve Bayes is a probability-based classification method built on the assumption that attributes are conditionally independent of one another given the class label. Much research has focused on improving the accuracy of Naïve Bayes via eager learning. In this paper, we propose a novel lazy learning algorithm, Selective Neighbourhood based Naïve Bayes (SNNB). SNNB computes neighborhoods of the new input object at different distance thresholds, lazily learns a Naïve Bayes classifier on each neighborhood, and uses the classifier with the highest estimated accuracy to make the decision. The results of our experiments on 26 datasets show that the proposed SNNB algorithm is efficient and outperforms Naïve Bayes as well as the state-of-the-art classification methods NBTree, CBA, and C4.5 in terms of accuracy.