Machine Learning - Special issue on learning with probabilistic representations
Data mining: practical machine learning tools and techniques with Java implementations
SNNB: A Selective Neighborhood Based Naïve Bayes for Lazy Learning. In: PAKDD '02, Proceedings of the 6th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining
UAI '03, Proceedings of the Nineteenth Conference on Uncertainty in Artificial Intelligence
Survey of Improving Naive Bayes for Classification. In: ADMA '07, Proceedings of the 3rd International Conference on Advanced Data Mining and Applications
Finding the optimal feature representations for Bayesian network learning. In: PAKDD '07, Proceedings of the 11th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining
Nearest neighbour group-based classification. Pattern Recognition
One Dependence Value Difference Metric. Knowledge-Based Systems
Hybrid dynamic k-nearest-neighbour and distance and attribute weighted method for classification. International Journal of Computer Applications in Technology
K-Nearest-Neighbor (KNN) has been widely used in classification problems. However, we observe three main problems confronting KNN: 1) its accuracy is degraded by the simple majority vote; 2) its accuracy is typically sensitive to the value of K; 3) its accuracy may be dominated by irrelevant attributes. In this paper, we present an improved algorithm called Dynamic K-Nearest-Neighbor Naive Bayes with Attribute Weighted (DKNAW). We experimentally tested its accuracy on the 36 UCI data sets selected by Weka [1] and compared it to NB, KNN, KNNDW, and LWNB [2]. The experimental results show that DKNAW significantly outperforms NB, KNN, and KNNDW and slightly outperforms LWNB.
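To make the general idea concrete, the following is a minimal sketch of a lazy "local Naive Bayes" classifier in the spirit of KNN/Naive Bayes hybrids such as the one described above: instead of a simple majority vote over the K nearest neighbors, a Laplace-smoothed Naive Bayes model is fit on the neighborhood and used to score each class. This is an illustrative assumption of the abstract's first improvement only; the dynamic selection of K and the attribute weighting that DKNAW also performs are omitted here, and all names (`local_naive_bayes_predict`, `smoothing`) are hypothetical.

```python
# Hedged sketch: lazy classification by fitting Naive Bayes on the K nearest
# neighbors of a query, rather than taking a simple majority vote.
# Illustrative only; this is NOT the authors' actual DKNAW algorithm.
import math


def euclidean(a, b):
    """Plain Euclidean distance between two equal-length attribute vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def local_naive_bayes_predict(train_X, train_y, query, k=5, smoothing=1.0):
    """Classify `query` with a discrete Naive Bayes model trained only on
    its k nearest neighbors (Laplace smoothing avoids zero probabilities)."""
    # 1) Neighborhood selection: the "lazy" KNN step.
    neighbours = sorted(
        zip(train_X, train_y), key=lambda pair: euclidean(pair[0], query)
    )[:k]
    labels = [y for _, y in neighbours]
    classes = sorted(set(labels))
    n_attrs = len(query)

    # 2) Naive Bayes over the neighborhood: log prior + log likelihoods.
    scores = {}
    for c in classes:
        members = [x for x, y in neighbours if y == c]
        log_score = math.log(
            (len(members) + smoothing) / (k + smoothing * len(classes))
        )
        for j in range(n_attrs):
            # Count neighbors of class c that match the query on attribute j.
            matches = sum(1 for x in members if x[j] == query[j])
            n_values = len({x[j] for x, _ in neighbours} | {query[j]})
            log_score += math.log(
                (matches + smoothing) / (len(members) + smoothing * n_values)
            )
        scores[c] = log_score
    return max(scores, key=scores.get)


if __name__ == "__main__":
    # Two well-separated clusters of discrete attribute vectors.
    train_X = [(0, 0), (0, 1), (1, 0), (1, 1), (5, 5), (5, 6), (6, 5), (6, 6)]
    train_y = ["a", "a", "a", "a", "b", "b", "b", "b"]
    print(local_naive_bayes_predict(train_X, train_y, (0, 0), k=3))
    print(local_naive_bayes_predict(train_X, train_y, (6, 6), k=3))
```

The design point the abstract makes is visible here: because the class scores come from a probabilistic model of the neighborhood rather than a raw vote, a query near the boundary between classes is decided by how well each class's attribute distribution explains it, not merely by which class holds a one-neighbor plurality.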