K-nearest-neighbour (KNN) is an important classification method that has been widely used in data mining. However, its classification accuracy is affected by the class probability estimation, the neighbourhood size, and the choice of distance function. Many researchers have focused on improving the accuracy of KNN via distance weighting, attribute weighting, dynamic neighbourhood selection, and related methods. In this paper, we first review improved KNN algorithms in the three categories mentioned above. We then single out an improved algorithm called dynamic KNN with distance and attribute weighting, DKNDAW for short. We tested the new algorithm experimentally in the Weka system, comparing it to KNN, WAKNN, KNNDW, KNNDAW, and DKNN. The experimental results show that DKNDAW significantly outperforms the other algorithms in terms of classification accuracy. We also investigate how to learn a DKNDAW model with accurate ranking from data; more precisely, which attribute weighting method enables DKNDAW to produce accurate rankings. We explore three methods: the gain ratio method, the correlation-based feature selection method, and the decision tree-based method, and conclude that the gain ratio method is the most suitable for our improved KNN algorithm DKNDAW.
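To illustrate the two weighting ideas the abstract combines, the following is a minimal sketch of distance- and attribute-weighted KNN voting. It is not the paper's DKNDAW implementation (which also selects the neighbourhood size dynamically); the function name, data layout, and the inverse-distance vote are illustrative assumptions.

```python
import math
from collections import defaultdict

def weighted_knn_predict(train, labels, query, attr_weights, k=3):
    """Sketch of distance- and attribute-weighted KNN.

    attr_weights scales each attribute's contribution to the distance
    (attribute weighting); each neighbour's vote is scaled by the
    inverse of its distance to the query (distance weighting).
    """
    # attribute-weighted Euclidean distance
    def dist(a, b):
        return math.sqrt(sum(w * (x - y) ** 2
                             for w, x, y in zip(attr_weights, a, b)))

    # the k training instances closest to the query
    neighbours = sorted(zip(train, labels),
                        key=lambda t: dist(t[0], query))[:k]

    # distance-weighted voting: closer neighbours count more
    votes = defaultdict(float)
    for x, y in neighbours:
        votes[y] += 1.0 / (dist(x, query) + 1e-9)
    return max(votes, key=votes.get)

# Toy usage: two well-separated clusters, uniform attribute weights.
train = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
labels = ['a', 'a', 'a', 'b', 'b', 'b']
print(weighted_knn_predict(train, labels, (0.5, 0.5), (1.0, 1.0)))  # 'a'
```

Replacing the uniform `attr_weights` with weights learned by, e.g., the gain ratio of each attribute is the kind of attribute weighting the abstract compares.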