Instance-Based Learning Algorithms. Machine Learning.
Elements of information theory.
Tolerating noisy, irrelevant and novel attributes in instance-based learning algorithms. International Journal of Man-Machine Studies, special issue: symbolic problem solving in noisy and novel task environments.
C4.5: programs for machine learning.
Rough Sets: Theoretical Aspects of Reasoning about Data.
Machine Learning.
A Technique of Dynamic Feature Selection Using the Feature Group Mutual Information. PAKDD '99: Proceedings of the Third Pacific-Asia Conference on Methodologies for Knowledge Discovery and Data Mining.
The k-nearest neighbor (k-NN) classification is a simple and effective classification approach. However, it suffers from an over-sensitivity problem caused by irrelevant and noisy features. In this paper, we propose an algorithm that improves the effectiveness of k-NN by combining feature selection and feature weighting. Specifically, we first select all relevant features, and then assign a weight to each of them. Experimental results show that our algorithm achieves the highest accuracy, or close to the highest accuracy, on all test datasets. It also achieves higher generalization accuracy than the well-known algorithms IB1-4 and C4.5.
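The abstract does not give implementation details, but the select-then-weight scheme it describes can be sketched as a k-NN classifier that uses a weighted distance, where an irrelevant feature receives weight 0 (selection) and a relevant one a positive weight. The function and toy data below are illustrative assumptions, not the authors' actual algorithm or weighting method:

```python
from collections import Counter
import math

def weighted_knn_predict(X_train, y_train, x, weights, k=3):
    """Classify x by majority vote among the k nearest training points
    under a feature-weighted Euclidean distance (a sketch of the
    select-then-weight idea; the weight values are assumed given)."""
    def dist(a, b):
        # A zero weight removes a feature from the distance entirely,
        # which is equivalent to feature selection.
        return math.sqrt(sum(w * (ai - bi) ** 2
                             for w, ai, bi in zip(weights, a, b)))
    # Indices of the k training points closest to x.
    neighbors = sorted(range(len(X_train)),
                       key=lambda i: dist(X_train[i], x))[:k]
    votes = Counter(y_train[i] for i in neighbors)
    return votes.most_common(1)[0][0]

# Toy example: feature 0 determines the class, feature 1 is pure noise.
X_train = [(0.0, 5.0), (0.1, -3.0), (1.0, 4.0), (0.9, -2.0)]
y_train = ["a", "a", "b", "b"]
weights = [1.0, 0.0]  # hypothetical weights: feature 1 judged irrelevant

print(weighted_knn_predict(X_train, y_train, (0.95, 5.0), weights))  # → b
```

With the noisy second feature zeroed out, the query point is classified by its position on the informative feature alone; with uniform weights, the noise dimension could dominate the distance and flip the vote, which is exactly the over-sensitivity problem the paper targets.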