Despite its simplicity and good classification performance, the Nearest Neighbor (NN) rule is seldom applied in practical tasks because of its high computational cost. Moreover, when trained on imbalanced samples, its classification accuracy can degrade seriously. In this paper we propose two genetic algorithms that address both issues: they reduce the size of the training sample while simultaneously improving its class balance. Experimental results demonstrating the benefits of our proposals are also reported.
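To make the idea concrete, the following is a minimal sketch of a genetic algorithm for instance selection of this kind: each chromosome is a bit mask over the training sample, and the fitness rewards 1-NN accuracy, class balance of the retained subset, and size reduction. The weights, operators, and data here are illustrative assumptions, not the authors' exact method.

```python
import random

random.seed(0)

# Tiny synthetic, imbalanced two-class training sample: (features, label).
data = [((random.gauss(0, 1), random.gauss(0, 1)), 0) for _ in range(40)] + \
       [((random.gauss(3, 1), random.gauss(3, 1)), 1) for _ in range(10)]

def nn_predict(subset, x):
    # 1-NN over the selected instances (squared Euclidean distance).
    _, label = min(subset,
                   key=lambda p: (p[0][0] - x[0]) ** 2 + (p[0][1] - x[1]) ** 2)
    return label

def fitness(mask):
    subset = [p for p, keep in zip(data, mask) if keep]
    if not subset:
        return 0.0
    # Accuracy of 1-NN (using the selected subset) on the full sample ...
    acc = sum(nn_predict(subset, x) == y for x, y in data) / len(data)
    # ... combined with class balance and size reduction of the subset.
    # The 0.6/0.2/0.2 weights are an illustrative choice.
    n0 = sum(1 for _, y in subset if y == 0)
    n1 = len(subset) - n0
    balance = min(n0, n1) / max(n0, n1) if min(n0, n1) > 0 else 0.0
    reduction = 1 - len(subset) / len(data)
    return 0.6 * acc + 0.2 * balance + 0.2 * reduction

def evolve(pop_size=20, generations=30, p_mut=0.05):
    # Chromosome = boolean mask over the training instances.
    pop = [[random.random() < 0.5 for _ in data] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(data))             # one-point crossover
            child = [g ^ (random.random() < p_mut)           # bit-flip mutation
                     for g in a[:cut] + b[cut:]]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(sum(best), "of", len(data), "instances kept")
```

Because the retained parents act as an elite, the best mask never worsens across generations; the balance term steers the search toward subsets that keep proportionally more minority-class instances.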