Floating search methods in feature selection
Pattern Recognition Letters
Combining Nearest Neighbor Classifiers Through Multiple Feature Subsets
ICML '98 Proceedings of the Fifteenth International Conference on Machine Learning
Feature Selection via Concave Minimization and Support Vector Machines
ICML '98 Proceedings of the Fifteenth International Conference on Machine Learning
Filters, Wrappers and a Boosting-Based Hybrid for Feature Selection
ICML '01 Proceedings of the Eighteenth International Conference on Machine Learning
Feature Selection for Support Vector Machines by Means of Genetic Algorithms
ICTAI '03 Proceedings of the 15th IEEE International Conference on Tools with Artificial Intelligence
Introduction to Evolutionary Computing
Introduction to Evolutionary Computing
Toward Integrating Feature Selection Algorithms for Classification and Clustering
IEEE Transactions on Knowledge and Data Engineering
Class Specific Fuzzy Decision Trees for Mining High Speed Data Streams
Fundamenta Informaticae
Effect of similar behaving attributes in mining of fuzzy association rules in the large databases
ICCSA'06 Proceedings of the 6th international conference on Computational Science and Its Applications - Volume Part I
Genetic programming for simultaneous feature selection and classifier design
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
An iterative pruning algorithm for feedforward neural networks
IEEE Transactions on Neural Networks
Harmony-based feature selection to improve the nearest neighbor classification
Proceedings of the Second International Conference on Computational Science, Engineering and Information Technology
ICCCI'12 Proceedings of the 4th international conference on Computational Collective Intelligence: technologies and applications - Volume Part II
This paper introduces a genetic algorithm (GA) with a novel fitness function to eliminate noisy and irrelevant features. The fitness function is based on the Area Under the receiver operating characteristic Curve (AUC). The aim of this feature selection is to improve the performance of the k-NN algorithm. Experimental results show that the proposed method substantially improves the classification performance of k-NN in comparison with other feature-selection approaches such as C4.5, SVM, and Relief. Furthermore, the method is able to eliminate noisy and irrelevant features from synthetic data sets.
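The approach described above — a GA whose chromosomes are binary feature masks, scored by the AUC of a k-NN classifier on the selected features — can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation: the population size, generation count, elitism scheme, and the leave-one-out k-NN scoring used here are illustrative assumptions, and all function names (`ga_select`, `knn_scores`, etc.) are hypothetical.

```python
import random

def knn_scores(X, y, k=3):
    # Leave-one-out k-NN: each point's score is the fraction of positive
    # labels among its k nearest neighbors (squared Euclidean distance).
    scores = []
    for i in range(len(X)):
        dists = sorted(
            (sum((a - b) ** 2 for a, b in zip(X[i], X[j])), y[j])
            for j in range(len(X)) if j != i
        )
        scores.append(sum(lbl for _, lbl in dists[:k]) / k)
    return scores

def auc(y, s):
    # Rank-based AUC (Mann-Whitney U): probability that a random positive
    # example is scored above a random negative one, ties counting 1/2.
    pos = [si for si, yi in zip(s, y) if yi == 1]
    neg = [si for si, yi in zip(s, y) if yi == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def fitness(mask, X, y):
    # AUC of leave-one-out k-NN restricted to the features where mask == 1.
    if not any(mask):
        return 0.0
    Xs = [[xi for xi, m in zip(row, mask) if m] for row in X]
    return auc(y, knn_scores(Xs, y))

def ga_select(X, y, pop_size=10, gens=15, seed=0):
    # GA over binary feature masks: keep the top half as elites, refill the
    # population with one-point crossover of elites plus bit-flip mutation.
    rng = random.Random(seed)
    n = len(X[0])
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=lambda m: fitness(m, X, y), reverse=True)
        elite = scored[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:  # mutation: flip one random bit
                i = rng.randrange(n)
                child[i] ^= 1
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda m: fitness(m, X, y))

# Toy data: feature 0 separates the classes, feature 1 is pure noise.
data_rng = random.Random(1)
X = [[i / 20 + data_rng.gauss(0, 0.05), data_rng.random()] for i in range(20)]
y = [1 if i >= 10 else 0 for i in range(20)]
mask = ga_select(X, y)
```

On this toy problem the GA should retain the informative feature (`mask[0] == 1`) and tends to drop the noise feature, mirroring the paper's claim that AUC-driven selection removes noisy and irrelevant attributes before running k-NN.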