This paper proposes two wrapper-based feature selection approaches: single feature ranking and binary particle swarm optimisation (BPSO) based feature subset ranking. In the first approach, individual features are ranked by classification accuracy, so that feature selection can be accomplished by using only a few top-ranked features for classification. In the second approach, BPSO is applied to feature subset ranking to search over different feature subsets. K-nearest neighbour (KNN) with n-fold cross-validation is employed to evaluate classification accuracy on eight datasets. Experimental results show that using a relatively small number of the top-ranked features from the first approach, or one of the top-ranked feature subsets from the second approach, achieves better classification performance than using all features. BPSO can efficiently search for subsets of complementary features, avoiding redundancy and noise. Compared with linear forward selection (LFS) and greedy stepwise backward selection (GSBS), the two proposed approaches achieve better performance in terms of classification accuracy and the number of features in almost all cases. The BPSO-based approach outperforms the single feature ranking approach on all datasets.
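The first approach can be sketched roughly as follows: score each feature in isolation by the cross-validated accuracy of a KNN classifier that sees only that feature, then rank features by score. This is a minimal illustration, not the paper's implementation; the dataset, function names, and the use of leave-one-out validation (the n-fold case where n equals the number of samples) are all assumptions for the example.

```python
# Sketch of wrapper-based single feature ranking: each feature is
# evaluated alone with KNN under leave-one-out cross-validation, and
# features are ranked by the resulting accuracy. Toy data and helper
# names are illustrative, not taken from the paper.

def knn_loo_accuracy(values, labels, k=1):
    """Leave-one-out KNN accuracy using a single 1-D feature."""
    correct = 0
    n = len(values)
    for i in range(n):
        # Hold out sample i; find its k nearest neighbours among the rest.
        neighbours = sorted(
            (abs(values[j] - values[i]), labels[j])
            for j in range(n) if j != i
        )[:k]
        votes = [label for _, label in neighbours]
        prediction = max(set(votes), key=votes.count)  # majority vote
        if prediction == labels[i]:
            correct += 1
    return correct / n

def rank_features(X, y, k=1):
    """Return (accuracy, feature_index) pairs, best feature first."""
    scores = [
        (knn_loo_accuracy([row[f] for row in X], y, k), f)
        for f in range(len(X[0]))
    ]
    return sorted(scores, reverse=True)

# Toy data: feature 0 separates the two classes, feature 1 is noise.
X = [[0.1, 5.0], [0.2, 1.0], [0.3, 4.0],
     [2.1, 2.0], [2.2, 5.0], [2.3, 1.0]]
y = [0, 0, 0, 1, 1, 1]

for accuracy, feature in rank_features(X, y):
    print(f"feature {feature}: accuracy {accuracy:.2f}")
```

Feature selection then amounts to keeping only the top few entries of the ranking, which mirrors the paper's finding that a small number of top-ranked features can outperform the full feature set.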