Feature selection (FS) has two main objectives: minimising the number of features and maximising the classification performance. Based on binary particle swarm optimisation (BPSO), we develop a multi-objective FS framework for classification, NSBPSO, which extends BPSO with the idea of non-dominated sorting. Two multi-objective FS algorithms are then developed by applying mutual information and entropy as two different filter evaluation criteria within the proposed framework. The two proposed multi-objective algorithms are examined and compared with two single-objective FS methods on six benchmark datasets, with a decision tree employed to evaluate classification accuracy. Experimental results show that the proposed multi-objective algorithms can automatically evolve a set of non-dominated solutions that reduce the number of features and improve classification performance. Regardless of the evaluation criterion, NSBPSO achieves higher classification performance than the single-objective algorithms, and NSBPSO with entropy achieves better results than all other methods. This work represents the first study of multi-objective BPSO for filter FS in classification problems.
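The core of the non-dominated sorting idea can be illustrated with a minimal sketch (not the authors' implementation): given candidate feature subsets scored on the paper's two objectives, minimising the feature count and minimising the classification error, keep exactly those solutions that no other solution dominates. The swarm values below are hypothetical.

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(solutions):
    """Return the solutions that no other solution in the list dominates (the Pareto front)."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

# Hypothetical swarm of candidate subsets, each scored as (number of features, error rate)
swarm = [(5, 0.10), (3, 0.12), (8, 0.09), (3, 0.15), (5, 0.14)]
print(non_dominated(swarm))  # → [(5, 0.1), (3, 0.12), (8, 0.09)]
```

Here (3, 0.15) and (5, 0.14) are discarded because other subsets match their size with lower error; the surviving set is the trade-off front that NSBPSO evolves, rather than the single best solution a single-objective method returns.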