The feature selection process constitutes a commonly encountered problem of global combinatorial optimization. It reduces the number of features by removing irrelevant, noisy, and redundant data while maintaining acceptable classification accuracy. Feature selection is a preprocessing technique of great importance in data analysis, information retrieval, pattern classification, and data mining applications. This paper presents a novel optimization algorithm called catfish binary particle swarm optimization (CatfishBPSO), in which the so-called catfish effect is applied to improve the performance of binary particle swarm optimization (BPSO). This effect results from the introduction of new particles ("catfish particles") into the search space: when the fitness of the global best particle has not improved for a number of consecutive iterations, the particles with the worst fitness are replaced by catfish particles initialized at extreme points of the search space. In this study, the K-nearest neighbor (K-NN) method with leave-one-out cross-validation (LOOCV) was used to evaluate the quality of the solutions. CatfishBPSO was applied to 10 classification problems taken from the literature. Experimental results show that CatfishBPSO simplifies the feature selection process effectively and either obtains higher classification accuracy or uses fewer features than other feature selection methods.
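The catfish-effect step described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name `catfish_effect` and the parameters `patience` (stagnation threshold) and `frac` (fraction of the swarm replaced) are assumptions introduced here for clarity. Each particle position is a binary feature mask, and higher fitness is assumed to be better.

```python
def catfish_effect(positions, fitnesses, stagnation, patience=3, frac=0.1):
    """Apply one catfish-effect step to a binary swarm (hedged sketch).

    If the global best has not improved for `patience` consecutive
    iterations (`stagnation` counts those iterations), replace the
    `frac` worst-fitness particles with "catfish" particles initialized
    at the extreme points of the binary search space: the all-ones and
    all-zeros feature masks.  `patience` and `frac` are illustrative
    parameter names, not taken from the paper.
    """
    if stagnation < patience:
        return positions  # no stagnation: swarm is left unchanged

    n = len(positions)
    k = max(1, int(frac * n))  # replace at least one particle
    # Indices of the k worst particles (lowest fitness = worst here).
    worst = sorted(range(n), key=lambda i: fitnesses[i])[:k]
    dim = len(positions[0])
    for j, i in enumerate(worst):
        # Alternate between the two extreme points of the search space.
        positions[i] = [1] * dim if j % 2 == 0 else [0] * dim
    return positions
```

In a full CatfishBPSO loop this step would run after the standard BPSO velocity/position update, with `stagnation` reset to zero whenever the global best fitness improves.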