A note on genetic algorithms for large-scale feature selection. Pattern Recognition Letters.
C4.5: Programs for Machine Learning.
Selection of relevant features and examples in machine learning. Artificial Intelligence (Special Issue on Relevance).
Wrappers for feature subset selection. Artificial Intelligence (Special Issue on Relevance).
The Random Subspace Method for Constructing Decision Forests. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Dimensionality Reduction in Unsupervised Learning of Conditional Gaussian Networks. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Unsupervised Feature Selection Using Feature Similarity. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Genetic Algorithms in Search, Optimization and Machine Learning.
When Is "Nearest Neighbor" Meaningful? ICDT '99, Proceedings of the 7th International Conference on Database Theory.
Dimensionality Reduction of Unsupervised Data. ICTAI '97, Proceedings of the 9th International Conference on Tools with Artificial Intelligence.
An introduction to variable and feature selection. The Journal of Machine Learning Research.
Toward Integrating Feature Selection Algorithms for Classification and Clustering. IEEE Transactions on Knowledge and Data Engineering.
Efficient Nearest Neighbor Classification Using a Cascade of Approximate Similarity Measures. CVPR '05, Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Volume 1.
Data Mining: Concepts and Techniques.
Neural vs. statistical classifier in conjunction with genetic algorithm based feature selection. Pattern Recognition Letters.
Boosting Nearest Neighbor Classifiers for Multiclass Recognition. CVPR '05 Workshops, Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Volume 3.
SVM-KNN: Discriminative Nearest Neighbor Classification for Visual Category Recognition. CVPR '06, Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Volume 2.
Feature Subset Selection and Ranking for Data Dimensionality Reduction. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Graph Embedding and Extensions: A General Framework for Dimensionality Reduction. IEEE Transactions on Pattern Analysis and Machine Intelligence.
IKNN: Informative K-Nearest Neighbor Pattern Classification. PKDD 2007, Proceedings of the 11th European Conference on Principles and Practice of Knowledge Discovery in Databases.
Dimensionality reduction using genetic algorithms. IEEE Transactions on Evolutionary Computation.
Genetic programming for simultaneous feature selection and classifier design. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
Fuzzy logic approaches to structure preserving dimensionality reduction. IEEE Transactions on Fuzzy Systems.
Neural-network feature selector. IEEE Transactions on Neural Networks.
Unsupervised feature evaluation: a neuro-fuzzy approach. IEEE Transactions on Neural Networks.
Estimating optimal feature subsets using efficient estimation of high-dimensional mutual information. IEEE Transactions on Neural Networks.
Selecting Useful Groups of Features in a Connectionist Framework. IEEE Transactions on Neural Networks.
This paper presents a novel scheme for selecting a subset of features from a dataset. A genetic algorithm (GA) is applied, starting from random small subsets of features; the GA stochastically explores better subsets by evolving combinations of different lengths and features over a number of generations. The classification accuracy achieved by different classifiers with a candidate subset serves as the GA's performance criterion (objective function). The proposed scheme is tested on several UCI datasets. The performance of KNN, informative KNN (local LI-KNN and global GI-KNN), and LI-KNN with boosting, using all features and using only the selected subsets, is compared with reported results. An extensive simulation study shows that the proposed scheme achieves reasonably good accuracy with a reduced subset of features on these datasets.
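The wrapper scheme described above — a GA evolving binary feature masks, scored by a classifier's accuracy — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the 1-NN classifier with leave-one-out accuracy, the population and mutation parameters, and the helper names (`one_nn_accuracy`, `ga_select`) are all illustrative assumptions.

```python
import random

def one_nn_accuracy(X, y, mask):
    """Leave-one-out accuracy of 1-NN using only features where mask[j] == 1.
    Stands in for the paper's classifier-based objective function."""
    feats = [j for j, m in enumerate(mask) if m]
    if not feats:
        return 0.0
    correct = 0
    for i in range(len(X)):
        best_d, best_y = None, None
        for k in range(len(X)):
            if k == i:
                continue
            d = sum((X[i][j] - X[k][j]) ** 2 for j in feats)
            if best_d is None or d < best_d:
                best_d, best_y = d, y[k]
        correct += (best_y == y[i])
    return correct / len(X)

def ga_select(X, y, pop_size=20, generations=30, p_mut=0.1, seed=0):
    """Evolve binary feature masks; fitness is classification accuracy.
    Parameter values are illustrative, not the paper's settings."""
    rng = random.Random(seed)
    n = len(X[0])
    # Start from random small subsets: each feature is on with low probability.
    pop = [[1 if rng.random() < 0.3 else 0 for _ in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fitter half (elitism), refill with crossover + mutation.
        scored = sorted(pop, key=lambda m: one_nn_accuracy(X, y, m), reverse=True)
        elite = scored[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)                 # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if rng.random() < p_mut else g for g in child]  # bit-flip mutation
            children.append(child)
        pop = elite + children
    best = max(pop, key=lambda m: one_nn_accuracy(X, y, m))
    return best, one_nn_accuracy(X, y, best)
```

On a toy dataset where only feature 0 carries class information and the rest are noise, the GA tends to converge on a mask that includes the informative feature:

```python
rng = random.Random(1)
X = [[i % 2 + 0.1 * rng.random()] + [rng.random() for _ in range(3)]
     for i in range(40)]
y = [i % 2 for i in range(40)]
mask, acc = ga_select(X, y)
```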