Floating search methods in feature selection. Pattern Recognition Letters.
Feature Selection: Evaluation, Application, and Small Sample Performance. IEEE Transactions on Pattern Analysis and Machine Intelligence.
A Comparative Analysis of Methods for Pruning Decision Trees. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Wrappers for feature subset selection. Artificial Intelligence, special issue on relevance.
Statistical Pattern Recognition: A Review. IEEE Transactions on Pattern Analysis and Machine Intelligence.
New Fast Algorithms for Error Rate-Based Stepwise Variable Selection in Discriminant Analysis. SIAM Journal on Scientific Computing.
Input Feature Selection by Mutual Information Based on Parzen Window. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Theoretical and Empirical Analysis of ReliefF and RReliefF. Machine Learning.
An introduction to variable and feature selection. The Journal of Machine Learning Research.
Fast Branch & Bound Algorithms for Optimal Feature Selection. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Margin based feature selection: theory and algorithms. ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning.
Feature Selection for Unsupervised Learning. The Journal of Machine Learning Research.
Data Mining: Practical Machine Learning Tools and Techniques, Second Edition. Morgan Kaufmann Series in Data Management Systems.
Analysis of new variable selection methods for discriminant analysis. Computational Statistics & Data Analysis.
Spectral feature selection for supervised and unsupervised learning. Proceedings of the 24th International Conference on Machine Learning.
Estimating optimal feature subsets using efficient estimation of high-dimensional mutual information. IEEE Transactions on Neural Networks.
Efficient and robust feature extraction by maximum margin criterion. IEEE Transactions on Neural Networks.
MCES: A Novel Monte Carlo Evaluative Selection Approach for Objective Feature Selections. IEEE Transactions on Neural Networks.
Exploring the boundary region of tolerance rough sets for feature selection. Pattern Recognition.
Feature selection with dynamic mutual information. Pattern Recognition.
An improvement on floating search algorithms for feature subset selection. Pattern Recognition.
Feature selection for Bayesian network classifiers using the MDL-FS score. International Journal of Approximate Reasoning.
A new dataset evaluation method based on category overlap. Computers in Biology and Medicine.
Derivation of an artificial gene to improve classification accuracy upon gene selection. Computational Biology and Chemistry.
Pixel selection based on discriminant features with application to face recognition. Pattern Recognition Letters.
Feature selection for MAUC-oriented classification systems. Neurocomputing.
Efficient feature selection filters for high-dimensional data. Pattern Recognition Letters.
A novel divide-and-merge classification for high dimensional datasets. Computational Biology and Chemistry.
RFS: Efficient feature selection method based on R-value. Computers in Biology and Medicine.
Generalized dual Hahn moment invariants. Pattern Recognition.
The goal of feature selection is to find the optimal subset of m features chosen from a total of n features. A critical problem for many feature selection methods is that an exhaustive search strategy must be applied to find the best subset among all C(n, m) possible feature subsets, which usually incurs a prohibitively high computational cost. Suboptimal feature selection methods offer more practical alternatives in terms of computational complexity, but they cannot guarantee that the finally selected feature subset is globally optimal. We propose a new feature selection algorithm based on a distance discriminant (FSDD), which not only avoids the high computational cost but also overcomes the drawbacks of the suboptimal methods. The proposed method is able to find the optimal feature subset without an exhaustive search or a Branch and Bound algorithm. The most difficult part of optimal feature selection, the search problem, is converted into a feature ranking problem via a rigorous theoretical proof, so that the computational complexity is greatly reduced. The proposed method is also invariant to linear transformations of the data when a diagonal transformation matrix is applied. FSDD was compared with ReliefF and the mutual-information-based mrmrMID on 8 data sets. The experimental results show that FSDD outperforms the other two methods and is highly efficient.
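The key idea above — replacing a search over C(n, m) subsets with a per-feature ranking — can be sketched as follows. The abstract does not give the exact FSDD criterion, so this illustration uses a generic Fisher-style distance-discriminant score (between-class distance over within-class scatter, per feature); the function names and the score formula are assumptions for illustration, not the authors' definition.

```python
import numpy as np

def distance_discriminant_scores(X, y):
    """Score each feature by between-class distance over within-class scatter.

    Illustrative Fisher-style criterion only; the exact FSDD formula is not
    stated in the abstract. Higher scores mean better class separation.
    """
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        prior = len(Xc) / len(X)
        between += prior * (Xc.mean(axis=0) - overall_mean) ** 2
        within += prior * Xc.var(axis=0)
    return between / (within + 1e-12)  # small epsilon avoids division by zero

def select_top_m(X, y, m):
    # Ranking replaces subset search: sorting n scores costs O(n log n),
    # versus enumerating C(n, m) candidate subsets in an exhaustive search.
    scores = distance_discriminant_scores(X, y)
    return np.argsort(scores)[::-1][:m]
```

Because the score is computed independently per feature and is a ratio of per-feature distances, rescaling any single feature (a diagonal transformation) leaves the ranking unchanged, mirroring the invariance property claimed in the abstract.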