Elements of information theory
Floating search methods in feature selection
Pattern Recognition Letters
Feature Selection: Evaluation, Application, and Small Sample Performance
IEEE Transactions on Pattern Analysis and Machine Intelligence
Wrappers for feature subset selection
Artificial Intelligence - Special issue on relevance
On the approximability of minimizing nonzero variables or unsatisfied relations in linear systems
Theoretical Computer Science
Axiomatic Approach to Feature Subset Selection Based on Relevance
IEEE Transactions on Pattern Analysis and Machine Intelligence
Adaptive floating search methods in feature selection
Pattern Recognition Letters - Special issue on pattern recognition in practice VI
Feature Selection for Knowledge Discovery and Data Mining
Input Feature Selection by Mutual Information Based on Parzen Window
IEEE Transactions on Pattern Analysis and Machine Intelligence
Feature Subset Selection Using a Genetic Algorithm
IEEE Intelligent Systems
Feature selection with neural networks
Pattern Recognition Letters
Rough set methods in feature selection and recognition
Pattern Recognition Letters - Special issue: Rough sets, pattern recognition and data mining
A fuzzy neural network for pattern classification and feature selection
Fuzzy Sets and Systems
Feature selection based on a modified fuzzy C-means algorithm with supervision
Information Sciences—Informatics and Computer Science: An International Journal
An introduction to variable and feature selection
The Journal of Machine Learning Research
Feature extraction by non-parametric mutual information maximization
The Journal of Machine Learning Research
A Compact and Accurate Model for Classification
IEEE Transactions on Knowledge and Data Engineering
Consistency-based search in feature selection
Artificial Intelligence
Fast Branch & Bound Algorithms for Optimal Feature Selection
IEEE Transactions on Pattern Analysis and Machine Intelligence
Large Scale Feature Selection Using Modified Random Mutation Hill Climbing
Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), Volume 2
Hybrid Genetic Algorithms for Feature Selection
IEEE Transactions on Pattern Analysis and Machine Intelligence
On fuzzy-rough sets approach to feature selection
Pattern Recognition Letters
A Mathematical Theory of Communication
A Branch and Bound Algorithm for Feature Subset Selection
IEEE Transactions on Computers
Effective feature selection scheme using mutual information
Neurocomputing
The coefficient of intrinsic dependence (feature selection using el CID)
Pattern Recognition
Probability of error, equivocation, and the Chernoff bound
IEEE Transactions on Information Theory
Input feature selection for classification problems
IEEE Transactions on Neural Networks
A neuro-fuzzy scheme for simultaneous feature selection and fuzzy rule-based classification
IEEE Transactions on Neural Networks
Using mutual information for selecting features in supervised neural net learning
IEEE Transactions on Neural Networks
Expert Systems with Applications: An International Journal
Feature subset selection with cumulate conditional mutual information minimization
Expert Systems with Applications: An International Journal
Analysis of feature weighting methods based on feature ranking methods for classification
Proceedings of the 18th International Conference on Neural Information Processing (ICONIP '11), Part II
Divergence-based feature selection for separate classes
Neurocomputing
Proceedings of the Thirty-Fourth Australasian Computer Science Conference (ACSC '11), Volume 113
A parameterless feature ranking approach is presented for feature selection in pattern classification. In contrast to Battiti's mutual information feature selection (MIFS) and Kwak and Choi's MIFS-U methods, the proposed method derives an estimate of the conditional mutual information between a candidate feature f_i and the output class C given the subset S of already-selected features, i.e. I(C; f_i | S), without any preset parameter such as the β used in MIFS and MIFS-U. This completely avoids the intractable problem of choosing an appropriate value of β to trade off relevance to the output classes against redundancy with the already-selected features. Furthermore, a modified greedy feature selection algorithm, the second-order MI feature selection approach (SOMIFS), is proposed. Experimental results on both synthetic and benchmark data sets demonstrate the superiority of SOMIFS.
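For context on the β tradeoff the abstract criticizes, the baseline MIFS criterion can be sketched as a greedy search that scores each candidate by relevance minus β-weighted redundancy. This is a minimal illustration assuming discrete features and empirical mutual-information estimates; the names `mifs` and `mutual_information` are illustrative, not from the paper, and the paper's SOMIFS replaces the β-weighted redundancy sum with its estimate of I(C; f_i | S).

```python
import numpy as np

def mutual_information(x, y):
    """Empirical mutual information (in nats) between two discrete arrays."""
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))  # joint probability estimate
            px = np.mean(x == xv)
            py = np.mean(y == yv)
            if pxy > 0:
                mi += pxy * np.log(pxy / (px * py))
    return mi

def mifs(features, labels, k, beta=0.5):
    """Greedy MIFS-style selection: pick the feature maximizing
    I(C; f) - beta * sum over selected s of I(f; s)."""
    n_features = features.shape[1]
    selected, remaining = [], list(range(n_features))
    # Relevance I(C; f_j) is fixed, so compute it once per feature.
    relevance = [mutual_information(features[:, j], labels)
                 for j in range(n_features)]
    while len(selected) < k and remaining:
        def score(j):
            redundancy = sum(mutual_information(features[:, j], features[:, s])
                             for s in selected)
            return relevance[j] - beta * redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

On a toy data set where feature 1 duplicates feature 0, the redundancy term keeps the duplicate from being scored as highly as it would be by relevance alone, which is the behavior β controls; the difficulty of presetting β for real data is exactly what motivates the parameterless estimate above.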