Measures of relevance between features play an important role in classification and regression analysis. Mutual information has proven to be an effective measure for decision tree construction and feature selection. However, computing the relevance between numerical features with mutual information is limited by the difficulty of estimating probability density functions in high-dimensional spaces. In this work, we generalize Shannon's information entropy to neighborhood information entropy and propose a measure of neighborhood mutual information. We show that the new measure is a natural extension of classical mutual information and reduces to the classical measure when features are discrete; thus it can also be used to compute the relevance between discrete variables. In addition, the new measure introduces a parameter delta to control the granularity at which the data are analyzed. Numerical experiments show that neighborhood mutual information produces nearly the same outputs as mutual information; unlike mutual information, however, it requires no discretization when computing relevance. We combine the proposed measure with four classes of evaluation strategies used for feature selection. Finally, the proposed algorithms are tested on several benchmark data sets. The results show that algorithms based on neighborhood mutual information yield better performance than some classical ones.
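The idea above can be sketched in code. The following is a minimal Python sketch, not the paper's reference implementation: it assumes neighborhoods are defined by the Chebyshev (max) norm with radius delta, that neighborhood entropy is the average negative log of the relative neighborhood size, and that neighborhood mutual information follows the classical identity NMI(X;Y) = NH(X) + NH(Y) - NH(X,Y). All function names are illustrative.

```python
import numpy as np

def neighborhood_counts(X, delta):
    """|delta(x_i)|: number of samples within Chebyshev distance
    delta of each sample x_i (every sample counts itself)."""
    # Pairwise Chebyshev distances via broadcasting: shape (n, n)
    d = np.max(np.abs(X[:, None, :] - X[None, :, :]), axis=2)
    return np.sum(d <= delta, axis=1)

def neighborhood_entropy(X, delta):
    """NH_delta(X) = -(1/n) * sum_i log2(|delta(x_i)| / n)."""
    n = len(X)
    return -np.mean(np.log2(neighborhood_counts(X, delta) / n))

def neighborhood_mutual_information(X, Y, delta):
    """NMI_delta(X; Y) = NH(X) + NH(Y) - NH(X, Y), with the joint
    neighborhood taken over the concatenated feature space."""
    XY = np.hstack([X, Y])
    return (neighborhood_entropy(X, delta)
            + neighborhood_entropy(Y, delta)
            - neighborhood_entropy(XY, delta))

# With delta = 0 and discrete-valued features, the neighborhood of a
# sample is exactly the set of samples sharing its value, so the
# measure collapses to classical Shannon entropy / mutual information.
X = np.array([[0.0], [0.0], [1.0], [1.0]])
print(neighborhood_entropy(X, 0.0))  # 1.0 bit, as for a fair coin
```

No discretization step appears anywhere: delta plays the role that bin width plays in histogram-based estimators, which matches the granularity-control role of delta described in the abstract.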