Combining the mutual information criterion with a forward feature selection strategy offers a good trade-off between the optimality of the selected feature subset and computation time. However, it requires setting the parameter(s) of the mutual information estimator and deciding when to halt the forward procedure. Both choices are difficult to make because, as the dimensionality of the subset increases, the estimation of the mutual information becomes less and less reliable. This paper proposes to use resampling methods, namely K-fold cross-validation and the permutation test, to address both issues. The resampling methods provide information about the variance of the estimator, which can then be used to set the estimator parameter automatically and to compute a threshold at which to stop the forward procedure. The approach is illustrated on a synthetic data set as well as on real-world examples.
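The idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it scores each candidate feature by univariate histogram-based mutual information with the target (whereas the paper estimates the MI of the whole growing subset, and tunes the estimator's parameters by K-fold cross-validation, both simplified away here), and it stops the forward search when the best candidate's MI no longer exceeds a permutation-test threshold. The function names (`mi_hist`, `forward_select`) and all parameter values are illustrative.

```python
import numpy as np

def mi_hist(x, y, bins=8):
    """Histogram estimate of the mutual information I(x; y), in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()                 # joint probabilities
    px = p.sum(axis=1, keepdims=True)       # marginal of x
    py = p.sum(axis=0, keepdims=True)       # marginal of y
    nz = p > 0                              # avoid log(0) cells
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

def forward_select(X, y, n_perm=100, alpha=0.05, bins=8, rng=None):
    """Forward feature selection with a permutation-test stopping rule.

    At each step, pick the candidate with the highest estimated MI with
    the target; stop when that MI does not exceed the (1 - alpha)
    quantile of MI values computed against permuted targets, i.e. when
    the apparent gain is indistinguishable from estimation noise.
    """
    rng = np.random.default_rng(rng)
    selected, remaining = [], list(range(X.shape[1]))
    while remaining:
        scores = [mi_hist(X[:, j], y, bins) for j in remaining]
        best = int(np.argmax(scores))
        # Null distribution: MI of the best candidate vs. shuffled targets.
        null = [mi_hist(X[:, remaining[best]], rng.permutation(y), bins)
                for _ in range(n_perm)]
        if scores[best] <= np.quantile(null, 1 - alpha):
            break  # not significantly better than chance: halt
        selected.append(remaining.pop(best))
    return selected
```

Because the histogram MI estimator is positively biased, comparing the raw score against a fixed cutoff would be unreliable; the permutation null absorbs that bias automatically, which is the appeal of a resampling-based threshold.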