Estimating mutual information for feature selection in the presence of label noise
Computational Statistics & Data Analysis
The classification of functional or high-dimensional data requires selecting a reduced subset of features from the initial set, both to mitigate the curse of dimensionality and to aid interpretation of the problem and the model. The mutual information criterion may be used in that context, but it suffers from the difficulty of its estimation from a finite set of samples. Efficient estimators are not designed specifically for use in a classification context, and thus suffer from further drawbacks and difficulties. This paper presents an estimator of mutual information that is specifically designed for classification tasks, including multi-class ones. It is combined with a recently published stopping criterion in a traditional forward feature selection procedure. Experiments on both traditional benchmarks and an industrial functional classification problem show the added value of this estimator.
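To illustrate the general procedure the abstract refers to, the sketch below implements a simple greedy forward feature selection driven by mutual information between each feature and the class labels. This is not the paper's estimator: it uses a basic histogram (plug-in) MI estimate, whose finite-sample bias is exactly the kind of drawback the paper addresses, and a fixed feature budget `k` stands in for the published stopping criterion, which is not reproduced here. All function names are illustrative.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram (plug-in) estimate of I(X; Y) in nats, for a
    continuous feature x and discrete class labels y.
    Biased on small samples; shown only as a baseline estimator."""
    # Discretize the feature into equal-width bins.
    edges = np.histogram_bin_edges(x, bins=bins)
    x_binned = np.digitize(x, edges[1:-1])
    mi = 0.0
    for xv in np.unique(x_binned):
        px = np.mean(x_binned == xv)
        for c in np.unique(y):
            py = np.mean(y == c)
            pxy = np.mean((x_binned == xv) & (y == c))
            if pxy > 0.0:
                mi += pxy * np.log(pxy / (px * py))
    return mi

def forward_select(X, y, k):
    """Greedy forward selection: at each step, add the remaining
    feature with the largest individual MI with the labels.
    (With this univariate criterion the procedure reduces to a
    ranking; the paper's multivariate setting is more involved.)"""
    remaining = list(range(X.shape[1]))
    selected = []
    while remaining and len(selected) < k:
        best = max(remaining,
                   key=lambda j: mutual_information(X[:, j], y))
        selected.append(best)
        remaining.remove(best)
    return selected
```

For example, on synthetic data where only the first feature carries class information, the procedure should pick that feature first:

```python
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 400)
X = rng.normal(size=(400, 5))
X[:, 0] += 3.0 * y        # make feature 0 strongly informative
print(forward_select(X, y, 2))
```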