Feature selection is a core issue in designing pattern recognition and machine learning systems, and it has attracted considerable attention in the literature. In this paper, a new feature subset selection algorithm based on conditional mutual information is proposed. It first guarantees finding a subset whose mutual information with the class equals that of the original feature set, and then eliminates potentially redundant features with minimal information loss using a cumulative conditional-mutual-information minimization criterion. From a reliability standpoint, this criterion also mitigates the disturbance that insufficient samples cause in conditional mutual information estimation. In addition, a fast implementation of conditional mutual information estimation is proposed to tackle the otherwise computationally intractable estimation problem. Empirical results verify that the algorithm is efficient and achieves better accuracy than several representative feature selection algorithms for three typical classifiers on various datasets.
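The idea of scoring features by conditional mutual information can be illustrated with a minimal sketch for discrete data. This is not the paper's exact criterion or its fast implementation; it uses plain plug-in (count-based) estimates of I(X; Y | Z) and a simple CMIM-style greedy selection, where a candidate's score is its minimum CMI with the class conditioned on any already-selected feature, so redundant candidates score near zero:

```python
from collections import Counter
from math import log

def cond_mutual_info(xs, ys, zs):
    """Plug-in estimate of I(X; Y | Z) in nats for discrete samples."""
    n = len(xs)
    cz = Counter(zs)
    cxz = Counter(zip(xs, zs))
    cyz = Counter(zip(ys, zs))
    cxyz = Counter(zip(xs, ys, zs))
    # I(X;Y|Z) = sum_{x,y,z} p(x,y,z) * log( p(x,y,z) p(z) / (p(x,z) p(y,z)) )
    return sum(
        (nxyz / n) * log(nxyz * cz[z] / (cxz[(x, z)] * cyz[(y, z)]))
        for (x, y, z), nxyz in cxyz.items()
    )

def select_features(features, labels, k):
    """Greedy CMIM-style selection over a dict of name -> discrete column.

    Each round picks the candidate maximizing min_s I(f; class | s) over the
    already-selected features s; conditioning on a constant gives plain I(f; class).
    """
    const = [0] * len(labels)          # conditioning on a constant = unconditional MI
    remaining = dict(features)
    selected = []
    while remaining and len(selected) < k:
        def score(name):
            conds = [features[s] for s in selected] or [const]
            return min(cond_mutual_info(remaining[name], labels, z) for z in conds)
        best = max(remaining, key=score)
        selected.append(best)
        del remaining[best]
    return selected
```

A feature identical to the class scores I = log 2 on balanced binary labels, while conditioning on itself drives its CMI to zero, which is the redundancy signal the greedy step exploits.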