In this paper, a novel feature selection method based on rough sets and mutual information is proposed. Selection is guided by the rough-set dependency of each feature, and mutual information is used to discard features that do not add significantly to that dependency, so the method reaches maximum dependency with a small number of features. Because the method evaluates both definite relevance and uncertain relevance through a combined criterion of dependency and a class-based distance metric, the selected feature subset is more relevant than those found by other rough-set-based methods and is close to the optimal solution. To verify this contribution, the method is evaluated on eight classification applications. It is also applied to a real Alzheimer's disease dataset, where the selected subset achieves a classification accuracy of 81.3%. These results confirm the effectiveness of the method.
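The abstract outlines a greedy scheme: rank candidate features by rough-set dependency, while using mutual information with the class to screen out features that would add little. A minimal sketch of that idea is given below, assuming discrete-valued features; the threshold `mi_threshold` and all function names are illustrative choices, not the authors' actual algorithm (which also uses a class-based distance metric not reproduced here).

```python
import numpy as np
from collections import defaultdict

def dependency(X, y, features):
    """Rough-set dependency gamma_B(D): fraction of objects whose
    equivalence class under feature subset B is pure in the decision y."""
    if not features:
        return 0.0
    decisions = defaultdict(set)   # equivalence class -> decision values seen
    counts = defaultdict(int)      # equivalence class -> number of objects
    for row, d in zip(X[:, features], y):
        key = tuple(row)
        decisions[key].add(d)
        counts[key] += 1
    # positive region: objects in classes with exactly one decision value
    pos = sum(c for k, c in counts.items() if len(decisions[k]) == 1)
    return pos / len(y)

def mutual_info(x, y):
    """Mutual information I(x; y) for discrete variables, in nats."""
    n = len(x)
    joint, px, py = defaultdict(int), defaultdict(int), defaultdict(int)
    for a, b in zip(x, y):
        joint[(a, b)] += 1
        px[a] += 1
        py[b] += 1
    # sum p(a,b) log[ p(a,b) / (p(a) p(b)) ] over observed pairs
    return sum((c / n) * np.log(c * n / (px[a] * py[b]))
               for (a, b), c in joint.items())

def select_features(X, y, mi_threshold=0.01):
    """Greedy forward selection: keep adding the feature with the largest
    dependency gain, after first discarding features whose mutual
    information with the class falls below mi_threshold (hypothetical)."""
    remaining = [f for f in range(X.shape[1])
                 if mutual_info(X[:, f], y) >= mi_threshold]
    selected = []
    while remaining:
        base = dependency(X, y, selected)
        best_gamma, best = max(
            (dependency(X, y, selected + [f]), f) for f in remaining)
        if best_gamma <= base:   # no feature increases dependency further
            break
        selected.append(best)
        remaining.remove(best)
    return selected
```

On a toy dataset where feature 0 determines the class and feature 1 is noise, the sketch keeps only feature 0: the noise feature is filtered by the mutual-information screen, and the greedy loop stops once dependency reaches 1.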