Feature selection in incomplete decision tables has gained considerable attention recently. However, many existing feature selection methods are designed mainly for incomplete data with categorical features. In this paper, we introduce an extended rough set model based on the neighborhood-tolerance relation, which is applicable to incomplete data with mixed categorical and numerical features. From this model we derive neighborhood-tolerance conditional entropy, an uncertainty measure that can be used to evaluate feature subsets. Dependency is a well-known feature evaluation measure in rough set theory; a comparison of the two measures in terms of classification complexity indicates that neighborhood-tolerance conditional entropy is a more effective feature evaluation criterion than dependency for incomplete decision tables. We then construct a heuristic feature selection algorithm based on neighborhood-tolerance conditional entropy. Experimental results show that the proposed method is applicable and effective for incomplete mixed data.
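The idea of a neighborhood-tolerance relation over mixed incomplete data can be sketched in code. The following is a minimal, illustrative Python sketch, not the paper's exact formulation: the neighborhood threshold `delta`, the use of `None` as the missing-value marker, and the specific entropy formula (average Shannon entropy of decision labels within each neighborhood-tolerance class) are all assumptions made for illustration.

```python
import math

def tolerant(x, y, features, delta=0.2):
    """x and y tolerate each other on `features` if every feature either
    is missing in one of them, matches exactly (categorical), or lies
    within `delta` (numerical). These rules are an illustrative assumption."""
    for f in features:
        a, b = x[f], y[f]
        if a is None or b is None:      # missing values tolerate anything
            continue
        if isinstance(a, str):          # categorical: exact match required
            if a != b:
                return False
        elif abs(a - b) > delta:        # numerical: neighborhood match
            return False
    return True

def nt_class(i, data, features, delta=0.2):
    """Indices of objects neighborhood-tolerant with object i."""
    return [j for j in range(len(data))
            if tolerant(data[i], data[j], features, delta)]

def nt_conditional_entropy(data, labels, features, delta=0.2):
    """One plausible conditional entropy H(D | B): the average, over all
    objects, of the Shannon entropy of decision labels inside each
    object's neighborhood-tolerance class. Lower values suggest the
    feature subset B separates the decision classes better."""
    n = len(data)
    total = 0.0
    for i in range(n):
        cls = nt_class(i, data, features, delta)
        m = len(cls)
        counts = {}
        for j in cls:
            counts[labels[j]] = counts.get(labels[j], 0) + 1
        total += -sum((c / m) * math.log2(c / m) for c in counts.values())
    return total / n

# Toy incomplete mixed data: one categorical and one numerical feature,
# with None marking a missing value.
data = [
    {"color": "red",  "size": 0.10},
    {"color": "red",  "size": 0.15},
    {"color": "blue", "size": None},
    {"color": "blue", "size": 0.90},
]
labels = [0, 0, 1, 1]
print(nt_conditional_entropy(data, labels, ["color", "size"]))
```

A heuristic forward-selection algorithm of the kind the abstract describes would greedily add, at each step, the candidate feature that most reduces this conditional entropy, stopping when no candidate yields further reduction.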