Feature selection is an important problem in machine learning, and rough set theory is one of the principal approaches to it. Within rough set theory, feature selection has been studied separately from the algebra view and the information view. Unfortunately, previously proposed entropy-based feature selection methods handle only discrete datasets, and effectively discretizing continuous datasets remains challenging, since discretization may discard useful information. To overcome this limitation, this paper introduces a novel algorithm for feature selection based on conditional entropy with a clustering strategy (ACECFS). In ACECFS, the data projected onto each feature are first partitioned into several clusters; the conditional entropy of a feature set is then computed directly from these clusters to produce a ranked feature list, from which a relevant and compact feature subset is obtained. Experiments demonstrate the effectiveness of ACECFS.
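The following sketch illustrates the pipeline the abstract describes, not the authors' implementation: cluster each feature's projected values to avoid a fixed discretization, then greedily rank features by how much they reduce the conditional entropy of the class labels. The function names (`discretize_by_clustering`, `conditional_entropy`, `rank_features`), the use of k-means, and the choice of three clusters per feature are illustrative assumptions.

```python
# A minimal sketch, assuming per-feature k-means clustering and greedy
# forward selection by conditional entropy; not the paper's exact method.
import numpy as np
from sklearn.cluster import KMeans

def discretize_by_clustering(X, n_clusters=3, seed=0):
    """Replace each continuous feature with per-feature cluster labels."""
    labels = np.empty(X.shape, dtype=int)
    for j in range(X.shape[1]):
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
        labels[:, j] = km.fit_predict(X[:, [j]])
    return labels

def conditional_entropy(blocks, y):
    """H(y | partition): class-label entropy within each equivalence class,
    weighted by the class's relative size."""
    n = len(y)
    h = 0.0
    # Samples with identical cluster-label tuples form one equivalence class.
    _, inverse = np.unique(blocks, axis=0, return_inverse=True)
    for b in np.unique(inverse):
        idx = inverse == b
        p_block = idx.sum() / n
        _, counts = np.unique(y[idx], return_counts=True)
        p = counts / counts.sum()
        h -= p_block * np.sum(p * np.log2(p))
    return h

def rank_features(X, y, n_clusters=3):
    """Greedy forward ranking: repeatedly add the feature that most
    reduces H(y | selected features)."""
    L = discretize_by_clustering(X, n_clusters)
    remaining = list(range(X.shape[1]))
    selected = []
    while remaining:
        best = min(remaining,
                   key=lambda j: conditional_entropy(L[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return selected  # features ordered by how quickly they reduce uncertainty
```

A compact subset can then be taken as the prefix of the ranked list at which the conditional entropy stops decreasing appreciably; the stopping criterion, like the cluster count, is a tunable design choice rather than something fixed by the abstract.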