Feature selection is an important preprocessing step in pattern recognition and machine learning, and feature evaluation is a key issue in the construction of feature selection algorithms. In this study, we introduce a new concept of neighborhood evidential decision error to evaluate the quality of candidate features, and we construct a greedy forward algorithm for feature selection. This technique considers both the Bayes error rate of classification and the spatial information of samples in the decision boundary regions. Within these regions, each sample x_i in the neighborhood of x provides a piece of evidence about the decision on x, which separates the decision boundary regions into two subsets: recognizable and misclassified regions. The percentage of misclassified samples is viewed as the Bayes error rate of classification in the corresponding feature subspace. By minimizing the neighborhood evidential decision error (i.e., the Bayes error rate), optimal feature subsets of the raw data set can be selected. Numerical experiments on nine UCI classification datasets were conducted to validate the proposed technique. The results show that the technique is effective in most cases and is insensitive to the neighborhood size, compared with other feature evaluation functions such as neighborhood dependency.
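To make the greedy forward procedure concrete, the sketch below illustrates the general idea under simplifying assumptions that are not taken from the paper: the evidential combination of neighboring samples is replaced by a plain majority vote, the neighborhood is a fixed-radius ball under the Euclidean metric, and features are assumed to be numeric and pre-normalized. The error measure here is therefore only a stand-in for the neighborhood evidential decision error described in the abstract.

import numpy as np

def neighborhood_decision_error(X, y, features, radius=0.15):
    # Fraction of samples whose neighborhood majority label disagrees with
    # their own label in the subspace spanned by `features` (a simplified
    # proxy for the neighborhood evidential decision error).
    Xs = X[:, features]
    n = len(Xs)
    errors = 0
    for i in range(n):
        dist = np.linalg.norm(Xs - Xs[i], axis=1)
        neighbors = np.where((dist <= radius) & (np.arange(n) != i))[0]
        if neighbors.size == 0:
            continue  # isolated sample: no evidence, counted as recognizable
        labels, counts = np.unique(y[neighbors], return_counts=True)
        if labels[np.argmax(counts)] != y[i]:
            errors += 1
    return errors / n

def greedy_forward_selection(X, y, radius=0.15):
    # At each step, add the feature whose inclusion most reduces the
    # neighborhood decision error; stop when no candidate improves it.
    remaining = list(range(X.shape[1]))
    selected, best_err = [], 1.0
    while remaining:
        scores = [(neighborhood_decision_error(X, y, selected + [f], radius), f)
                  for f in remaining]
        err, f = min(scores)
        if err >= best_err:
            break
        selected.append(f)
        remaining.remove(f)
        best_err = err
    return selected, best_err

With a feature matrix X (rows scaled to [0, 1]) and a label vector y, calling greedy_forward_selection(X, y) returns the indices of the selected features and the final error estimate; the radius parameter plays the role of the neighborhood size whose influence the experiments in the abstract examine.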