Feature selection is a key preprocessing step in machine learning and pattern recognition, and feature evaluation is one of the central issues in constructing a feature selection algorithm. In this work, we propose the new concepts of neighborhood margin and neighborhood soft margin to measure the minimal distance between different classes. We use the neighborhood soft margin criterion to evaluate the quality of candidate features and construct a forward greedy algorithm for feature selection. We apply this technique to eight classification learning tasks; compared with the raw data and three other feature selection algorithms, the proposed technique is effective in most cases.
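The abstract does not give the exact formula for the neighborhood soft margin, so the sketch below substitutes a standard hypothesis-margin criterion (distance to the nearest example of a different class minus distance to the nearest same-class example, as in Relief-style margin analysis) as a stand-in evaluation function, and wraps it in the forward greedy search the abstract describes. The function names and the stopping rule are illustrative assumptions, not the authors' method.

```python
import numpy as np

def soft_margin_score(X, y, feats):
    """Average sample margin over the selected features: distance to the
    nearest example of a different class (near-miss) minus distance to the
    nearest example of the same class (near-hit). A stand-in for the
    paper's neighborhood soft margin, whose exact form is not given in
    the abstract."""
    Z = X[:, feats]
    # Pairwise Euclidean distances; a point is never its own neighbor.
    d = np.sqrt(((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)
    same = y[:, None] == y[None, :]
    nearhit = np.where(same, d, np.inf).min(axis=1)
    nearmiss = np.where(~same, d, np.inf).min(axis=1)
    return float((nearmiss - nearhit).mean())

def forward_greedy_select(X, y, k):
    """Forward greedy search: at each step add the single feature that
    yields the largest margin score; stop early if no candidate improves
    on the current score."""
    selected, remaining, best = [], list(range(X.shape[1])), -np.inf
    for _ in range(k):
        score, f = max((soft_margin_score(X, y, selected + [f]), f)
                       for f in remaining)
        if score <= best:
            break  # no remaining feature enlarges the margin
        best = score
        selected.append(f)
        remaining.remove(f)
    return selected
```

On a toy two-class dataset where only the first feature separates the classes, the greedy search picks that feature and then stops, since adding the noisy feature shrinks the average margin.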