This paper presents an algorithmic framework for feature selection, which selects a subset of features by minimizing the nonparametric Bayes error. A set of existing algorithms, as well as new ones, can be derived naturally from this framework. For example, we show that the Relief algorithm greedily attempts to minimize the Bayes error estimated by the k-nearest-neighbor method. This new interpretation not only explains why Relief works but also offers various opportunities to improve it or to establish new alternatives. In particular, we develop a new feature weighting algorithm, named Parzen-Relief, which minimizes the Bayes error estimated by the Parzen window method. Additionally, to enhance its ability to handle imbalanced and multiclass data, we integrate the class distribution with the max-margin objective function, leading to a new algorithm, named MAP-Relief. Comparisons on benchmark data sets confirm the effectiveness of the proposed algorithms.
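The greedy weight update at the heart of Relief can be illustrated with a short sketch (a minimal single-nearest-neighbor variant, not the paper's Parzen-Relief or MAP-Relief; all function and variable names are illustrative). For each instance, the weight of a feature grows when it separates the instance from its nearest miss (closest example of a different class) and shrinks when it differs from its nearest hit (closest example of the same class):

```python
import math
import random

def relief(X, y, n_iters=None, seed=0):
    """Single-nearest-neighbor Relief feature weighting (illustrative sketch).

    X: list of feature vectors, y: list of class labels.
    If n_iters is None, every instance is used once; otherwise n_iters
    instances are sampled at random, as in the original Relief.
    """
    n, d = len(X), len(X[0])
    rng = random.Random(seed)
    w = [0.0] * d
    idxs = range(n) if n_iters is None else [rng.randrange(n) for _ in range(n_iters)]
    for i in idxs:
        xi, yi = X[i], y[i]

        def nearest(same_class):
            # Nearest hit (same_class=True) or nearest miss (same_class=False)
            # of instance i, by Euclidean distance.
            best, best_dist = None, float("inf")
            for j in range(n):
                if j == i or (y[j] == yi) != same_class:
                    continue
                dist = math.dist(xi, X[j])
                if dist < best_dist:
                    best, best_dist = X[j], dist
            return best

        hit, miss = nearest(True), nearest(False)
        for f in range(d):
            # Reward separation from the miss, penalize distance to the hit.
            w[f] += abs(xi[f] - miss[f]) - abs(xi[f] - hit[f])
    return w
```

On a toy data set where only the first feature tracks the class label, this update drives that feature's weight positive and the noise feature's weight negative, which is exactly the behavior the k-nearest-neighbor Bayes-error interpretation above predicts.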