In this paper, we focus on the stable selection of relevant features. The main contribution is a novel framework for selecting the most informative features while preserving the linear combination property of the original feature space. We formulate this problem as the selection of a minimal independent dominating set (MIDS) of a feature graph: a smallest subset of nodes such that no two nodes in the subset are adjacent, and every node outside the subset is adjacent to at least one node in it. In this way, both the diversity and the coverage of the original feature space are preserved. Furthermore, the proposed MIDS framework complements standard feature selection algorithms such as SVM-RFE, stability lasso, and ensemble SVM-RFE. When these algorithms are applied to feature subsets selected by MIDS rather than to all input features, they select more stable features and achieve better prediction accuracy, as our experimental results clearly demonstrate.
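To illustrate the MIDS notion concretely (this is a generic greedy sketch, not the paper's algorithm): any maximal independent set of a graph is automatically both independent and dominating, and since finding a minimum one is NP-hard, a simple heuristic greedily adds the remaining node that dominates the most still-uncovered nodes. The function name and graph representation below are our own assumptions for illustration.

```python
from collections import defaultdict

def greedy_independent_dominating_set(nodes, edges):
    """Greedy heuristic for a small independent dominating set.

    Every maximal independent set is independent and dominating, so we
    grow one greedily, at each step picking the remaining node that
    dominates (covers) the largest number of still-uncovered nodes.
    """
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    remaining = set(nodes)   # nodes not yet dominated
    selected = set()
    while remaining:
        # node covering the most uncovered nodes (itself plus neighbors)
        u = max(remaining, key=lambda n: len(adj[n] & remaining))
        selected.add(u)
        remaining -= {u} | adj[u]  # u dominates itself and its neighbors
    return selected
```

For a star graph, the heuristic selects only the hub, which dominates every leaf; in the feature-graph setting of the paper, edges would connect redundant (e.g. strongly correlated) features, so the selected set covers the feature space while containing no two mutually redundant features.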