Current feature selection methods for supervised classification of tissue samples from microarray data generally fail to exploit the complementary discriminatory power that can be found in sets of features. Using a feature selection method based on the computational architecture of the cross-entropy method, augmented with a preliminary step that ensures a lower bound on the number of times any feature is considered, we show on a human lymph node data set that a significant number of genes perform well when their complementary power is assessed, but "pass under the radar" of popular feature selection methods that assess genes only individually on a given classification tool. We also show that this phenomenon becomes more apparent as the diagnostic specificity of the tissue samples analysed increases.