Gene (feature) selection has been an active research area in microarray analysis. Max-Relevance is one of the criteria most broadly used to find features highly correlated with the target class. However, most approximation methods for Max-Relevance do not consider the joint effect of features on the target class. We propose a new Max-Relevance criterion that combines the collective impact of the most expressive features in Emerging Patterns (EPs) with popular independent criteria such as the t-test and symmetrical uncertainty. The main benefit of this criterion is that, by capturing the joint effect of features through the EPs algorithm, it finds the most discriminative features in a broader scope. Experimental results clearly demonstrate that our feature sets improve class prediction compared to other feature selection methods.
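The symmetrical uncertainty term used in the criterion can be sketched as follows. This is a minimal illustration for discrete features, not the paper's implementation; the function names are our own, and the EP-based joint-effect component is omitted.

```python
import numpy as np
from collections import Counter

def entropy(values):
    # Shannon entropy (base 2) of a discrete sequence
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def symmetrical_uncertainty(x, y):
    # SU(X, Y) = 2 * I(X; Y) / (H(X) + H(Y)), normalized to [0, 1]
    hx, hy = entropy(x), entropy(y)
    hxy = entropy(list(zip(x, y)))   # joint entropy H(X, Y)
    mi = hx + hy - hxy               # mutual information I(X; Y)
    denom = hx + hy
    return 2.0 * mi / denom if denom > 0 else 0.0

def rank_features(X, y):
    # Max-Relevance ranking: sort feature columns by SU with the class labels
    scores = [symmetrical_uncertainty(X[:, j].tolist(), list(y))
              for j in range(X.shape[1])]
    order = sorted(range(X.shape[1]), key=lambda j: scores[j], reverse=True)
    return order, scores
```

A feature identical to the class labels scores SU = 1, while a feature independent of them scores 0, which is what makes SU usable as a relevance criterion.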