Feature selection is a central problem in machine learning and pattern recognition. On large datasets (in terms of dimensionality and/or number of instances), search-based or wrapper techniques can be computationally prohibitive, and even many filter methods based on relevance/redundancy assessment take a prohibitively long time on high-dimensional data. In this paper, we propose efficient unsupervised and supervised feature selection/ranking filters for high-dimensional datasets. These methods rely on low-complexity relevance and redundancy criteria, are applicable to supervised, semi-supervised, and unsupervised learning, and can act as pre-processors that let computationally intensive methods focus on smaller subsets of promising features. Experimental results on datasets with up to 10^5 features show that our methods are time-efficient and achieve lower generalization error than state-of-the-art techniques, while being dramatically simpler and faster.
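The general relevance/redundancy filtering idea behind such methods can be illustrated with a minimal sketch — this is a generic mRMR-style greedy ranking using Pearson correlation, not the paper's actual (lower-complexity) criteria; the function name and the difference-based score are illustrative assumptions:

```python
import numpy as np

def relevance_redundancy_rank(X, y, k):
    """Greedily pick k features, trading off relevance to the target
    against redundancy with already-selected features (mRMR-style sketch)."""
    n_features = X.shape[1]
    # Relevance: absolute Pearson correlation between each feature and y.
    relevance = np.array(
        [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_features)]
    )
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best_j, best_score = -1, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            # Redundancy: mean absolute correlation with the selected set.
            redundancy = np.mean(
                [abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected]
            )
            score = relevance[j] - redundancy  # difference criterion
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected

# Toy data: y = a + b; feature 0 captures a, feature 1 duplicates it,
# feature 2 captures b (relevant but non-redundant with features 0 and 1).
rng = np.random.default_rng(0)
a, b = rng.normal(size=200), rng.normal(size=200)
y = a + b
X = np.column_stack([a, a + 0.01 * rng.normal(size=200), b])
selected = relevance_redundancy_rank(X, y, 2)
print(selected)  # the duplicate feature is skipped in favor of feature 2
```

Each greedy step costs one correlation per candidate-selected pair, so ranking k features is far cheaper than wrapper search over feature subsets — the efficiency argument the abstract makes for filter pre-processing.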