This paper focused on feature selection for high-dimensional small sample (HDSS) data. We first presented a general analytical framework for feature selection on HDSS data, covering selection strategies (single-feature ranking and multi-feature ranking) and evaluation criteria (feature subset consistency and compactness). We then proposed partial least squares (PLS) based feature selection methods for HDSS data, together with two supporting theorems. The proposed methodology comprises a PLS model for classification, parameter selection, PLSRanking, and PLS-based recursive feature elimination (PLSRFE). Furthermore, we compared the proposed methods with several existing feature selection methods: support vector machine (SVM) based feature selection, SVM-based recursive feature elimination (SVMRFE), random forest (RF) based feature selection, RF-based recursive feature elimination (RFRFE), the ReliefF algorithm, and ReliefF-based recursive feature elimination (ReliefFRFE). Using twelve high-dimensional datasets from different areas of research, we evaluated the results in terms of accuracy (sensitivity and specificity), running time, and feature subset consistency and compactness. The analysis demonstrated that the proposed approach performed very well on both two-category and multi-category problems.
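To make the PLS-based recursive feature elimination idea concrete, the following is a minimal sketch of the general scheme: fit a PLS model (NIPALS, univariate response), score each feature by its accumulated squared PLS weight, discard the weakest features, and repeat on the survivors. It is an illustrative reconstruction under assumed details (class labels encoded numerically, a fixed drop fraction per round, squared-weight importance), not the authors' exact PLSRFE procedure; the function names `pls_importance` and `pls_rfe` are hypothetical.

```python
import numpy as np

def pls_importance(X, y, n_components=2):
    """Per-feature importance as the sum of squared PLS weights
    over the first n_components components (NIPALS, univariate y)."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    importance = np.zeros(X.shape[1])
    for _ in range(n_components):
        w = Xc.T @ yc                   # weight vector for this component
        w /= np.linalg.norm(w)
        t = Xc @ w                      # score vector
        importance += w ** 2
        # deflate X and y by the extracted component
        p = Xc.T @ t / (t @ t)
        Xc = Xc - np.outer(t, p)
        yc = yc - t * (t @ yc) / (t @ t)
    return importance

def pls_rfe(X, y, n_keep, n_components=2, drop_frac=0.1):
    """Recursively eliminate the lowest-importance features
    until only n_keep feature indices remain."""
    remaining = list(range(X.shape[1]))
    while len(remaining) > n_keep:
        scores = pls_importance(X[:, remaining], y, n_components)
        n_drop = max(1, int(len(remaining) * drop_frac))
        n_drop = min(n_drop, len(remaining) - n_keep)
        order = np.argsort(scores)      # ascending: weakest first
        remaining = [remaining[i] for i in sorted(order[n_drop:])]
    return remaining
```

For a two-category problem, `y` would be the class labels encoded as, e.g., -1/+1; multi-category problems are typically handled by encoding the response as indicator columns or via one-versus-rest runs.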