Adaptive floating search methods in feature selection
Pattern Recognition Letters - Special issue on pattern recognition in practice VI
Pattern Classification: Neuro-Fuzzy Methods and Their Comparison
Feature Selection via Concave Minimization and Support Vector Machines
ICML '98 Proceedings of the Fifteenth International Conference on Machine Learning
Grafting: fast, incremental feature selection by gradient descent in function space
The Journal of Machine Learning Research
Generalized Discriminant Analysis Using a Kernel Approach
Neural Computation
Variable selection in kernel Fisher discriminant analysis by means of recursive feature elimination
Computational Statistics & Data Analysis
FS_SFS: A novel feature selection method for support vector machines
Pattern Recognition
Kernel discriminant analysis based feature selection
Neurocomputing
Feature Selection with Kernel Class Separability
IEEE Transactions on Pattern Analysis and Machine Intelligence
Feature selection based on kernel discriminant analysis
ICANN'06 Proceedings of the 16th international conference on Artificial Neural Networks - Volume Part II
Sparse gaussian processes using backward elimination
ISNN'06 Proceedings of the Third international conference on Advances in Neural Networks - Volume Part I
A novel approach to feature selection based on analysis of class regions
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
In this paper, we propose and evaluate a feature selection criterion based on kernel discriminant analysis (KDA) for multiclass problems, which finds a number of eigenvectors equal to the number of classes minus one. The selection criterion is the objective function of KDA, namely the sum of the eigenvalues associated with these eigenvectors. In addition to the KDA criterion, we propose a new selection criterion that replaces the between-class scatter in KDA with the sum of squared distances between all pairs of classes. To speed up backward feature selection, we introduce block deletion, which deletes many features at a time, and to enhance the generalization ability of the selected features, we use cross-validation as a stopping condition. Computer experiments on benchmark datasets show that the KDA criterion achieves performance comparable with that of a selection criterion based on the SVM recognition rate with cross-validation, at a lower computational cost. We also show that the KDA criterion terminates feature selection stably when cross-validation is used as a stopping condition.
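The backward selection procedure the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: it substitutes the linear discriminant criterion trace(Sw⁺Sb) (the sum of generalized eigenvalues in the linear case) for the kernelized KDA objective, and it uses a simple criterion-tolerance threshold in place of the paper's cross-validation stopping condition. The function names, the `block` and `tol` parameters, and the fallback-to-single-deletion policy are all assumptions made for the sketch.

```python
import numpy as np

def lda_criterion(X, y):
    """Class-separability criterion trace(Sw^+ Sb): a linear stand-in
    for the KDA objective (sum of eigenvalues of the discriminant
    eigenvectors); larger means better class separation."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    return np.trace(np.linalg.pinv(Sw) @ Sb)

def backward_block_selection(X, y, block=2, tol=0.95):
    """Backward elimination with block deletion: rank the remaining
    features by the criterion obtained when each is removed alone,
    try to delete the `block` least useful ones at once, and fall
    back to single deletion when the block delete degrades the
    criterion too much.  Stops when any deletion would drop the
    criterion below tol * (full-set criterion) -- a simplification
    of the paper's cross-validation stopping condition."""
    remaining = list(range(X.shape[1]))
    base = lda_criterion(X, y)
    while len(remaining) > 1:
        # criterion value after removing each feature individually
        scores = []
        for f in remaining:
            subset = [g for g in remaining if g != f]
            scores.append((lda_criterion(X[:, subset], y), f))
        scores.sort(reverse=True)  # highest criterion => feature least needed
        # try deleting a block of the least useful features at once
        k = min(block, len(remaining) - 1)
        cand = [f for _, f in scores[:k]]
        subset = [g for g in remaining if g not in cand]
        if lda_criterion(X[:, subset], y) >= tol * base:
            remaining = subset
        else:
            best_score, best_f = scores[0]
            if best_score < tol * base:
                break  # stopping condition: every deletion hurts too much
            remaining = [g for g in remaining if g != best_f]
    return sorted(remaining)
```

On synthetic data where only the first two of five features carry class information, the procedure deletes the three noise features (two in one block, one singly) and then stops, since removing either informative feature would sharply reduce the criterion.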