For two-class problems we propose two feature selection criteria based on kernel discriminant analysis (KDA). The first is the objective function of kernel discriminant analysis, called the KDA criterion. We show that the KDA criterion is monotonic with respect to the deletion of features, which ensures stable feature selection. The second is the recognition rate of a KDA classifier, called the KDA-based recognition rate, which is defined in the one-dimensional space obtained by KDA: the conditional probability of a datum for each class is calculated, and the datum is classified into the class with the maximum conditional probability. To ensure stable feature selection, we evaluate the KDA-based recognition rate by cross-validation. By computer experiments on two-class problems, we compare the two criteria with the recognition rate of the support vector machine (SVM) evaluated by cross-validation, called the SVM-based recognition rate. The selection performance of the KDA criterion and the KDA-based recognition rate is comparable, and both are better than that of the SVM-based recognition rate.
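To make the KDA criterion concrete, the following is a minimal sketch of backward feature selection driven by the two-class kernel Fisher discriminant objective. It assumes an RBF kernel, a small regularization term on the within-class scatter, and illustrative function names (rbf_kernel, kda_criterion, backward_selection); it is not the authors' implementation, only one plausible way to compute the criterion and exploit its monotonicity under feature deletion.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X1 and X2
    d = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def kda_criterion(X, y, features, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant (KDA) objective, restricted to
    the given feature subset.  Larger values mean better class separation."""
    y = np.asarray(y)
    Xs = X[:, features]
    K = rbf_kernel(Xs, Xs, gamma)                  # n x n Gram matrix
    n = K.shape[0]
    m, N = [], reg * np.eye(n)                     # class means and within-class scatter
    for c in np.unique(y):
        Kc = K[:, y == c]                          # kernel columns of class c
        nc = Kc.shape[1]
        m.append(Kc.mean(axis=1))
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
    d = m[0] - m[1]
    # Maximum of the Rayleigh quotient: (m1 - m2)^T N^{-1} (m1 - m2)
    return float(d @ np.linalg.solve(N, d))

def backward_selection(X, y, n_keep, gamma=1.0):
    """Delete features one at a time, each time removing the feature whose
    deletion decreases the KDA criterion the least."""
    features = list(range(X.shape[1]))
    while len(features) > n_keep:
        scores = [(kda_criterion(X, y, [f for f in features if f != g], gamma), g)
                  for g in features]
        _, worst = max(scores)                     # removal that hurts the criterion least
        features.remove(worst)
    return features
```

In this sketch the criterion is evaluated on every candidate subset obtained by deleting one feature; because the objective never increases when a feature is removed, the subset whose score stays largest is retained, which is one way to read the "stable feature selection" property of the KDA criterion described above.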