Evaluation of feature selection by multiclass kernel discriminant analysis
ANNPR'10: Proceedings of the 4th IAPR TC3 Conference on Artificial Neural Networks in Pattern Recognition
Variable selection serves a dual purpose in statistical classification problems: it enables one to identify the input variables which separate the groups well, and a classification rule based on these variables frequently has a lower error rate than the rule based on all the input variables. Kernel Fisher discriminant analysis (KFDA) is a recently proposed powerful classification procedure, frequently applied in cases characterised by large numbers of input variables. The important problem of eliminating redundant input variables before implementing KFDA is addressed in this paper. A backward elimination approach is recommended, and two criteria which can be used for recursive elimination of input variables are proposed and investigated. Their performance is evaluated on several data sets and in a simulation study.
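The backward elimination procedure described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's method: the paper's two proposed elimination criteria are not specified here, so the sketch substitutes a simple kernel-space separability measure (the squared distance between the two class means in the RBF feature space, computed via the kernel trick). All function names (`rbf_kernel`, `class_separation`, `backward_eliminate`) and the choice of criterion are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def class_separation(X, y, gamma=1.0):
    # Squared distance between the two class means in feature space,
    # ||mu_1 - mu_2||^2, obtained via the kernel trick. A simple stand-in
    # for the (unspecified) KFDA-based criteria proposed in the paper.
    K = rbf_kernel(X, gamma)
    i1, i2 = (y == 0), (y == 1)
    return (K[np.ix_(i1, i1)].mean()
            + K[np.ix_(i2, i2)].mean()
            - 2.0 * K[np.ix_(i1, i2)].mean())

def backward_eliminate(X, y, n_keep, gamma=1.0):
    # Recursively remove the input variable whose deletion degrades
    # class separation the least, until n_keep variables remain.
    keep = list(range(X.shape[1]))
    while len(keep) > n_keep:
        scores = []
        for j in keep:
            rest = [c for c in keep if c != j]
            scores.append((class_separation(X[:, rest], y, gamma), j))
        _, worst = max(scores)  # removal with the highest remaining separation
        keep.remove(worst)
    return keep
```

On a toy two-class data set where only the first variable separates the groups, the procedure retains that variable and discards the pure-noise one; with many input variables, the same loop eliminates them one at a time in order of redundancy.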