An Experimental and Theoretical Comparison of Model Selection Methods
Machine Learning, special issue on the Eighth Annual Conference on Computational Learning Theory (COLT '95)
An Introduction to Support Vector Machines and Other Kernel-based Learning Methods
Machine Learning
Stochastic Complexity in Statistical Inquiry
An introduction to variable and feature selection
The Journal of Machine Learning Research
Dimensionality reduction via sparse support vector machines
The Journal of Machine Learning Research
Variable selection using SVM-based criteria
The Journal of Machine Learning Research
IEEE Transactions on Pattern Analysis and Machine Intelligence
Combined SVM-Based Feature Selection and Classification
Machine Learning
Feature Extraction: Foundations and Applications (Studies in Fuzziness and Soft Computing)
Competitive baseline methods set new standards for the NIPS 2003 feature selection benchmark
Pattern Recognition Letters
Improved feature reduction in input and feature spaces
Pattern Recognition
Information criteria for support vector machines
IEEE Transactions on Neural Networks
An Adaptive Wavelet Networks Algorithm for Prediction of Gas Delay Outburst
ISNN 2009 Proceedings of the 6th International Symposium on Neural Networks: Advances in Neural Networks - Part III
Model Selection: Beyond the Bayesian/Frequentist Divide
The Journal of Machine Learning Research
An optimal set of uncorrelated margin discriminant vectors
ICNC'09 Proceedings of the 5th international conference on Natural computation
Feature selection for support vector machines with RBF kernel
Artificial Intelligence Review
Support vector machines for classification have the advantage of circumventing the curse of dimensionality. Nevertheless, it has been shown that reducing the dimension of the input space leads to even better results. For this purpose, we propose two information criteria which can be computed directly from the definition of the support vector machine. We assess the predictive performance of the models selected by our new criteria and compare them to existing variable selection techniques in a simulation study. The simulation results show that the new criteria are competitive in terms of generalization error rate while being much easier to compute. We arrive at the same findings in comparisons on several real-world benchmark data sets.
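To illustrate the idea of selecting variables by minimizing an SVM-based criterion, the sketch below ranks feature subsets of a toy dataset with a generic score of the form "training hinge loss plus a penalty times the number of support vectors". Note that this penalty form, the dataset, and the `svm_criterion` helper are illustrative assumptions for exposition, not the exact criteria proposed in the paper.

```python
# Hedged sketch (assumed criterion, not the paper's): score a feature
# subset S by IC(S) = total training hinge loss + penalty * |SV|,
# where |SV| is the number of support vectors of a linear SVM fit on S.
from itertools import combinations

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 4))
# Only features 0 and 1 carry signal; features 2 and 3 are pure noise.
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)

def svm_criterion(X_sub, y, penalty=0.5):
    """Training hinge loss plus a complexity penalty on support vectors."""
    clf = SVC(kernel="linear", C=1.0).fit(X_sub, y)
    margins = y * clf.decision_function(X_sub)
    hinge = np.maximum(0.0, 1.0 - margins).sum()
    return hinge + penalty * len(clf.support_)

# Evaluate all subsets of one or two features and keep the minimizer.
scores = {
    subset: svm_criterion(X[:, list(subset)], y)
    for r in (1, 2)
    for subset in combinations(range(4), r)
}
best = min(scores, key=scores.get)
print("selected features:", best)
```

The informative pair should attain a much lower criterion value than the noise features, mirroring how a well-calibrated information criterion trades training fit against model complexity.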