This paper proposes two strategies for selecting the kernel parameter (σ) and the penalty coefficient (C) of Gaussian support vector machines (SVMs). By viewing model parameter selection as a recognition problem in visual systems, a direct setting formula for the kernel parameter is derived by finding a visual scale at which the global and local structures of the given data set are preserved in the feature space while the difference between the two structures is maximized. In addition, a heuristic algorithm is proposed for selecting the penalty coefficient by identifying the classification extent of each training datum during the sequential minimal optimization (SMO) procedure, a well-established and widely used SVM training algorithm. The suggested strategies are evaluated in a series of experiments on 13 benchmark problems and three real-world data sets, in comparison with the traditional fivefold cross-validation (5-CV) method and the more recently developed radius-margin bound (RM) method. The evaluation shows that, in terms of both efficiency and generalization capability, the new strategies outperform these methods, with uniformly stable performance.
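The paper's exact scale-space formula for σ is not reproduced in this abstract. As a hedged illustration of the general idea of setting the Gaussian kernel width directly from the geometry of the data rather than by grid search, the sketch below uses the common median-pairwise-distance heuristic, which is a stand-in assumption here and not the authors' method:

```python
import numpy as np

# Toy data set (assumption: any real-valued feature matrix would do).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))

# Pairwise squared Euclidean distances between all samples.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)

# Direct data-driven kernel width: the median pairwise distance.
# This is a widely used heuristic, NOT the paper's visual-scale formula.
sigma = np.sqrt(np.median(d2[np.triu_indices(len(X), k=1)]))

# Gaussian (RBF) kernel matrix with the chosen sigma.
K = np.exp(-d2 / (2.0 * sigma**2))
```

The resulting `K` could then be passed to any kernel SVM solver (e.g. as a precomputed kernel), avoiding one dimension of the usual (σ, C) grid search.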