The choice of hyper-parameters, including the kernel parameters and the regularization parameter, is critical to the performance of least squares support vector machines (LS-SVMs). This paper addresses the model-selection problem for LS-SVMs and introduces particle swarm optimization (PSO) to select the hyper-parameters. The proposed method requires neither an analytically tractable generalization-performance measure nor any restriction on the number of hyper-parameters. Its feasibility is evaluated on benchmark data sets, and the experimental results show that better performance can be obtained. Moreover, several kernel families are investigated within the proposed framework; the best test performance is obtained with the SRBF kernel, and good performance with the RBF kernel.
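The approach described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes an RBF kernel, a toy two-class dataset, a held-out validation error as the (non-analytic) performance measure, and a plain global-best PSO loop searching the log-scaled pair (regularization γ, kernel width σ).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: two Gaussian blobs, labels in {-1, +1} (assumption for illustration).
X = np.vstack([rng.normal(-1, 1, (40, 2)), rng.normal(1, 1, (40, 2))])
y = np.hstack([-np.ones(40), np.ones(40)])
Xtr, ytr, Xva, yva = X[::2], y[::2], X[1::2], y[1::2]

def rbf(A, B, sigma):
    # Gaussian (RBF) kernel matrix between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    # LS-SVM training reduces to one linear system:
    # [[0, 1^T], [1, K + I/gamma]] @ [b; alpha] = [0; y].
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]  # bias b, dual weights alpha

def val_error(theta):
    # Fitness for PSO: misclassification rate on the validation split.
    gamma, sigma = np.exp(theta)  # search in log-space to keep both positive
    b, alpha = lssvm_fit(Xtr, ytr, gamma, sigma)
    pred = np.sign(rbf(Xva, Xtr, sigma) @ alpha + b)
    return np.mean(pred != yva)

# Global-best PSO over theta = (log gamma, log sigma).
n_part, dim = 10, 2
pos = rng.uniform(-2, 2, (n_part, dim))
vel = np.zeros((n_part, dim))
pbest, pbest_f = pos.copy(), np.array([val_error(p) for p in pos])
g = pbest[pbest_f.argmin()].copy()
for _ in range(30):
    r1, r2 = rng.random((2, n_part, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
    pos = np.clip(pos + vel, -4, 4)  # keep the search box bounded
    f = np.array([val_error(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    g = pbest[pbest_f.argmin()].copy()

best_gamma, best_sigma = np.exp(g)
print("gamma=%.3g sigma=%.3g val_error=%.3f" % (best_gamma, best_sigma, val_error(g)))
```

Note how the PSO loop treats `val_error` as a black box: nothing about its gradient or analytic form is needed, which is the property the abstract highlights, and extending the particle dimension covers additional hyper-parameters without changing the loop.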