The selection of hyper-parameters plays an important role in the performance of least-squares support vector machines (LS-SVMs). In this paper, a novel hyper-parameter selection method for LS-SVMs based on particle swarm optimization (PSO) is presented. The proposed method requires no a priori knowledge of the analytic properties of the generalization performance measure and can determine multiple hyper-parameters simultaneously. The feasibility of the method is examined on benchmark data sets, and several kernel families are investigated with it. Experimental results show that the best or near-best test performance is obtained with the scaling radial basis function (SRBF) kernel and the RBF kernel, respectively.
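The approach described above can be illustrated with a minimal sketch: a standard LS-SVM classifier (trained by solving its linear system) whose RBF-kernel hyper-parameters, the regularization constant gamma and the kernel width sigma, are tuned by a plain inertia-weight PSO minimizing k-fold cross-validation error. This is a generic reconstruction, not the authors' exact implementation; all function names, the search box, and the PSO constants (w = 0.7, c1 = c2 = 1.5) are illustrative assumptions, and the SRBF kernel variant is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, sigma):
    """RBF kernel matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma, sigma):
    """Solve the LS-SVM linear system [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1]."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate([[0.0], np.ones(n)])
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def lssvm_predict(X_tr, y_tr, alpha, b, X_te, sigma):
    """Classify test points: sign(sum_i alpha_i y_i K(x, x_i) + b)."""
    K = rbf_kernel(X_te, X_tr, sigma)
    return np.sign(K @ (alpha * y_tr) + b)

def cv_error(params, X, y, folds=3):
    """k-fold misclassification rate for hyper-parameters (log gamma, log sigma)."""
    gamma, sigma = np.exp(params)
    idx = np.arange(len(y))
    errs = []
    for f in range(folds):
        te = idx % folds == f
        tr = ~te
        alpha, b = lssvm_train(X[tr], y[tr], gamma, sigma)
        pred = lssvm_predict(X[tr], y[tr], alpha, b, X[te], sigma)
        errs.append(np.mean(pred != y[te]))
    return float(np.mean(errs))

def pso_search(objective, X, y, n_particles=8, iters=15, lo=-3.0, hi=3.0):
    """Inertia-weight PSO over the box [lo, hi]^2 in log-hyper-parameter space."""
    dim = 2
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([objective(p, X, y) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    gbest_f = pbest_f.min()
    w, c1, c2 = 0.7, 1.5, 1.5  # illustrative PSO constants
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        f = np.array([objective(p, X, y) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        if f.min() < gbest_f:
            gbest, gbest_f = pos[f.argmin()].copy(), f.min()
    return gbest, gbest_f
```

Because the objective is treated as a black box, the same loop tunes any number of kernel parameters at once (e.g. a per-dimension scaling as in the SRBF kernel) simply by widening `dim`, which is the property the abstract highlights over gradient-based selection.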