Model selection plays a key role in the application of support vector machines (SVMs). In this paper, a model selection method based on the small-world strategy is proposed for least squares support vector regression (LS-SVR). Model selection is treated as a single-objective global optimization problem in which a generalization performance measure serves as the fitness function. To improve optimization performance, the small-world idea of relying more heavily on dense local connections is exploited: a new small-world optimization algorithm based on tabu search, called tabu-based small-world optimization (TSWO), is proposed, in which tabu search is employed to construct the local search operator. The hyper-parameters with the best generalization performance can then be chosen as the global optimum thanks to the strong search ability of TSWO. Experiments on six complex multimodal functions demonstrate that TSWO avoids premature convergence of the population better than the genetic algorithm (GA) and particle swarm optimization (PSO). Moreover, the effectiveness of the leave-one-out bound of LS-SVM on regression problems is tested on a noisy sinc function and on benchmark data sets; the numerical results show that model selection using TSWO obtains smaller generalization errors than GA and PSO in almost all cases under the three generalization performance measures adopted.