In the past decade, support vector machines (SVMs) have attracted the attention of many researchers. SVMs are non-parametric supervised learning schemes grounded in statistical learning theory, which enables learning machines to generalize well to unseen data. SVMs are kernel-based methods introduced as a robust approach to classification and regression problems; more recently they have been applied to nonlinear identification problems in the form of support vector regression. In SVM designs for nonlinear identification, a nonlinear model is represented by an expansion in terms of nonlinear mappings of the model input. These nonlinear mappings define a feature space, which may have infinite dimension. In this context, a relevant identification approach is least squares support vector machines (LS-SVMs). Compared with other identification methods, LS-SVMs offer prominent advantages: their generalization performance (i.e., error rates on test sets) either matches or significantly exceeds that of competing methods and, more importantly, does not depend on the dimensionality of the input data. Formulated as a constrained quadratic programming problem with a regularized cost function, training an LS-SVM involves selecting the kernel parameters and the regularization parameter of the objective function. A good choice of these parameters is crucial to the performance of the estimator. In this paper, the proposed LS-SVM design combines the LS-SVM with a new chaotic differential evolution optimization approach based on the Ikeda map (CDEK). The CDEK is adopted to tune the regularization parameter and the radial basis function bandwidth. Simulations using LS-SVMs in a NARX (Nonlinear AutoRegressive with eXogenous inputs) configuration for the identification of a thermal process show the effectiveness and practicality of the proposed CDEK algorithm compared with the classical DE approach.
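The abstract does not give implementation details, so the following is only a minimal sketch of the kind of scheme it describes. It assumes the standard Ikeda map, the standard LS-SVM linear system for regression with an RBF kernel, and a classical DE/rand/1/bin loop over (regularization gamma, bandwidth sigma); the specific way the chaotic sequence enters the search here (modulating the mutation factor `F`), the log-scale search bounds, and all function names are illustrative assumptions, not the authors' CDEK.

```python
import math
import numpy as np

def ikeda_sequence(n, u=0.9, x0=0.1, y0=0.1):
    """Return n x-coordinates of the Ikeda map (chaotic for u around 0.9)."""
    out, x, y = [], x0, y0
    for _ in range(n):
        t = 0.4 - 6.0 / (1.0 + x * x + y * y)
        x, y = (1.0 + u * (x * math.cos(t) - y * math.sin(t)),
                u * (x * math.sin(t) + y * math.cos(t)))
        out.append(x)
    return np.array(out)

def rbf_kernel(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    """Solve the standard LS-SVM regression linear system for (alpha, b)."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]  # alpha, bias b

def lssvm_predict(X_tr, alpha, b, sigma, X_new):
    return rbf_kernel(X_new, X_tr, sigma) @ alpha + b

def cdek_tune(X_tr, y_tr, X_val, y_val, pop=10, gens=20, CR=0.9, seed=0):
    """DE/rand/1/bin over log10(gamma), log10(sigma); Ikeda values drive F."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array([-2.0, -2.0]), np.array([3.0, 1.0])  # assumed log10 bounds
    P = rng.uniform(lo, hi, size=(pop, 2))

    def cost(p):
        g, s = 10.0 ** p
        alpha, b = lssvm_fit(X_tr, y_tr, g, s)
        return float(np.mean((lssvm_predict(X_tr, alpha, b, s, X_val) - y_val) ** 2))

    f = np.array([cost(p) for p in P])
    chaos = ikeda_sequence(gens * pop)
    for gen in range(gens):
        for i in range(pop):
            # Chaotic mutation factor in [0.4, 1.0] -- one plausible choice,
            # not necessarily how the paper injects the Ikeda sequence.
            F = 0.4 + 0.6 * (chaos[gen * pop + i] % 1.0)
            r1, r2, r3 = rng.choice([j for j in range(pop) if j != i], 3, replace=False)
            v = np.clip(P[r1] + F * (P[r2] - P[r3]), lo, hi)
            mask = rng.random(2) < CR
            mask[rng.integers(2)] = True  # guarantee at least one crossed gene
            u = np.where(mask, v, P[i])
            fu = cost(u)
            if fu <= f[i]:  # greedy one-to-one selection, as in classical DE
                P[i], f[i] = u, fu
    best = int(np.argmin(f))
    return 10.0 ** P[best], f[best]  # (gamma, sigma), validation MSE
```

On a small synthetic regression problem, `cdek_tune` returns a (gamma, sigma) pair whose validation MSE is no worse than the best random initial candidate, since the selection step is strictly greedy.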