Inspired by the divide-and-conquer principle, which attacks a complex problem by splitting it into simpler subproblems, a two-stage multiple support vector machine (SVM) architecture is proposed to improve prediction accuracy and generalization performance for chaotic time series prediction. In the first stage, the fuzzy C-means (FCM) clustering algorithm partitions the input dataset into several subsets. In the second stage, a separate SVM with a Gaussian radial basis function kernel and optimized free parameters is fitted to each subset. The models are evaluated on the Mackey-Glass chaotic time series and applied to coal mine gas concentration forecasting. The simulations show that the multiple-SVM architecture achieves a significant improvement in generalization performance over a single SVM model; in addition, it converges faster and uses fewer support vectors.
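The two-stage pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses a hand-rolled fuzzy C-means, scikit-learn's `SVR` as the SVM regressor, and a delay-embedded synthetic signal as a stand-in for the Mackey-Glass series; the cluster count and SVM parameters (`C`, `gamma`) are illustrative values, not the optimized parameters the paper refers to.

```python
import numpy as np
from sklearn.svm import SVR

# --- Stage 1: fuzzy C-means soft-partitions the input vectors ---
def fcm(X, c, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # fuzzy memberships sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))          # standard FCM membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return U, centers

# --- Stage 2: one RBF-kernel SVM per cluster (illustrative parameters) ---
def fit_multi_svm(X, y, c=3, C=10.0, gamma=1.0):
    U, centers = fcm(X, c)
    labels = U.argmax(axis=1)                  # hard-assign by max membership
    models = []
    for k in range(c):
        mask = labels == k
        if mask.sum() < 5:                     # tiny cluster: fall back to all data
            mask = np.ones(len(X), bool)
        svr = SVR(kernel="rbf", C=C, gamma=gamma)
        svr.fit(X[mask], y[mask])
        models.append(svr)
    return models, centers

def predict_multi_svm(models, centers, X):
    # route each query to the SVM of its nearest cluster center
    k = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2).argmin(axis=1)
    out = np.empty(len(X))
    for j, svr in enumerate(models):
        sel = k == j
        if sel.any():
            out[sel] = svr.predict(X[sel])
    return out

# Demo on a delay-embedded quasi-periodic signal (stand-in for Mackey-Glass)
t = np.arange(400)
s = np.sin(0.3 * t) + 0.5 * np.sin(0.11 * t)
emb = 4                                        # embedding dimension (assumed)
X = np.stack([s[i:i + emb] for i in range(len(s) - emb)])
y = s[emb:]                                    # one-step-ahead target
models, centers = fit_multi_svm(X, y)
pred = predict_multi_svm(models, centers, X)
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

Routing each query to a single cluster's SVM keeps every model small, which is what drives the faster convergence and smaller support-vector count reported in the abstract; a soft-combination of all cluster SVMs weighted by fuzzy membership would be a natural variant.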