IITA'09 Proceedings of the 3rd international conference on Intelligent information technology application
Based on the Nyström approximation and the primal-dual formulation of Least Squares Support Vector Machines (LS-SVM), it becomes possible to apply a nonlinear model to a large-scale regression problem. This is done by using a sparse approximation of the nonlinear mapping induced by the kernel matrix, with an active selection of support vectors based on a quadratic Renyi entropy criterion. The methodology is applied to load forecasting, a real-life large-scale industrial problem, for the case of 24-hours-ahead prediction. Results are reported for different numbers of initial support vectors, covering between 1% and 4% of the entire sample, with satisfactory accuracy.
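The pipeline described above can be sketched in a minimal form: a working set of support vectors is chosen by random swaps that are kept only when they increase a quadratic Renyi entropy estimate, the Nyström decomposition of the small kernel matrix yields an explicit approximate feature map, and the LS-SVM is then solved in the primal as ridge regression. This is an illustrative sketch, not the authors' implementation; the RBF bandwidth `sigma`, regularization `gamma`, number of swap iterations, and the synthetic data are all assumptions.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian RBF kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def renyi_entropy(X_sub, sigma=1.0):
    # Quadratic Renyi entropy estimate of a subset:
    # H = -log(mean of all pairwise kernel values).
    K = rbf_kernel(X_sub, X_sub, sigma)
    return -np.log(K.mean())

def select_support_vectors(X, m, sigma=1.0, iters=300, rng=None):
    # Active selection: propose random swaps between the working set and
    # the rest of the sample; keep a swap only if entropy increases.
    if rng is None:
        rng = np.random.default_rng(0)
    n = len(X)
    idx = rng.choice(n, size=m, replace=False)
    best = renyi_entropy(X[idx], sigma)
    for _ in range(iters):
        i, j = rng.integers(m), rng.integers(n)
        if j in idx:
            continue
        trial = idx.copy()
        trial[i] = j
        h = renyi_entropy(X[trial], sigma)
        if h > best:
            idx, best = trial, h
    return idx

def fit_fixed_size_lssvm(X, y, idx, sigma=1.0, gamma=1e3):
    # Nystrom feature map from the m selected support vectors:
    # eigendecompose K_mm, map every point into the approximate feature
    # space, then solve the LS-SVM in the primal as ridge regression.
    Xm = X[idx]
    lam, U = np.linalg.eigh(rbf_kernel(Xm, Xm, sigma))
    keep = lam > 1e-10                      # drop numerically null directions
    lam, U = lam[keep], U[:, keep]
    def features(Z):
        return rbf_kernel(Z, Xm, sigma) @ U / np.sqrt(lam)
    Phi = np.hstack([features(X), np.ones((len(X), 1))])  # bias column
    A = Phi.T @ Phi + np.eye(Phi.shape[1]) / gamma
    w = np.linalg.solve(A, Phi.T @ y)
    return lambda Z: np.hstack([features(Z), np.ones((len(Z), 1))]) @ w
```

With a few hundred training points, a working set of 16 support vectors (a few percent of the sample, as in the reported experiments) is typically enough to fit a smooth one-dimensional target well, while the primal solve stays a small linear system of size m+1.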