Support vector regression (SVR) has been very successful in pattern recognition, text categorization, and function approximation. The theory of SVR is based on the principle of structural risk minimization. In real application systems, the data domain often suffers from noise and outliers. When noise and/or outliers exist in the sampling data, SVR may try to fit those improper data, and the resulting system may exhibit overfitting. In addition, the memory required to store the kernel matrix of SVR grows as O(N^2), where N is the number of training data; hence, for a large training data set, the kernel matrix cannot be kept in memory. In this paper, a reduced support vector regression is proposed for nonlinear function approximation problems with noise and outliers. The core idea of this approach is to adopt fuzzy clustering with a robust fuzzy c-means (RFCM) algorithm, which reduces the computational time of SVR and greatly mitigates the influence of noise and outliers in the data.
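The reduction idea above can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes scikit-learn is available, and it substitutes plain k-means with a per-cluster median target for the paper's robust fuzzy c-means (RFCM) step. The point it demonstrates is the same: SVR is fit on k prototypes instead of all N samples, so the kernel matrix is k x k rather than N x N, and the robust (median) summary damps the pull of outliers.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic 1-D regression data: a sinc target with small noise,
# plus a handful of gross outliers.
N = 500
X = rng.uniform(-3, 3, size=(N, 1))
y = np.sinc(X.ravel()) + 0.05 * rng.standard_normal(N)
bad = rng.choice(N, size=25, replace=False)
y[bad] += rng.uniform(-2, 2, size=25)

# Data reduction: cluster the inputs (k-means here stands in for the
# robust fuzzy c-means of the paper), then summarize each cluster's
# targets with the median, which is insensitive to outliers.
k = 50
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
centers = km.cluster_centers_
targets = np.array([np.median(y[km.labels_ == j]) for j in range(k)])

# Fit SVR on the k prototypes instead of all N points: the kernel
# matrix is now k x k (50 x 50) instead of N x N (500 x 500).
model = SVR(kernel="rbf", C=10.0, gamma=5.0, epsilon=0.01)
model.fit(centers, targets)

X_test = np.linspace(-3, 3, 200).reshape(-1, 1)
y_pred = model.predict(X_test)
```

The hyperparameters (k, C, gamma, epsilon) are illustrative choices for this synthetic data, not values from the paper; in practice they would be selected by cross-validation on the reduced set.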