Support Vector Regression (SVR) has been very successful in pattern recognition, text categorization, and function approximation. In real-world applications, however, the data are often corrupted by noise and outliers. When noise and/or outliers are present in the training data, SVR may try to fit these improper points, and the resulting model can overfit. In addition, the memory required to store the kernel matrix of SVR grows as O(N^2), where N is the number of training data. In this paper, a robust support vector regression is proposed for nonlinear function approximation problems with noise and outliers.
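The O(N^2) storage cost mentioned above follows from the fact that kernel methods materialize an N-by-N Gram matrix over the training set. A minimal NumPy sketch (not the paper's method; the RBF kernel and the sizes here are illustrative assumptions) makes the quadratic growth concrete:

```python
import numpy as np

def rbf_kernel_matrix(X, gamma=1.0):
    """Compute the N x N RBF (Gaussian) Gram matrix for data X of shape (N, d)."""
    sq = np.sum(X ** 2, axis=1)
    # Pairwise squared Euclidean distances via broadcasting
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

rng = np.random.RandomState(0)
N = 500                      # illustrative training-set size
X = rng.randn(N, 3)
K = rbf_kernel_matrix(X)

print(K.shape)               # (500, 500): one entry per pair of training points
print(K.nbytes)              # 500 * 500 * 8 bytes of float64 storage, i.e. O(N^2)
```

Doubling N quadruples both the entry count and the memory footprint of `K`, which is why reduced-set variants of SVR (such as reduced support vector machines) subsample the columns of this matrix.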