In this paper, we combine two ε-insensitive loss functions to construct a non-convex loss function. Based on this non-convex loss, a robust truncated support vector regression (TSVR) is proposed. To solve the TSVR, the concave-convex procedure is used to circumvent the non-convexity by transforming the non-convex problem into a sequence of convex ones. The TSVR is more robust to outliers than classical support vector regression, which gives it advantages in generalization ability and in the number of support vectors. Finally, experiments on synthetic and real-world benchmark data sets confirm the effectiveness of the proposed TSVR.
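As an illustration of the construction described above (a minimal sketch, not the authors' code), a truncated ε-insensitive loss can be written as the difference of two convex ε-insensitive losses with insensitivity parameters ε and s (s > ε); the subtraction caps the loss at s − ε, so any single outlier contributes at most a bounded amount. The parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def eps_insensitive(r, eps):
    """Classical epsilon-insensitive loss: zero inside the tube |r| <= eps."""
    return np.maximum(np.abs(r) - eps, 0.0)

def truncated_loss(r, eps=0.1, s=1.0):
    """Truncated (non-convex) loss as a difference of two convex
    eps-insensitive losses. For |r| <= eps the loss is 0; for
    eps < |r| <= s it grows linearly as |r| - eps; for |r| > s it is
    capped at the constant s - eps, which bounds the influence of
    outliers. eps and s here are illustrative values (assumptions)."""
    assert s > eps, "the truncation level s must exceed eps"
    return eps_insensitive(r, eps) - eps_insensitive(r, s)

# residuals: inside the tube, a moderate error, and a gross outlier
residuals = np.array([0.05, 0.5, 5.0])
print(truncated_loss(residuals, eps=0.1, s=1.0))
# the outlier's contribution is capped at s - eps = 0.9
```

Because the loss is an explicit difference of two convex functions, the concave-convex procedure applies directly: at each iteration the concave part (the subtracted ε-insensitive loss with parameter s) is replaced by its linearization at the current solution, yielding a standard convex SVR subproblem.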