The nature of statistical learning theory
Solving regression problems with rule-based ensemble classifiers
Proceedings of the seventh ACM SIGKDD international conference on Knowledge discovery and data mining
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
An Improved Support Vector Regression Based on Classification
MUE '07 Proceedings of the 2007 International Conference on Multimedia and Ubiquitous Engineering
Adaptive local hyperplane for regression tasks
IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
LIBSVM: A library for support vector machines
ACM Transactions on Intelligent Systems and Technology (TIST)
Regression based on support vector classification
ICANNGA'11 Proceedings of the 10th international conference on Adaptive and natural computing algorithms - Volume Part II
In this article, we provide preliminary theoretical analysis and extended practical experiments for a recently proposed regression method based on representing regression problems as classification problems with duplicated and shifted data. The main results concern the partial equivalence of Bayes solutions for regression problems and the transformed classification problems, and improved Vapnik-Chervonenkis bounds for the proposed method compared to Support Vector Machines. We conducted experiments comparing the proposed method with ε-insensitive Support Vector Regression (ε-SVR) on various synthetic and real-world data sets. The results indicate that the new method achieves generalization performance comparable to ε-SVR while requiring significantly fewer support vectors.
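The core transformation described above can be sketched as follows. This is only an illustrative reconstruction of the general idea, not the paper's exact formulation: each training pair (x, y) is duplicated in the joint input-target space, shifted up by a margin and labeled +1, and shifted down and labeled -1; a standard SVM classifier is then trained on the augmented set, and the regression estimate at a point x is the target value where the decision function crosses zero. The shift amount, kernel parameters, and grid-based root search here are all assumed, hypothetical choices.

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic 1-D regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)

eps = 0.2  # vertical shift; a hypothetical tuning parameter

# Duplicate each sample in (x, y) space: shifted up -> class +1,
# shifted down -> class -1.
Z = np.vstack([np.column_stack([X[:, 0], y + eps]),
               np.column_stack([X[:, 0], y - eps])])
labels = np.hstack([np.ones(len(X)), -np.ones(len(X))])

# Train an ordinary SVM classifier on the transformed problem.
clf = SVC(kernel="rbf", gamma=1.0, C=10.0).fit(Z, labels)

def predict(x, y_grid=np.linspace(-2.0, 2.0, 401)):
    """Regression estimate: the y where the classifier's decision
    function changes sign (found here by a simple grid scan)."""
    pts = np.column_stack([np.full_like(y_grid, x), y_grid])
    scores = clf.decision_function(pts)
    return y_grid[np.argmin(np.abs(scores))]

print(predict(0.0))  # expected to lie close to sin(0) = 0
```

The zero level set of the trained classifier plays the role of the fitted regression function; the abstract's reported reduction in support vectors refers to this classifier needing fewer support vectors than an ε-SVR model of comparable accuracy.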