Twin support vector regression (TSVR) is a recent regression algorithm that finds ε-insensitive up- and down-bound functions for the training points. To do so, it solves a pair of smaller quadratic programming problems (QPPs) rather than the single large QPP of classical SVR. However, TSVR assigns the same penalty to every sample, even though samples at different positions have different effects on the bound functions. In this paper we therefore propose a weighted TSVR, in which samples at different positions are assigned different penalties. The resulting regressor avoids over-fitting to a certain extent and yields good generalization ability. Numerical experiments on one artificial dataset and nine benchmark datasets demonstrate the feasibility and validity of the proposed algorithm.
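The core idea of fitting separate up- and down-bound functions with per-sample penalties can be sketched in a simplified form. The snippet below is not the paper's QPP formulation; it is a hedged least-squares analogue (in the spirit of twin least squares SVR) in which each bound is fit by weighted ridge regression and the final regressor is the mean of the two bounds. The function name, the `weights` argument (standing in for the position-dependent penalties), and the regularizer `lam` are all illustrative assumptions.

```python
import numpy as np

def weighted_twin_ls_regressor(X, y, eps=0.1, lam=1e-3, weights=None):
    """Simplified least-squares sketch of a weighted twin SVR.

    Fits a down-bound targeting y - eps and an up-bound targeting y + eps,
    each by weighted ridge regression; the final regressor is their mean.
    `weights` is a hypothetical stand-in for the position-dependent
    penalties described in the abstract (not the paper's exact scheme).
    """
    n = X.shape[0]
    if weights is None:
        weights = np.ones(n)                     # uniform penalties = plain TSVR-style
    G = np.hstack([X, np.ones((n, 1))])          # augment features with a bias column
    D = np.diag(weights)                         # per-sample penalty matrix
    A = G.T @ D @ G + lam * np.eye(G.shape[1])   # regularized normal-equations matrix
    u_down = np.linalg.solve(A, G.T @ D @ (y - eps))  # down-bound parameters
    u_up = np.linalg.solve(A, G.T @ D @ (y + eps))    # up-bound parameters
    u = 0.5 * (u_down + u_up)                    # final regressor: mean of the bounds
    return lambda Xq: np.hstack([Xq, np.ones((len(Xq), 1))]) @ u
```

On noiseless linear data the averaged bounds recover the underlying line, since the ±eps shifts cancel in the mean; the weights only matter once samples are noisy or unevenly informative.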