The nature of statistical learning theory
Nonlinear black-box modeling in system identification: a unified overview
Automatica (Journal of IFAC) - Special issue on trends in system identification
Matrix computations (3rd ed.)
An introduction to support Vector Machines: and other kernel-based learning methods
SSVM: A Smooth Support Vector Machine for Classification
Computational Optimization and Applications
Time Series Analysis, Forecasting and Control
Text Categorization with Support Vector Machines: Learning with Many Relevant Features
ECML '98 Proceedings of the 10th European Conference on Machine Learning
Training Support Vector Machines: an Application to Face Detection
CVPR '97 Proceedings of the 1997 Conference on Computer Vision and Pattern Recognition (CVPR '97)
Lagrangian support vector machines
The Journal of Machine Learning Research
A Feature Selection Newton Method for Support Vector Machine Classification
Computational Optimization and Applications
epsilon-SSVR: A Smooth Support Vector Machine for epsilon-Insensitive Regression
IEEE Transactions on Knowledge and Data Engineering
Statistical Comparisons of Classifiers over Multiple Data Sets
The Journal of Machine Learning Research
TSVR: An efficient Twin Support Vector Machine for regression
Neural Networks
On Lagrangian support vector regression
Expert Systems with Applications: An International Journal
On finite Newton method for support vector regression
Neural Computing and Applications
Multi-view Laplacian support vector machines
ADMA'11 Proceedings of the 7th international conference on Advanced Data Mining and Applications - Volume Part II
Active set support vector regression
IEEE Transactions on Neural Networks
Multitask multiclass support vector machines: Model and experiments
Pattern Recognition
Multitask twin support vector machines
ICONIP'12 Proceedings of the 19th international conference on Neural Information Processing - Volume Part II
In this paper, a simple reformulation of the Lagrangian dual of the 2-norm support vector regression (SVR) is proposed as an unconstrained minimization problem. This formulation has the advantage that its objective function is strongly convex and has only m variables, where m is the number of input data points. The proposed unconstrained Lagrangian SVR (ULSVR) can be solved by computing the zeros of the gradient of its objective function. However, since the objective function contains the non-smooth 'plus' function, two approaches are followed to solve the proposed optimization problem: (i) replace the 'plus' function by a smooth approximation and solve the resulting slightly modified unconstrained minimization problem; (ii) solve the problem directly by applying a generalized derivative. Computational results on a number of synthetic and real-world benchmark datasets show generalization performance similar to that of conventional SVR, with much faster learning and training times very close to those of least squares SVR, clearly indicating the effectiveness of ULSVR under both the smooth-approximation and generalized-derivative approaches.
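The smoothing approach mentioned in the abstract replaces the non-smooth 'plus' function (x)+ = max(x, 0) with a smooth surrogate. A standard choice in the SSVM literature (Lee and Mangasarian) is p(x, α) = x + (1/α)·log(1 + e^(−αx)), which converges to (x)+ as α → ∞ and has the everywhere-smooth derivative sigmoid(αx). The sketch below is illustrative only, not the authors' implementation; the function names and the default α are assumptions:

```python
import math


def plus(x):
    """Non-smooth 'plus' function: (x)+ = max(x, 0)."""
    return max(x, 0.0)


def smooth_plus(x, alpha=5.0):
    """Smooth approximation p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha*x)).

    As alpha -> infinity, p(x, alpha) -> max(x, 0); p always lies
    slightly above the plus function since the log term is positive.
    """
    # Two algebraically equivalent forms, chosen to avoid overflow
    # in exp() for large |alpha * x|.
    if alpha * x > 0:
        return x + math.log1p(math.exp(-alpha * x)) / alpha
    return math.log1p(math.exp(alpha * x)) / alpha


def smooth_plus_grad(x, alpha=5.0):
    """Derivative of smooth_plus: sigmoid(alpha * x), smooth everywhere,
    unlike the plus function, which is non-differentiable at x = 0."""
    return 1.0 / (1.0 + math.exp(-alpha * x))
```

Because the surrogate is twice differentiable, the slightly modified unconstrained problem can be minimized with standard gradient or Newton iterations, which is what makes approach (i) attractive.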