Nonlinear complementarity as unconstrained and constrained minimization
Mathematical Programming: Series A and B - Special issue: Festschrift in Honor of Philip Wolfe part II: studies in nonlinear programming
The nature of statistical learning theory
Parallel Gradient Distribution in Unconstrained Optimization
SIAM Journal on Control and Optimization
Matrix computations (3rd ed.)
Using support vector machines for time series prediction
Advances in kernel methods
An Introduction to Support Vector Machines and Other Kernel-based Learning Methods
Optimal control by least squares support vector machines
Neural Networks
SSVM: A Smooth Support Vector Machine for Classification
Computational Optimization and Applications
Time Series Analysis, Forecasting and Control
Training Support Vector Machines: an Application to Face Detection
CVPR '97 Proceedings of the 1997 Conference on Computer Vision and Pattern Recognition (CVPR '97)
Lagrangian support vector machines
The Journal of Machine Learning Research
epsilon-SSVR: A Smooth Support Vector Machine for epsilon-Insensitive Regression
IEEE Transactions on Knowledge and Data Engineering
Bankruptcy prediction using support vector machine with optimal choice of kernel function parameters
Expert Systems with Applications: An International Journal
On Lagrangian support vector regression
Expert Systems with Applications: An International Journal
On finite Newton method for support vector regression
Neural Computing and Applications
Error tolerance based support vector machine for regression
Neurocomputing
LIBSVM: A library for support vector machines
ACM Transactions on Intelligent Systems and Technology (TIST)
Active set support vector regression
IEEE Transactions on Neural Networks
Smooth Newton method for implicit Lagrangian twin support vector regression
International Journal of Knowledge-based and Intelligent Engineering Systems
In this paper, a finite Newton iterative method for solving the implicit Lagrangian support vector regression (SVR) formulation is proposed. Unlike the standard SVR, which requires solving a quadratic programming problem, the proposed method obtains its solution by solving a system of linear equations at each iteration of the algorithm. Finite termination of the method is established for both linear and nonlinear SVR. The algorithm converges from any starting point and requires no optimization packages. Experiments were performed on a number of synthetic and real-world datasets, and the results obtained by the proposed method are compared with the standard SVR. The similar or better generalization performance of the proposed method clearly demonstrates its effectiveness and applicability.
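To illustrate the kind of iteration the abstract describes, the following is a minimal sketch of a generalized (finite) Newton method for an unconstrained smooth reformulation of linear SVR with a squared epsilon-insensitive loss. This is an assumption-laden toy model, not the authors' exact implicit Lagrangian formulation: the objective, the function name `newton_svr`, and all parameter defaults are illustrative. The key property it shares with the paper's method is that each iteration requires only the solution of one system of linear equations, with no quadratic programming solver.

```python
import numpy as np

def newton_svr(X, y, C=10.0, eps=0.1, tol=1e-8, max_iter=100):
    """Generalized Newton iteration for the illustrative problem
        min_w  0.5*||w||^2 + (C/2) * sum_i max(|x_i.w - y_i| - eps, 0)^2
    (a smooth, piecewise-quadratic stand-in for the SVR formulations
    discussed in the paper; hypothetical, for exposition only)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(max_iter):
        r = X @ w - y                                  # residuals
        excess = np.maximum(np.abs(r) - eps, 0.0)      # amount outside the eps-tube
        grad = w + C * X.T @ (np.sign(r) * excess)     # gradient of the objective
        if np.linalg.norm(grad) < tol:
            break
        active = (np.abs(r) > eps).astype(float)       # generalized Hessian mask
        H = np.eye(d) + C * X.T @ (active[:, None] * X)
        w = w - np.linalg.solve(H, grad)               # one linear system per step
    return w
```

Because the objective is strictly convex and piecewise quadratic, such Newton iterations typically identify the correct active set and then terminate in a finite number of steps, which mirrors the finite-termination property claimed for the proposed method.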