In this paper, a novel smoothing-function method for the 1-norm Support Vector Regression (SVR) is proposed, in an attempt to overcome some drawbacks of earlier methods, which are complex, subtle, and sometimes difficult to implement. A smoothing model of the 1-norm Support Vector Machine (SVM) is derived from the optimization problem, yet it is a discrete program. Using the smoothing technique and optimality conditions, the discrete program is converted into a continuous one. Experimental results show that the algorithm is easy to implement, fast, and insensitive to the initial point. Theoretical analysis illustrates that the smoothing-function method for the 1-norm SVM is feasible and effective.
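The core idea of a smoothing method is to replace the nondifferentiable absolute-value terms of the 1-norm objective with a smooth surrogate so that gradient-based solvers apply. The abstract does not give the paper's exact smoothing function, so the sketch below uses one common hypothetical choice, sqrt(x^2 + mu^2) - mu, whose error against |x| is bounded by the smoothing parameter mu:

```python
import math

def smooth_abs(x, mu):
    """Smooth surrogate for |x| (hypothetical choice, not necessarily
    the paper's function): sqrt(x^2 + mu^2) - mu.
    Satisfies 0 <= |x| - smooth_abs(x, mu) <= mu, so it converges
    uniformly to |x| as mu -> 0."""
    return math.sqrt(x * x + mu * mu) - mu

def smooth_abs_grad(x, mu):
    """Gradient of the surrogate; defined everywhere, including x = 0,
    where |x| itself is nondifferentiable."""
    return x / math.sqrt(x * x + mu * mu)
```

Summing `smooth_abs` over the weight vector gives a differentiable stand-in for the 1-norm regularizer, which can then be minimized by any standard continuous-optimization routine; shrinking `mu` toward zero recovers the original nonsmooth problem.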