We propose a novel least squares support vector machine, named the ε-least squares support vector machine (ε-LSSVM), for binary classification. By replacing the quadratic loss function in LSSVM with the ε-insensitive loss function, ε-LSSVM offers several advantages over the plain LSSVM. (1) It is sparse, with the sparseness controlled by the parameter ε. (2) By weighting a different sparseness parameter ε for each class, unbalanced problems can be handled successfully; furthermore, a useful choice of the parameter ε is proposed. (3) It is in fact a kind of ε-support vector regression (ε-SVR), the only difference being that it treats the binary classification problem as a special kind of regression problem. (4) Consequently, it can be implemented efficiently by the sequential minimal optimization (SMO) method for large-scale problems. Experimental results on several benchmark datasets demonstrate the effectiveness of our method in sparseness, balance performance, and classification accuracy, further confirming the conclusions above.
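Point (3) above can be illustrated with a minimal sketch: train an ε-SVR on ±1 class labels and classify by the sign of the regression output. The sketch below uses scikit-learn's generic `SVR` as a stand-in solver and synthetic Gaussian data; it is an assumption for illustration, not the paper's own ε-LSSVM formulation, but it shows how ε governs sparseness (points fitted within the ±ε tube of their target are not support vectors).

```python
# Sketch of idea (3): binary classification as epsilon-insensitive regression
# on +/-1 targets, thresholding the regressor at zero.
# NOTE: sklearn's SVR is a stand-in, not the paper's epsilon-LSSVM solver.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# Two well-separated Gaussian blobs, labeled -1 and +1.
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])

# epsilon controls the insensitive tube and hence sparseness: training points
# whose fitted value lies within +/-epsilon of the +/-1 target drop out of
# the support-vector set.
model = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(X, y)

pred = np.sign(model.predict(X))                 # classify by sign
accuracy = float(np.mean(pred == y))
sparsity = 1.0 - len(model.support_) / len(X)    # fraction of non-SVs
```

Increasing ε enlarges the tube, so more points are fitted exactly "well enough" and the solution becomes sparser, at the cost of a coarser fit; this is the trade-off the abstract attributes to the parameter ε.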