The nature of statistical learning theory
Discovering informative patterns and data cleaning
Advances in knowledge discovery and data mining
Robust cross-validation score function for non-linear function estimation
ICANN '02 Proceedings of the International Conference on Artificial Neural Networks
Sparseness of support vector machines
The Journal of Machine Learning Research
On robustness properties of convex risk minimization methods for pattern recognition
The Journal of Machine Learning Research
Sparseness vs. estimating conditional probabilities: some asymptotic results
The Journal of Machine Learning Research
Sparse least squares support vector training in the reduced empirical feature space
Pattern Analysis & Applications
Robustness of reweighted least squares kernel based regression
Journal of Multivariate Analysis
Evolution strategies based adaptive Lp LS-SVM
Information Sciences: an International Journal
A mixed effects least squares support vector machine model for classification of longitudinal data
Computational Statistics & Data Analysis
Geoadditive expectile regression
Computational Statistics & Data Analysis
Weighted kernel Fisher discriminant analysis for integrating heterogeneous data
Computational Statistics & Data Analysis
Robust support vector machine with bullet hole image classification
IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews
In classification, the support vector machine (SVM) seeks a large margin between two classes. The margin is usually measured by the minimal distance between the two sets, which corresponds to the hinge loss or the squared hinge loss. However, this minimal value is sensitive to noise and unstable under re-sampling. To overcome this drawback, the expectile value is used to measure the margin between the classes instead of the minimal value. Motivated by the relation between the expectile value and the asymmetric squared loss, an asymmetric least squares SVM (aLS-SVM) is proposed. The proposed aLS-SVM can also be regarded as an extension of the LS-SVM and the L2-SVM. Theoretical analysis and numerical experiments show that the aLS-SVM is insensitive to noise near the decision boundary and stable under re-sampling.
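To illustrate the key idea, the sketch below shows the asymmetric squared loss and how minimizing it yields an expectile, which interpolates between the minimum (p→0), the mean (p=0.5), and the maximum (p→1) of a sample. This is a minimal, self-contained illustration of the expectile concept, not the paper's actual aLS-SVM formulation; the function names and the fixed-point solver are this example's own.

```python
import numpy as np

def asymmetric_squared_loss(r, p=0.9):
    """Asymmetric squared loss: weights positive residuals by p and
    negative residuals by (1 - p). Minimizing its mean over a shift
    parameter recovers the p-expectile of the data."""
    w = np.where(r >= 0, p, 1.0 - p)
    return w * r ** 2

def expectile(x, p=0.9, tol=1e-10):
    """Compute the p-expectile of a sample by fixed-point iteration
    on the first-order condition sum(w * (x - m)) = 0, where
    w = p for x >= m and w = 1 - p otherwise."""
    m = np.mean(x)
    while True:
        w = np.where(x >= m, p, 1.0 - p)
        m_new = np.sum(w * x) / np.sum(w)
        if abs(m_new - m) < tol:
            return m_new
        m = m_new

# At p = 0.5 the expectile coincides with the sample mean; as p grows
# it moves smoothly toward the maximum, unlike the (noise-sensitive)
# minimal/maximal value itself.
x = np.array([0.0, 1.0, 2.0, 10.0])
assert abs(expectile(x, 0.5) - x.mean()) < 1e-8
assert expectile(x, 0.9) > x.mean()
```

Because the expectile averages over all points (with asymmetric weights) rather than depending on a single extreme point, a margin measured by an expectile is less sensitive to boundary noise and more stable under re-sampling, which is the motivation the abstract gives for the aLS-SVM.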