Least squares support vector machine for regression (LSSVR) is an efficient method for function estimation problems. However, its solution is sensitive to large noise and outliers because it minimizes the sum of squared errors (SSE) over the training samples. To tackle this problem, this paper proposes a novel regression model, termed recursive robust LSSVR (R^2LSSVR), which yields robust estimates for data contaminated by outliers. The idea is to build a regression model in the kernel space based on the maximum correntropy criterion and a regularization technique. An iterative algorithm derived from half-quadratic optimization is further developed to solve R^2LSSVR with theoretically guaranteed convergence. The analysis also reveals that R^2LSSVR is closely related to the original LSSVR, since it essentially solves an adaptively weighted LSSVR at each iteration. Furthermore, a hyperparameter selection method for R^2LSSVR based on particle swarm optimization (PSO) is presented, so that the multiple hyperparameters of R^2LSSVR can be estimated effectively for better performance. The feasibility of the method is examined on simulated and benchmark datasets, and the experimental results demonstrate the good robustness of the proposed method.
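To make the iterative reweighting idea concrete, the following is a minimal sketch (not the authors' implementation) of how half-quadratic optimization of a correntropy-based loss reduces to repeatedly solving a weighted LSSVR system: residuals are mapped through a Gaussian weight so that outlying samples are progressively down-weighted. The function and parameter names (robust_lssvr, gamma, corr_sigma, kernel_width) are illustrative assumptions, and the PSO-based hyperparameter selection described in the abstract is omitted.

```python
import numpy as np

def rbf_kernel(X1, X2, kernel_width):
    """Gaussian (RBF) kernel matrix between row-sample matrices X1 and X2."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-d2 / (2.0 * kernel_width**2))

def weighted_lssvr(K, y, gamma, w):
    """Solve the weighted LSSVR linear system for dual coefficients alpha and bias b.

    Each sample i carries weight w[i]; w = 1 recovers standard LSSVR.
    """
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * w))
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def robust_lssvr(X, y, gamma=10.0, kernel_width=1.0, corr_sigma=1.0,
                 n_iter=20, tol=1e-6):
    """Illustrative iteratively reweighted LSSVR (half-quadratic style):
    weights are Gaussian functions of the residuals, so samples with large
    residuals (outliers) contribute less at the next iteration.
    """
    K = rbf_kernel(X, X, kernel_width)
    w = np.ones(len(y))
    alpha, b = weighted_lssvr(K, y, gamma, w)
    for _ in range(n_iter):
        residuals = y - (K @ alpha + b)
        w = np.exp(-residuals**2 / (2.0 * corr_sigma**2))  # correntropy-induced weights
        w = np.maximum(w, 1e-8)                            # keep the system well conditioned
        alpha_new, b_new = weighted_lssvr(K, y, gamma, w)
        if np.max(np.abs(alpha_new - alpha)) < tol:
            alpha, b = alpha_new, b_new
            break
        alpha, b = alpha_new, b_new
    return alpha, b

# Toy usage: sinc regression with a few gross outliers injected.
rng = np.random.default_rng(0)
X = np.linspace(-5, 5, 100).reshape(-1, 1)
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(100)
y[::17] += 3.0  # outliers
alpha, b = robust_lssvr(X, y)
y_hat = rbf_kernel(X, X, 1.0) @ alpha + b
```

In this sketch the hyperparameters (gamma, kernel_width, corr_sigma) are fixed by hand; in the paper they would instead be selected by the PSO-based procedure, and the convergence of the reweighting loop follows from the half-quadratic analysis.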