Convex Optimization
Cutting plane method for continuously constrained kernel-based regression (IEEE Transactions on Neural Networks)
Learning Transformation Models for Ranking and Survival Analysis (The Journal of Machine Learning Research)
Multivariate convex support vector regression with semidefinite programming (Knowledge-Based Systems)
Nonparametric bivariate copula estimation based on shape-restricted support vector regression (Knowledge-Based Systems)
Modeling financial dependence with support vector regression (Intelligent Data Analysis)
This paper considers the estimation of monotone nonlinear regression functions based on Support Vector Machines (SVMs), Least Squares SVMs (LS-SVMs) and other kernel machines. It illustrates how the primal-dual optimization framework characterizing LS-SVMs can be employed to derive a globally optimal one-stage estimator for monotone regression. As a practical application, the paper considers the smooth estimation of cumulative distribution functions (cdfs), which leads to a kernel regressor incorporating a Kolmogorov-Smirnov discrepancy measure, a Tikhonov-based regularization scheme and a monotonicity constraint.
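The core idea, a one-stage monotone kernel regression obtained by adding monotonicity constraints to a regularized kernel least-squares problem, can be sketched as a small constrained quadratic program. The sketch below is an illustration under assumptions, not the paper's exact primal-dual LS-SVM derivation: it fits an RBF kernel ridge regressor to noisy cdf-like data and enforces nondecreasing fitted values at grid points, using SciPy's generic SLSQP solver as a stand-in for a dedicated QP solver. All names and parameter values (`s`, `gamma`, the grid size) are illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize

def rbf(X, Z, s=0.3):
    """RBF (Gaussian) kernel matrix between 1-d sample vectors X and Z."""
    d = X[:, None] - Z[None, :]
    return np.exp(-(d ** 2) / (2 * s ** 2))

# Noisy, cdf-like (monotone) training data.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 40))
y = np.clip(x + 0.1 * rng.normal(size=40), 0.0, 1.0)

K = rbf(x, x)                      # kernel matrix on the training points
grid = np.linspace(0, 1, 50)       # points where monotonicity is imposed
Kg = rbf(grid, x)                  # kernel evaluations grid-vs-training
D = np.diff(np.eye(50), axis=0)    # first-difference operator: (D f)[i] = f[i+1] - f[i]
gamma = 1e-3                       # Tikhonov-style regularization weight

def obj(a):
    # Least-squares data fit plus kernel-norm (Tikhonov) regularization.
    r = K @ a - y
    return r @ r + gamma * a @ K @ a

# Monotonicity: successive fitted values on the grid must not decrease.
cons = {"type": "ineq", "fun": lambda a: D @ (Kg @ a)}
res = minimize(obj, np.zeros(40), constraints=[cons],
               method="SLSQP", options={"maxiter": 500})

f_grid = Kg @ res.x  # monotone (nondecreasing) fitted curve on the grid
```

Because both the objective and the linear inequality constraints are convex in the coefficient vector `a`, the solver returns a globally optimal estimate in a single stage, which is the practical point the abstract makes about the primal-dual formulation.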