As one of the most important nonparametric regression methods, support vector regression achieves nonlinear modeling capability via the kernel trick. This paper discusses multivariate support vector regression when the regression function is restricted to be convex. The convex shape restriction is approximated by a series of linear matrix inequality constraints, which transforms training into a semidefinite programming problem that is computationally tractable. Extensions to the multivariate concave case, to ℓ2-norm regularization, and to ℓ1- and ℓ2-norm loss functions are also studied. Experimental results on both toy data sets and a real data set clearly show that, by exploiting this prior shape knowledge, the proposed method achieves better performance than classical support vector regression.
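To illustrate the idea of shape-restricted regression, the sketch below fits a one-dimensional analogue: in 1-D the convexity constraint (a positive-semidefinite Hessian, expressed in the paper as linear matrix inequalities) reduces to nonnegative second differences of the fitted values, so the problem becomes a small quadratic program rather than the paper's full kernelized semidefinite program. This is a toy sketch using SciPy's general-purpose SLSQP solver, not the authors' method; the data and variable names are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Noisy samples of a convex target function (illustrative toy data).
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 25)
y = x**2 + rng.normal(scale=0.3, size=x.size)

def loss(f):
    # Squared (l2-norm) loss between fitted values and observations.
    return np.sum((f - y) ** 2)

# 1-D convexity constraint: f[i+1] - 2 f[i] + f[i-1] >= 0 at every
# interior point (the scalar analogue of requiring a PSD Hessian).
cons = [{"type": "ineq",
         "fun": lambda f, i=i: f[i + 1] - 2.0 * f[i] + f[i - 1]}
        for i in range(1, x.size - 1)]

res = minimize(loss, y.copy(), constraints=cons, method="SLSQP")
fhat = res.x
d2 = fhat[2:] - 2.0 * fhat[1:-1] + fhat[:-2]
print(res.success, d2.min())
```

Because the fit is the Euclidean projection of the noisy samples onto the cone of convex sequences, and the true values lie in that cone, the shape-constrained fit is never farther from the truth than the raw data, which mirrors the paper's observation that prior shape knowledge improves accuracy.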