Incorporating constraints into kernel-based regression is an effective way to improve regression performance. In many applications, however, the constraints are continuous with respect to some parameters, which raises computational difficulties. Discretizing the constraints is a reasonable remedy, but in the context of kernel-based regression most existing works adopt an a priori discretization strategy, which suffers from two inherent deficiencies: it cannot guarantee that the regression result fully satisfies the original continuous constraints, and it can hardly handle high-dimensional problems. This paper proposes a cutting plane method (CPM) for constrained kernel-based regression problems and a relaxed CPM (R-CPM) for high-dimensional problems. The CPM discretizes the continuous constraints iteratively and guarantees that the regression result strictly satisfies the original constraints. For high-dimensional problems, the R-CPM accepts a slight, controlled constraint violation in exchange for a computational complexity that is independent of the dimension. The validity of the proposed methods is verified by numerical experiments.
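The iterative discretization idea can be illustrated with a minimal sketch, not the authors' exact algorithm: a 1-D kernel regression with a monotonicity constraint f'(t) >= 0 on [0, 1]. The constraint is continuous in t; the cutting-plane loop starts from the unconstrained solution, searches a fine grid for the most violated point, adds that point as a discretized constraint, and refits. The Gaussian kernel, the coefficient-norm regularizer (a simplification of the usual RKHS penalty), and all parameter values below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

gamma, lam = 10.0, 1e-2  # assumed kernel width and regularization weight

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 20))
y = x + 0.1 * rng.standard_normal(20)  # monotone trend plus noise

# Gaussian kernel matrix on the training inputs
K = np.exp(-gamma * (x[:, None] - x[None, :]) ** 2)

def objective(a):
    # squared loss plus a ridge penalty on the coefficients
    r = K @ a - y
    return r @ r + lam * a @ a

def slope(a, t):
    # f'(t) for f(t) = sum_i a_i * exp(-gamma * (t - x_i)^2)
    return a @ (-2.0 * gamma * (t - x) * np.exp(-gamma * (t - x) ** 2))

grid = np.linspace(0.0, 1.0, 200)  # search grid for violated constraints
active = []                         # discretized constraint points found so far

# closed-form unconstrained solution: (K^T K + lam I) a = K^T y
a = np.linalg.solve(K @ K + lam * np.eye(len(x)), K @ y)

for _ in range(30):
    slopes = np.array([slope(a, t) for t in grid])
    if slopes.min() >= -1e-6:
        break                       # monotonicity holds everywhere on the grid
    active.append(grid[np.argmin(slopes)])  # add the worst violated point
    cons = [{"type": "ineq", "fun": lambda a, t=t: slope(a, t)} for t in active]
    a = minimize(objective, a, constraints=cons, method="SLSQP").x
```

The loop typically terminates after a handful of cuts, since only constraints that actually bind are ever discretized; this is the practical advantage over fixing a dense a priori grid of constraint points.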