Cutting plane method for continuously constrained kernel-based regression

  • Authors:
Zhe Sun; Zengke Zhang; Huangang Wang; Min Jiang

  • Affiliations:
Department of Automation, Tsinghua University, Beijing, China; Department of Automation, Tsinghua University, Beijing, China; Department of Automation, Tsinghua University, Beijing, China; School of Electronics and Information Engineering, Soochow University, Suzhou, China

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2010

Abstract

Incorporating constraints into kernel-based regression is an effective means of improving regression performance. Nevertheless, in many applications the constraints are continuous with respect to some parameters, which gives rise to computational difficulties. Discretizing the constraints is a reasonable way to overcome these difficulties. However, in the context of kernel-based regression, most existing works adopt a prior discretization strategy, which suffers from two inherent deficiencies: it cannot ensure that the regression result fully satisfies the original constraints, and it can hardly handle high-dimensional problems. This paper proposes a cutting plane method (CPM) for constrained kernel-based regression problems and a relaxed CPM (R-CPM) for high-dimensional problems. The CPM discretizes the continuous constraints iteratively and ensures that the regression result strictly satisfies the original constraints. For high-dimensional problems, the R-CPM accepts a slight and controlled violation to attain a dimension-independent computational complexity. The validity of the proposed methods is verified by numerical experiments.
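
To make the iterative discretization idea concrete, the following is a minimal sketch of a generic cutting-plane loop for a continuously constrained kernel regression problem. It is not the paper's CPM or R-CPM formulation: the RBF kernel, the kernel ridge objective, the nonnegativity constraint f(t) >= 0 over [0, 1], the fine-grid search for the most violated point, and the SLSQP solver are all illustrative assumptions. The loop alternates between solving the problem with the current finite set of constraint points ("cuts") and adding the point where the continuous constraint is most violated, until no violation beyond a tolerance remains.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(A, B, gamma=10.0):
    """Gaussian RBF kernel matrix between the rows of A and B (illustrative choice)."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

# Toy data: noisy samples of a nonnegative target on [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (30, 1))
y = np.maximum(np.sin(2 * np.pi * X[:, 0]), 0.0) + 0.05 * rng.standard_normal(30)

K = rbf_kernel(X, X)
lam, eps = 1e-2, 1e-4                      # regularization and violation tolerance
t_grid = np.linspace(0, 1, 2001)[:, None]  # fine grid approximating the continuous domain

def fit(cut_points):
    """Kernel ridge fit with nonnegativity enforced only at the current cut points."""
    def obj(a):
        r = K @ a - y
        return r @ r + lam * a @ K @ a
    cons = []
    if cut_points:
        Kc = rbf_kernel(np.vstack(cut_points), X)
        cons = [{"type": "ineq", "fun": lambda a, Kc=Kc: Kc @ a}]  # f(t_j) >= 0 at each cut
    return minimize(obj, np.zeros(len(X)), method="SLSQP", constraints=cons).x

# Cutting-plane loop: solve, locate the worst violation of f(t) >= 0 on the
# (grid-approximated) continuous domain, add it as a new cut, and re-solve.
cuts, alpha = [], fit([])
for it in range(50):
    f_grid = rbf_kernel(t_grid, X) @ alpha
    worst = np.argmin(f_grid)
    if f_grid[worst] >= -eps:      # constraint holds everywhere (up to eps): stop
        break
    cuts.append(t_grid[worst])     # add the most violated point as a cut
    alpha = fit(cuts)

print(f"{len(cuts)} cuts, min f on grid = {(rbf_kernel(t_grid, X) @ alpha).min():.2e}")
```

In this sketch only a handful of cuts are typically needed before the constraint holds over the whole grid, which is the practical appeal of iterative discretization over fixing a dense discretization in advance.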