Quadratically Constrained Quadratic Programming for Subspace Selection in Kernel Regression Estimation

  • Authors:
  • Marco Signoretto, Kristiaan Pelckmans, Johan A. Suykens

  • Affiliation:
  • K.U. Leuven, ESAT-SCD, Leuven, Belgium B-3001

  • Venue:
  • ICANN '08 Proceedings of the 18th international conference on Artificial Neural Networks, Part I
  • Year:
  • 2008

Abstract

In this contribution we consider the problem of regression estimation. We elaborate on a framework based on functional analysis that gives rise to structured models in the context of reproducing kernel Hilbert spaces. In this setting, the task of input selection is converted into the task of selecting functional components that depend on one (or more) of the inputs. In turn, learning with embedded selection of such components can be formalized as a convex-concave optimization problem. This results in a practical algorithm that can be implemented as a quadratically constrained quadratic programming (QCQP) problem. We further investigate the selection mechanism for the class of linear functions, establishing a relationship with the LASSO.
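For the linear case mentioned at the end of the abstract, the selection mechanism relates to the LASSO, i.e. least squares subject to an L1-norm budget: min_w ||y - Xw||^2 s.t. ||w||_1 <= t. The sketch below is not the paper's QCQP algorithm; it is a minimal pure-Python illustration of that constrained formulation, solved by projected gradient descent with the standard Euclidean projection onto the L1 ball. All names and the toy data are assumptions for illustration.

```python
def project_l1_ball(v, t):
    """Euclidean projection of vector v onto the L1 ball {w : ||w||_1 <= t}."""
    if sum(abs(x) for x in v) <= t:
        return list(v)
    # Sort magnitudes in decreasing order and find the threshold theta.
    u = sorted((abs(x) for x in v), reverse=True)
    css, rho = 0.0, 0
    for i, ui in enumerate(u, start=1):
        css += ui
        if ui - (css - t) / i > 0:
            rho = i
    theta = (sum(u[:rho]) - t) / rho
    # Soft-threshold each coordinate by theta, keeping its sign.
    return [(1.0 if x >= 0 else -1.0) * max(abs(x) - theta, 0.0) for x in v]


def lasso_constrained(X, y, t, step, iters=200):
    """Projected gradient descent for min ||y - Xw||^2 s.t. ||w||_1 <= t."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(iters):
        # Residual r = Xw - y and gradient 2 * X^T r of the squared loss.
        r = [sum(X[i][j] * w[j] for j in range(d)) - y[i] for i in range(n)]
        grad = [2.0 * sum(X[i][j] * r[i] for i in range(n)) for j in range(d)]
        w = project_l1_ball([w[j] - step * grad[j] for j in range(d)], t)
    return w


# Toy data with orthogonal columns: y depends only on the first feature,
# so a tight L1 budget should zero out the second weight entirely.
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 0.0], [0.0, -1.0]]
y = [3.0, 0.0, 3.0, 0.0]
w = lasso_constrained(X, y, t=1.0, step=0.25)
```

With this data the constrained solution concentrates the whole budget on the relevant feature (w ≈ [1, 0]), mirroring how the paper's component selection drives entire functional components to zero.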