Support Vector Regression for the simultaneous learning of a multivariate function and its derivatives

  • Authors:
  • Marcelino Lázaro; Ignacio Santamaría; Fernando Pérez-Cruz; Antonio Artés-Rodríguez

  • Affiliations:
  • Marcelino Lázaro: Departamento de Teoría de la Señal y Comunicaciones, Universidad Carlos III, Leganés 28911, Madrid, Spain
  • Ignacio Santamaría: Departamento de Ingeniería de Comunicaciones, Universidad de Cantabria, 39005 Santander, Spain
  • Fernando Pérez-Cruz: Departamento de Teoría de la Señal y Comunicaciones, Universidad Carlos III, Leganés 28911, Madrid, Spain and Gatsby Computational Neuroscience Unit, UCL, Alexandra House, 17 Queen ...
  • Antonio Artés-Rodríguez: Departamento de Teoría de la Señal y Comunicaciones, Universidad Carlos III, Leganés 28911, Madrid, Spain

  • Venue:
  • Neurocomputing
  • Year:
  • 2005

Abstract

In this paper, the problem of simultaneously approximating a function and its derivatives is formulated within the Support Vector Machine (SVM) framework. First, the problem is solved for a one-dimensional input space by using the ε-insensitive loss function and introducing additional constraints on the approximation of the derivative. The method is then extended to multi-dimensional input spaces via a multidimensional regression algorithm. In both cases, to optimize the regression estimation problem, we derive an iterative re-weighted least squares (IRWLS) procedure that is fast for moderate-size problems. Results show that using derivative information significantly improves the reconstruction of the function.
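To illustrate the core idea (though not the authors' exact ε-insensitive SVR formulation or their IRWLS solver), the sketch below fits a one-dimensional kernel expansion by plain regularized least squares, stacking constraints on both the function samples and the derivative samples. The Gaussian kernel width `sigma` and regularizer `lam` are illustrative choices, not values from the paper.

```python
import numpy as np

def gauss_kernel(x, z, sigma):
    """Pairwise Gaussian kernel matrix and the difference matrix x_i - z_j."""
    d = x[:, None] - z[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2)), d

def fit_with_derivatives(x, y, dy, sigma=0.5, lam=1e-6):
    """Fit f(x) = sum_j alpha_j k(x, x_j) to function samples y AND
    derivative samples dy by stacking both sets of linear constraints.
    This is a least-squares analogue of the paper's idea, not its QP."""
    K, d = gauss_kernel(x, x, sigma)
    dK = -(d / sigma**2) * K            # d/dx of k(x, x_j), rows = eval points
    A = np.vstack([K, dK])              # stacked design: values then derivatives
    b = np.concatenate([y, dy])
    # ridge-regularized normal equations
    alpha = np.linalg.solve(A.T @ A + lam * np.eye(len(x)), A.T @ b)
    return alpha

def predict(alpha, x_train, x_new, sigma=0.5):
    K, _ = gauss_kernel(x_new, x_train, sigma)
    return K @ alpha

# Toy example: recover f(x) = sin(x) from 15 samples of f and f'.
x = np.linspace(0.0, 2.0 * np.pi, 15)
alpha = fit_with_derivatives(x, np.sin(x), np.cos(x))
x_test = np.linspace(0.0, 2.0 * np.pi, 100)
max_err = np.max(np.abs(predict(alpha, x, x_test) - np.sin(x_test)))
```

Using the derivative rows in `A` doubles the number of constraints per sample point, which is exactly the extra information the paper exploits; dropping `dK` reduces this to ordinary kernel ridge regression on function values alone.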