Support vector regression methods for functional data

  • Authors:
  • Noslen Hernández; Rolando J. Biscay; Isneri Talavera

  • Affiliations:
  • Advanced Technologies Applications Center; Institute of Mathematics, Physics and Computation; Advanced Technologies Applications Center

  • Venue:
  • CIARP'07: Proceedings of the 12th Iberoamerican Congress on Pattern Recognition: Progress in Pattern Recognition, Image Analysis and Applications
  • Year:
  • 2007


Abstract

Many regression tasks in practice involve digitized functions as predictor variables. This has motivated the development of regression methods for functional data. In particular, Nadaraya-Watson Kernel (NWK) and Radial Basis Function (RBF) estimators have recently been extended to functional nonparametric regression models. However, these methods do not allow for dimensionality reduction. For this purpose, we introduce Support Vector Regression (SVR) methods for functional data. These are formulated in the framework of approximation in reproducing kernel Hilbert spaces. On this general basis, some of their properties are investigated, emphasizing the construction of nonnegative definite kernels on functional spaces. Furthermore, the performance of SVR for functional variables is shown on a real-world benchmark spectrometric data set, together with comparisons against the NWK and RBF methods. All three approaches yielded good predictions, but SVR additionally achieved about a 20% reduction in dimensionality.
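As a rough illustration of the approach the abstract describes, the sketch below builds a nonnegative definite (Gaussian) kernel on a function space from the L2 distance between discretized curves and feeds it to a standard SVR solver. This is a minimal sketch under assumptions: it uses scikit-learn's `SVR` with a precomputed kernel and synthetic curves rather than the paper's spectrometric benchmark, and the kernel choice and hyperparameters are illustrative, not the authors' exact construction.

```python
# Hedged sketch: SVR on functional data via a Gaussian kernel defined through
# the L2 distance between digitized functions (approximated by a Riemann sum).
# Synthetic data stands in for the paper's spectrometric benchmark.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)       # common sampling grid for the curves
n = 200

# Synthetic "spectra": sinusoids with random phase; scalar response
# depends smoothly on the underlying curve.
phases = rng.uniform(0, np.pi, n)
X = np.sin(2 * np.pi * t[None, :] + phases[:, None])  # each row: one function
y = np.cos(phases)                                    # scalar response

dt = t[1] - t[0]

def functional_gaussian_kernel(A, B, sigma=1.0):
    """Nonnegative definite kernel K(f, g) = exp(-||f - g||_L2^2 / (2 sigma^2)),
    with the L2 norm approximated on the shared grid."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2) * dt
    return np.exp(-d2 / (2 * sigma ** 2))

K_train = functional_gaussian_kernel(X, X)
model = SVR(kernel="precomputed", C=10.0, epsilon=0.01).fit(K_train, y)

# The sparsity of the SVR solution (only the support vectors enter the
# expansion) is the kind of dimensionality reduction the abstract refers to.
n_sv = len(model.support_)
pred = model.predict(K_train)
print(n_sv, float(np.abs(pred - y).mean()))
```

In contrast to NWK and RBF estimators, which keep all training curves in the predictor, the fitted SVR expansion here involves only the `n_sv` support vectors.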