Support vector regression using Mahalanobis kernels

  • Authors:
  • Yuya Kamada; Shigeo Abe

  • Affiliations:
  • Graduate School of Science and Technology, Kobe University, Kobe, Japan (both authors)

  • Venue:
  • ANNPR'06: Proceedings of the Second International Conference on Artificial Neural Networks in Pattern Recognition
  • Year:
  • 2006

Abstract

In our previous work we showed that Mahalanobis kernels are useful for support vector classifiers in terms of both generalization ability and model selection speed. In this paper we propose using Mahalanobis kernels for function approximation. The covariance matrix for the Mahalanobis kernel is determined using all the training data. Model selection is done by line search: first the margin parameter and the error threshold are optimized, and then the kernel parameter is optimized. Computer experiments on four benchmark problems show that the estimation performance of a Mahalanobis kernel with a diagonal covariance matrix optimized by line search is comparable to or better than that of an RBF kernel optimized by grid search.
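The diagonal-covariance Mahalanobis kernel described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the kernel form `exp(-delta * (x - y)^T diag(1/sigma_k^2) (x - y))`, the scaling parameter `delta`, and the variance estimate from all training data are assumptions based on the abstract's description.

```python
import numpy as np

def mahalanobis_kernel(X, Y, inv_cov_diag, delta=1.0):
    """Mahalanobis kernel with a diagonal covariance matrix.

    K(x, y) = exp(-delta * sum_k (x_k - y_k)^2 * inv_cov_diag[k])

    `delta` plays the role of the kernel parameter that the paper
    tunes by line search; the exact scaling convention is an assumption.
    """
    # Pairwise differences: shape (n_X, n_Y, d)
    diff = X[:, None, :] - Y[None, :, :]
    # Squared Mahalanobis distance with diagonal Sigma^{-1}
    d2 = np.einsum('nmd,d->nm', diff ** 2, inv_cov_diag)
    return np.exp(-delta * d2)

# Estimate the diagonal covariance from all the training data,
# as the abstract describes (synthetic data for illustration).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 3)) * np.array([1.0, 5.0, 0.2])
var = X_train.var(axis=0)              # per-feature variance
K = mahalanobis_kernel(X_train, X_train, 1.0 / var, delta=0.5)
```

Because each feature is rescaled by its own variance, the kernel is insensitive to per-feature scaling of the inputs, which is why a single line-searched `delta` can replace the per-problem grid search needed for an RBF kernel's width.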