Extended analyses for an optimal kernel in a class of kernels with an invariant metric

  • Authors:
  • Akira Tanaka, Ichigaku Takigawa, Hideyuki Imai, Mineichi Kudo

  • Affiliations:
  • Division of Computer Science, Hokkaido University, Sapporo, Japan (A. Tanaka, H. Imai, M. Kudo); Creative Research Institution, Hokkaido University, Sapporo, Japan (I. Takigawa)

  • Venue:
  • SSPR'12/SPR'12: Proceedings of the 2012 Joint IAPR International Conference on Structural, Syntactic, and Statistical Pattern Recognition
  • Year:
  • 2012

Abstract

Learning based on kernel machines is widely known as a powerful tool in various fields of information science, such as pattern recognition and regression estimation. Appropriate model selection is required to obtain desirable learning results. In our previous work, we discussed a class of kernels forming a nested class of reproducing kernel Hilbert spaces with an invariant metric, and proved that the kernel corresponding to the smallest reproducing kernel Hilbert space containing the unknown true function gives the best model. In this paper, we relax the invariant-metric condition and show that a similar result holds when a subspace with an invariant metric exists.
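The idea of a nested class of reproducing kernel Hilbert spaces can be illustrated with a simple sketch (this is not the paper's invariant-metric setting, and the polynomial-kernel family, regularization parameter, and data below are assumptions made purely for illustration): the polynomial kernels k_d(x, y) = (1 + xy)^d induce RKHSs that are nested as function spaces, since the RKHS of k_d consists of polynomials of degree at most d. When the true function is a cubic, every RKHS with d >= 3 contains it, and the smallest such space tends to generalize best.

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel(X, Y, d):
    # Polynomial kernel of degree d on 1-D inputs; its RKHS is the
    # space of polynomials of degree <= d, so the spaces nest in d.
    return (1.0 + np.outer(X, Y)) ** d

def krr_fit_predict(X_tr, y_tr, X_te, d, lam=1e-3):
    # Kernel ridge regression: solve (K + lam I) alpha = y,
    # then predict with the cross-kernel matrix.
    K = kernel(X_tr, X_tr, d)
    alpha = np.linalg.solve(K + lam * np.eye(len(X_tr)), y_tr)
    return kernel(X_te, X_tr, d) @ alpha

# True function is a cubic, so it lies in the RKHS of every k_d with
# d >= 3; degrees 1 and 2 are too small to contain it.
f = lambda x: x**3 - 2 * x
X_tr = rng.uniform(-1, 1, 30)
y_tr = f(X_tr) + 0.05 * rng.standard_normal(30)
X_te = np.linspace(-1, 1, 200)

errors = {d: np.mean((krr_fit_predict(X_tr, y_tr, X_te, d) - f(X_te)) ** 2)
          for d in range(1, 8)}
for d, e in errors.items():
    print(d, e)
```

In this toy setup the test error drops sharply once d reaches 3 (the smallest RKHS containing the true function), while degrees 1 and 2 incur a large approximation error; choosing among the degrees is the model-selection problem the abstract refers to.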