Scaling kernels: a new least squares support vector machine kernel for approximation

  • Authors:
  • Mu Xiangyang; Zhang Taiyi; Zhou Yatong

  • Affiliations:
  • Dept. of Information and Commun. Engineering, Xi'an Jiaotong University, Xi'an, China; Dept. of Information and Commun. Engineering, Xi'an Jiaotong University, Xi'an, China; School of Information Engineering, Hebei University of Technology, Tianjin, China

  • Venue:
  • MICAI'07: Proceedings of the 6th Mexican International Conference on Artificial Intelligence, Advances in Artificial Intelligence
  • Year:
  • 2007

Abstract

Support vector machines (SVMs) have been introduced for pattern recognition and regression, but their practical application is limited by high computational cost and the difficulty of choosing a kernel function. Motivated by the theory of multi-scale signal representations and wavelet transforms, this paper presents a way to build a wavelet-based reproducing kernel Hilbert space (RKHS), which is a multiresolution scale subspace, together with its associated scaling kernel for least squares support vector machines (LS-SVM). The scaling kernel is constructed from a scaling function at different dilations and translations. Results on several approximation problems illustrate that the LS-SVM with the scaling kernel can approximate arbitrary signals at multiple scales and achieves better approximation performance.
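
The abstract does not give the kernel's closed form, so the following is only a hedged sketch of the general idea: an LS-SVM regressor whose kernel is a tensor product of a dilated scaling function evaluated on coordinate differences. The choice of scaling function (a linear B-spline "hat" function here), the dilation parameter `a`, and all function names are illustrative assumptions, not the authors' actual construction.

```python
import numpy as np

def scaling_function(u):
    # Linear B-spline (hat) scaling function -- an assumed stand-in for
    # the paper's scaling function; its Fourier transform is nonnegative,
    # so the tensor-product kernel below is positive definite.
    return np.maximum(1.0 - np.abs(u), 0.0)

def scaling_kernel(x, z, a=1.0):
    # Illustrative scaling kernel: K(x, z) = prod_d phi((x_d - z_d) / a),
    # where a controls the dilation (scale) of the scaling function.
    return np.prod(scaling_function((x - z) / a))

def lssvm_fit(X, y, gamma=10.0, a=1.0):
    # Standard LS-SVM dual formulation: solve the linear system
    #   [ 0      1^T          ] [ b     ]   [ 0 ]
    #   [ 1      K + I/gamma  ] [ alpha ] = [ y ]
    n = X.shape[0]
    K = np.array([[scaling_kernel(xi, xj, a) for xj in X] for xi in X])
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_train, alpha, b, X_new, a=1.0):
    # f(x) = sum_i alpha_i K(x, x_i) + b
    return np.array([
        sum(alpha[i] * scaling_kernel(x, X_train[i], a)
            for i in range(len(alpha))) + b
        for x in X_new
    ])

# Usage: approximate a noisy 1-D signal at a chosen scale a.
X = np.linspace(-1, 1, 60).reshape(-1, 1)
y = np.sinc(5 * X[:, 0]) + 0.05 * np.random.randn(60)
alpha, b = lssvm_fit(X, y, gamma=100.0, a=0.2)
y_hat = lssvm_predict(X, alpha, b, X, a=0.2)
```

A multi-scale variant in the spirit of the abstract would combine such kernels at several dilations (e.g. a sum of `scaling_kernel` terms over a set of values of `a`), but the exact combination used by the authors is not specified here.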