Theoretically Optimal Parameter Choices for Support Vector Regression Machines with Noisy Input

  • Authors:
  • Wang Shitong; Zhu Jiagang; F. L. Chung; Lin Qing; Hu Dewen

  • Affiliations:
  • School of Information Engineering, Southern Yangtze University, Wuxi, China; Dept. of Comp. Sci. and Engineering, Nanjing Univ. of Sci. and Tech., Nanjing, China; Dept. of Computing, Hong Kong Polytechnic University, Hong Kong, China; School of Automation, National Defense Univ. of Sci. and Tech., Changsha, China; Lab. of Comp. Sci., Institute of Software, Chinese Academy of Sciences, Changsha, China

  • Venue:
  • Soft Computing - A Fusion of Foundations, Methodologies and Applications
  • Year:
  • 2005


Abstract

Within the evidence framework, this paper interprets the regularized linear regression model as the corresponding MAP (maximum a posteriori) problem, and then derives the general dependency relationships that the optimal parameters of this model should follow under noisy input. The support vector regression machines Huber-SVR and Norm-r SVR (r-SVR) are two typical instances of this model, and particular attention is paid to their optimal parameter choices. It turns out that, in the presence of typical Gaussian input noise, the parameter μ in Huber-SVR depends linearly on the input noise, while the parameter r in r-SVR is inversely proportional to the input noise. These theoretical results should help practitioners apply kernel-based regression techniques effectively in practical applications.
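
Stated compactly, and writing σ for the standard deviation of the Gaussian input noise (a symbol and proportionality form assumed here, since the abstract gives only the qualitative dependencies and no constants), the reported relationships can be sketched as:

```latex
% Sketch of the dependencies stated in the abstract (notation assumed):
%   \sigma -- standard deviation of the Gaussian input noise
%   \mu    -- Huber-SVR parameter
%   r      -- norm parameter of the Norm-r SVR
% The proportionality constants are not specified in the abstract.
\mu_{\mathrm{opt}} \;\propto\; \sigma,
\qquad
r_{\mathrm{opt}} \;\propto\; \frac{1}{\sigma}.
```

In practice, this suggests that an estimate of the input-noise level could guide the choice of μ and r, which is the sense in which the paper's results are intended to support kernel-based regression in applications.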