Multiple-ν support vector regression based on spectral risk measure minimization

  • Authors:
  • Yongqiao Wang; He Ni; Shouyang Wang

  • Affiliations:
  • School of Finance, Zhejiang Gongshang University, Hangzhou, Zhejiang, 310018, China (Y. Wang; H. Ni)
  • Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100080, China (S. Wang)

  • Venue:
  • Neurocomputing
  • Year:
  • 2013


Abstract

Statistical learning theory justifies the ε-insensitive loss in support vector regression, but offers little guidance on choosing the critical hyper-parameter ε. Instead of predefining ε, ν-support vector regression selects ε automatically so that the fraction of deviations larger than ε is asymptotically equal to ν. In stochastic programming terminology, the goal of ν-support vector regression is to minimize the conditional value-at-risk measure of the deviations, i.e. the expectation of the largest ν-fraction of deviations. This paper tackles the determination of the critical hyper-parameter ν in ν-support vector regression when the error term follows a complex distribution. Instead of a single ν, the paper assumes ν to be a combination of multiple, finitely or infinitely many, candidate choices. The cost function then becomes a weighted sum of the component conditional value-at-risk measures associated with these base νs. The paper shows that this cost function can be represented as a spectral risk measure and that its minimization can be reformulated as a linear programming problem. Experiments on three artificial data sets show that this multiple-ν support vector regression has a clear advantage over classical ν-support vector regression when the error terms follow mixed polynomial distributions. Experiments on 10 real-world data sets also demonstrate that the new method outperforms both ε-support vector regression and ν-support vector regression.
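The cost function described in the abstract can be sketched numerically: for a sample of residual deviations, the ν-level conditional value-at-risk is the mean of the largest ν-fraction of absolute deviations, and the multiple-ν objective is a weighted sum of such terms over the candidate νs. The sketch below is a minimal empirical illustration of that construction; the deviations, νs, and weights are hypothetical values, not data from the paper, and the actual method optimizes this risk over regression functions via the LP reformulation.

```python
import numpy as np

def cvar(deviations, nu):
    """Empirical conditional value-at-risk: the mean of the
    largest nu-fraction of absolute deviations."""
    d = np.sort(np.abs(np.asarray(deviations, dtype=float)))[::-1]
    k = max(1, int(np.ceil(nu * d.size)))  # number of tail points
    return d[:k].mean()

def spectral_risk(deviations, nus, weights):
    """Weighted sum of component CVaR measures: the multiple-nu
    cost function, a spectral risk measure of the deviations."""
    return sum(w * cvar(deviations, nu) for nu, w in zip(nus, weights))

# Illustrative deviations and candidate nus (hypothetical).
devs = [1.0, -2.0, 3.0, -4.0, 5.0, -6.0, 7.0, -8.0, 9.0, -10.0]
print(cvar(devs, 0.2))  # mean of the two largest |deviations|: 9.5
print(spectral_risk(devs, [0.2, 0.5], [0.5, 0.5]))  # 0.5*9.5 + 0.5*8.0 = 8.75
```

Classical ν-SVR corresponds to a single ν with weight one; the multiple-ν extension spreads the weight across several base νs, which is what makes the resulting measure spectral.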