Non-Mercer hybrid kernel for linear programming support vector regression in nonlinear systems identification

  • Authors:
  • Zhao Lu; Jing Sun

  • Affiliations:
  • Department of Electrical Engineering, Tuskegee University, Tuskegee, AL 36088, USA; Department of Naval Architecture and Marine Engineering, University of Michigan, Ann Arbor, MI 48109, USA

  • Venue:
  • Applied Soft Computing
  • Year:
  • 2009

Abstract

As a sparse kernel modeling method, support vector regression (SVR) has come to be regarded as a state-of-the-art technique for regression and function approximation. In [V.N. Vapnik, The Nature of Statistical Learning Theory, second ed., Springer-Verlag, 2000], Vapnik developed the ε-insensitive loss function for support vector regression as a trade-off between Huber's robust loss function and one that enables sparsity in the support vectors. The support vector kernel expansion provides a potential avenue for representing nonlinear dynamical systems and underpinning advanced analysis. However, the standard quadratic programming support vector regression (QP-SVR) is often computationally expensive to implement, and sufficient model sparsity cannot be guaranteed. In an attempt to mitigate these drawbacks, this article focuses on the application of soft-constrained linear programming support vector regression (LP-SVR) with a hybrid kernel to nonlinear black-box system identification. An innovative non-Mercer hybrid kernel is explored by leveraging the flexibility of LP-SVR in choosing kernel functions. The simulation results demonstrate the ability to use more general kernel functions and the inherent performance advantage of LP-SVR over QP-SVR in terms of model sparsity and computational efficiency.
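As a concrete illustration of the two ingredients the abstract names, the following sketch shows Vapnik's ε-insensitive loss and one hypothetical non-Mercer hybrid kernel (a weighted mix of an RBF term and a sigmoid term; the sigmoid kernel is not positive semidefinite in general). This is not the authors' implementation, and the kernel weights and parameters here are illustrative assumptions:

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Vapnik's epsilon-insensitive loss: zero inside the eps-tube
    around the target, growing linearly outside it."""
    return np.maximum(0.0, np.abs(y_true - y_pred) - eps)

def hybrid_kernel(x, z, gamma=1.0, weight=0.5):
    """Hypothetical hybrid kernel: weight * RBF + (1 - weight) * sigmoid.
    The sigmoid component violates Mercer's condition in general, which
    LP-SVR can tolerate (it needs no positive semidefinite Gram matrix)
    but standard QP-SVR cannot."""
    rbf = np.exp(-gamma * np.sum((x - z) ** 2))
    sigmoid = np.tanh(np.dot(x, z) - 1.0)
    return weight * rbf + (1.0 - weight) * sigmoid
```

Because LP-SVR minimizes the L1 norm of the expansion coefficients subject to linear constraints, the kernel only enters through its evaluated values, so admissibility does not depend on positive semidefiniteness; this is the flexibility exploited above.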