A set of new Chebyshev kernel functions for support vector machine pattern classification

  • Authors:
  • Sedat Ozer; Chi H. Chen; Hakan A. Cirpan

  • Affiliations:
  • Electrical & Computer Engineering Department, Rutgers University, 96 Frelinghuysen Rd, CAIP, CORE Building, Piscataway, NJ 08854-8018, USA
  • Electrical & Computer Engineering Department, University of Massachusetts Dartmouth, N. Dartmouth, MA 02747-2300, USA
  • Electronics & Communications Engineering Department, Istanbul Technical University, Istanbul 34469, Turkey

  • Venue:
  • Pattern Recognition
  • Year:
  • 2011

Abstract

In this study, we introduce a set of new kernel functions derived from the generalized Chebyshev polynomials. The proposed generalized Chebyshev polynomials allow us to derive different kernel functions. Using these polynomials, we generalize the recently introduced Chebyshev kernel function to vector inputs and, as a result, obtain a robust set of kernel functions for Support Vector Machine (SVM) classification. Thus, besides clarifying how to apply Chebyshev kernel functions to vector inputs, we also increase the generalization capability of the previously proposed Chebyshev kernels and show how to derive new kernel functions from the generalized Chebyshev polynomials. On average over the simulation datasets, the proposed kernel functions perform competitively with the other common kernel functions, indicating that they are a good alternative for SVM classification when better accuracy is sought. Moreover, the test results show that the generalized Chebyshev kernel generally approaches the minimum number of support vectors needed for classification.
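The abstract does not give the explicit form of the generalized Chebyshev kernel, so the sketch below is only illustrative. It builds Chebyshev polynomials T_0..T_n with the standard recurrence, evaluates them element-wise on vector inputs, combines them into a Gram matrix, and plugs that matrix into scikit-learn's SVC as a callable kernel. The specific kernel formula (summed inner products of the polynomial features divided by a square-root decay term) and the names chebyshev_features, chebyshev_kernel, degree, and gamma are assumptions for illustration, not the paper's exact definitions.

```python
import numpy as np
from sklearn.svm import SVC

def chebyshev_features(X, degree=4):
    """Evaluate Chebyshev polynomials T_0..T_degree element-wise on X.

    Inputs are assumed to be scaled into [-1, 1]. Returns an array of
    shape (degree + 1, n_samples, n_features).
    """
    T = np.empty((degree + 1,) + X.shape)
    T[0] = 1.0
    T[1] = X
    for i in range(2, degree + 1):
        # Standard Chebyshev recurrence: T_i(x) = 2x*T_{i-1}(x) - T_{i-2}(x)
        T[i] = 2.0 * X * T[i - 1] - T[i - 2]
    return T

def chebyshev_kernel(X, Z, degree=4, gamma=0.1):
    """Gram matrix of an illustrative vector-input Chebyshev kernel.

    K(x, z) = sum_i T_i(x).T_i(z) / sqrt(d + gamma - x.z), where the sum
    runs over polynomial orders and the dot products over features.
    This is a plausible sketch, not the paper's exact kernel.
    """
    TX = chebyshev_features(X, degree)   # shape (degree+1, n_X, d)
    TZ = chebyshev_features(Z, degree)   # shape (degree+1, n_Z, d)
    numer = np.einsum('ind,imd->nm', TX, TZ)
    # gamma > 0 keeps the square-root argument strictly positive
    # whenever all features lie in [-1, 1].
    denom = np.sqrt(X.shape[1] + gamma - X @ Z.T)
    return numer / denom

# Usage: pass the Gram-matrix function to scikit-learn's SVC as a callable kernel.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = (X[:, 0] * X[:, 1] > 0).astype(int)
clf = SVC(kernel=lambda A, B: chebyshev_kernel(A, B, degree=4))
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
print("number of support vectors:", int(clf.n_support_.sum()))
```

Since Chebyshev polynomials are orthogonal on [-1, 1], features are typically rescaled to that interval before training; the denominator term in the sketch mimics the decaying weight that appears in the scalar Chebyshev kernel, though the paper's generalized form may differ.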