GDTW-P-SVMs: Variable-length time series analysis using support vector machines

  • Authors:
  • Arash Jalalian; Stephan K. Chalup

  • Affiliations:
  • The University of Newcastle, School of Electrical Engineering and Computer Science, Callaghan, NSW 2308, Australia

  • Venue:
  • Neurocomputing
  • Year:
  • 2013


Abstract

We describe a new technique for sequential data analysis, called GDTW-P-SVMs. It is a maximum margin method for constructing classifiers from variable-length input series. It employs potential support vector machines (P-SVMs) and Gaussian Dynamic Time Warping (GDTW) to remove the fixed-length restriction on feature vectors in training and test data. As a result, GDTW-P-SVMs inherit the properties of the P-SVM method, such as the ability to: (i) handle data and kernel matrices that are neither positive definite nor square and (ii) minimise a scale-invariant capacity measure. The new technique extends the P-SVM kernel functions by using the well-known dynamic time warping algorithm to provide an elastic distance measure for the kernel functions. Benchmarks for classification are performed with several real-world data sets from the UCR time series classification/clustering page, the GeoLife trajectory data set, and the UCI Machine Learning Repository. The data sets include both variable- and fixed-length input series. The results show that the new method performs significantly better than the benchmarked standard classification methods.
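
To illustrate the kernel construction the abstract describes, the sketch below builds a Gaussian kernel whose Euclidean distance is replaced by a DTW distance, so series of different lengths can be compared directly. This is not the authors' P-SVM implementation: as a stand-in it plugs the precomputed GDTW kernel matrix into scikit-learn's standard SVC, and the helper names (dtw_distance, gdtw_kernel), the sigma bandwidth, and the toy data are illustrative assumptions. Note that such a Gaussian DTW kernel is in general not positive definite, which is exactly why the P-SVM's tolerance of non-PSD kernel matrices matters in the paper.

```python
# Minimal sketch of a Gaussian DTW (GDTW) kernel for variable-length series.
# NOT the paper's P-SVM: scikit-learn's SVC with a precomputed kernel is used
# here only as an illustrative stand-in classifier.
import numpy as np
from sklearn.svm import SVC


def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D series."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]


def gdtw_kernel(series_a, series_b, sigma=1.0):
    """Kernel matrix K[i, j] = exp(-DTW(x_i, y_j)^2 / (2 * sigma^2))."""
    K = np.empty((len(series_a), len(series_b)))
    for i, x in enumerate(series_a):
        for j, y in enumerate(series_b):
            K[i, j] = np.exp(-dtw_distance(x, y) ** 2 / (2.0 * sigma ** 2))
    return K


# Toy variable-length training data (illustrative, not from the paper).
X_train = [np.array([0.0, 0.1, 0.0]),
           np.array([0.1, 0.0, 0.1, 0.0]),
           np.array([1.0, 1.2, 1.1, 1.0, 1.1]),
           np.array([1.1, 1.0, 1.2])]
y_train = [0, 0, 1, 1]
X_test = [np.array([0.0, 0.05, 0.1]),
          np.array([1.0, 1.1, 1.0, 1.2])]

clf = SVC(kernel="precomputed")
clf.fit(gdtw_kernel(X_train, X_train), y_train)
print(clf.predict(gdtw_kernel(X_test, X_train)))  # expected: [0 1]
```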