This paper investigates a generic method for time series classification that is invariant to transformations of the time axis. State-of-the-art methods widely combine Dynamic Time Warping (DTW) with a one-nearest-neighbor (1NN) classifier. We instead use DTW to warp the time axis of each signal so as to decrease the Euclidean distance between signals of the same class. We analyze the predictive accuracy of an algorithm that learns from a heterogeneous set of features extracted from the signals: feature selection filters out irrelevant predictors, and a serial ensemble of decision trees performs the classification. We simulate a dataset to provide better insight into the algorithm, and we compare our method against DTW+1NN on several publicly available datasets.
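The core idea of warping each signal's time axis with DTW so that same-class signals move closer in Euclidean distance can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function names (`dtw_path`, `warp_to`) and the averaging of multiply-aligned points are our own assumptions.

```python
def dtw_path(a, b):
    """Compute the optimal DTW alignment path between sequences a and b
    by dynamic programming, then recover it by backtracking."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    # Backtrack from (n, m) to the origin to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        _, i, j = min((cost[i - 1][j - 1], i - 1, j - 1),
                      (cost[i - 1][j], i - 1, j),
                      (cost[i][j - 1], i, j - 1))
    return path[::-1]

def warp_to(a, b):
    """Resample a along the DTW path so it is aligned index-by-index
    with b; points of a mapped to the same index of b are averaged."""
    sums = [0.0] * len(b)
    counts = [0] * len(b)
    for i, j in dtw_path(a, b):
        sums[j] += a[i]
        counts[j] += 1
    return [s / c for s, c in zip(sums, counts)]

def euclid(a, b):
    """Plain Euclidean distance between equal-length sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# A signal and a time-shifted copy of it: after warping, their
# Euclidean distance drops, which is what makes fixed-length
# feature extraction on the aligned signals meaningful.
b = [0, 1, 2, 3, 2, 1, 0]
a = [0, 0, 1, 2, 3, 2, 1]  # same shape, delayed by one step
print(euclid(a, b), euclid(warp_to(a, b), b))
```

After alignment the residual distance comes only from the parts of the signals that genuinely differ, not from the time shift, so Euclidean-based features extracted from the warped signals become comparable across a class.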