Recently, SVMs have been widely applied to regression estimation, but existing algorithms leave the choice of kernel type and kernel parameters to the user. This is a main cause of degraded regression performance, especially on complicated data such as nonlinear and non-stationary data. By introducing the empirical mode decomposition (EMD) method, with which any complicated data set can be decomposed into a finite and often small number of intrinsic mode functions (IMFs) based on the local characteristic time scale of the data, this paper proposes an important extension of the SVM method: a multi-scale support vector machine based on EMD, in which several kernels of different scales are used simultaneously to approximate the target function at different scales. Experimental results demonstrate the effectiveness of the proposed method.
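The sketch below is a minimal illustration of the idea, not the authors' implementation: it decomposes a signal into IMFs with EMD, fits a separate RBF support vector regressor per IMF with a kernel width matched to that IMF's scale, and sums the per-scale fits. The paper instead combines kernels of different scales within a single SVM; the per-IMF variant, the PyEMD and scikit-learn packages, and the gamma heuristic are all assumptions made for illustration.

```python
# Illustrative multi-scale regression sketch (assumed setup, not the paper's exact method):
# decompose with EMD, fit one RBF-kernel SVR per intrinsic mode function, sum the fits.
import numpy as np
from PyEMD import EMD          # assumed: PyEMD package for empirical mode decomposition
from sklearn.svm import SVR    # assumed: scikit-learn support vector regression

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512)
# Synthetic non-stationary-looking test signal: slow + fast oscillation plus noise.
signal = (np.sin(2 * np.pi * 5 * t)
          + 0.5 * np.sin(2 * np.pi * 40 * t)
          + 0.1 * rng.standard_normal(t.size))

imfs = EMD()(signal)           # rows are IMFs, ordered roughly from fine to coarse scale

X = t.reshape(-1, 1)
reconstruction = np.zeros_like(signal)
for k, imf in enumerate(imfs):
    # Coarser IMFs get a wider kernel (smaller gamma); the 4**k schedule is a hypothetical choice.
    gamma = 100.0 / (4 ** k)
    model = SVR(kernel="rbf", C=10.0, epsilon=0.01, gamma=gamma)
    model.fit(X, imf)          # approximate each scale component separately
    reconstruction += model.predict(X)

print("RMSE of multi-scale fit:", np.sqrt(np.mean((reconstruction - signal) ** 2)))
```

In this toy setting, matching the kernel width to each IMF's characteristic scale lets fine-scale components be fit with narrow kernels and coarse trends with wide ones, which is the intuition behind using several kernel scales rather than a single user-chosen kernel.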