PDA-SVM Hybrid: A Unified Model for Kernel-Based Supervised Classification
Journal of Signal Processing Systems
In the kernel approach, any set of N vectorial or non-vectorial data samples can be converted into N vectors of feature dimension N. The promise of the kernel approach hinges on this representation vector space, which leads to a "cornerized" data structure. Furthermore, a nonsingular kernel matrix essentially assures theoretical linear separability, which is critical to supervised learning. The main results are twofold. For unsupervised clustering, the kernel approach allows dimension reduction in the spectral space and, moreover, admits a simple error analysis for the fast kernel K-means. For supervised classification, by imposing uncorrelated perturbation on the training vectors in the spectral space, a perturbed (Fisher) discriminant analysis (PDA) is proposed. This ultimately leads to a hybrid classifier that includes PDA and SVM as special cases, thus offering more flexibility for improving prediction performance.
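The spectral-space clustering described above can be sketched as follows: build an N-by-N kernel matrix, eigendecompose it to obtain the spectral representation, truncate to the top m components (the dimension reduction step), and run K-means on the reduced vectors. This is a minimal illustrative sketch, not the paper's implementation; the function names (`rbf_kernel`, `spectral_embedding`, `kmeans`), the RBF kernel choice, and the farthest-point initialization are assumptions made here for a self-contained, deterministic example.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # K[i, j] = exp(-gamma * ||x_i - x_j||^2); an assumed kernel choice.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.clip(d2, 0.0, None))

def spectral_embedding(K, m):
    # Top-m eigenpairs of the (PSD) kernel matrix give an m-dimensional
    # spectral representation of the N samples: rows of V * sqrt(w).
    w, V = np.linalg.eigh(K)
    idx = np.argsort(w)[::-1][:m]
    return V[:, idx] * np.sqrt(np.clip(w[idx], 0.0, None))

def kmeans(Z, k, iters=50):
    # Plain Lloyd iterations; farthest-point init keeps the sketch
    # deterministic (an implementation convenience, not from the paper).
    centers = [Z[0]]
    for _ in range(k - 1):
        d = np.min([((Z - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(Z[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((Z[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = Z[labels == j].mean(axis=0)
    return labels

# Usage: two well-separated Gaussian blobs (synthetic data for illustration).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(3.0, 0.3, (20, 2))])
Z = spectral_embedding(rbf_kernel(X, gamma=0.5), m=2)
labels = kmeans(Z, k=2)
```

Here the embedding dimension m is chosen far smaller than N, which is where the claimed dimension reduction (and the resulting speed of the "fast" kernel K-means) comes from.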