A key idea of nonlinear Support Vector Machines (SVMs) is to map the inputs nonlinearly into a high-dimensional feature space, while Mercer's condition is applied in order to avoid an explicit expression for the nonlinear mapping. In SVMs for nonlinear classification, a large-margin classifier is constructed in the feature space; for regression, a linear regressor is constructed in the feature space. Other kernel extensions of linear algorithms have been proposed, such as kernel Principal Component Analysis (PCA) and kernel Fisher Discriminant Analysis. In this paper, we discuss the extension of linear Canonical Correlation Analysis (CCA) to kernel CCA by applying Mercer's condition. We also discuss links with single-output Least Squares SVM (LS-SVM) regression and classification.
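To illustrate the kernel CCA idea, the following is a minimal NumPy sketch of a regularized kernel CCA, not the paper's exact formulation: the RBF kernel choice, the ridge constant `kappa`, and the particular eigenproblem form are assumptions made for the example. It works only with Gram matrices, so the nonlinear feature map never appears explicitly, which is exactly the role Mercer's condition plays.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def center(K):
    # Center the Gram matrix in feature space: H K H with H = I - 11'/n
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_cca(X, Y, gamma=1.0, kappa=1.0):
    """Leading canonical correlation between nonlinear features of X and Y,
    computed from (centered, ridge-regularized) Gram matrices only."""
    n = X.shape[0]
    Kx = center(rbf_kernel(X, X, gamma))
    Ky = center(rbf_kernel(Y, Y, gamma))
    I = np.eye(n)
    # Eigenvalues of (Kx + kI)^-1 Ky (Ky + kI)^-1 Kx give rho^2;
    # without the kappa ridge this problem is ill-posed (Kx, Ky are singular).
    M = np.linalg.solve(Kx + kappa * I, Ky) @ np.linalg.solve(Ky + kappa * I, Kx)
    rho2 = np.max(np.linalg.eigvals(M).real).clip(0.0, 1.0)
    return float(np.sqrt(rho2))
```

The ridge term plays the same regularizing role that appears in LS-SVM formulations; without it, the sample Gram matrices are rank-deficient after centering and the canonical correlations degenerate.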