Kernel Canonical Correlation Analysis and Least Squares Support Vector Machines

  • Authors:
  • Tony Van Gestel; Johan A. K. Suykens; Jos De Brabanter; Bart De Moor; Joos Vandewalle


  • Venue:
  • ICANN '01 Proceedings of the International Conference on Artificial Neural Networks
  • Year:
  • 2001


Abstract

A key idea of nonlinear Support Vector Machines (SVMs) is to map the inputs nonlinearly into a high-dimensional feature space, where Mercer's condition is applied so that an explicit expression for the nonlinear mapping is never needed. In SVMs for nonlinear classification, a large-margin classifier is constructed in the feature space; for regression, a linear regressor is constructed in the feature space. Other kernel extensions of linear algorithms have been proposed, such as kernel Principal Component Analysis (PCA) and kernel Fisher Discriminant Analysis. In this paper, we discuss the extension of linear Canonical Correlation Analysis (CCA) to a kernel CCA by applying the Mercer condition. We also discuss links with single-output Least Squares SVM (LS-SVM) regression and classification.
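The kernelization described in the abstract can be illustrated with a minimal regularized kernel CCA sketch. This is a generic textbook-style formulation, not the paper's exact algorithm: two views are lifted via an RBF kernel (one common choice of Mercer kernel; the kernel, `gamma`, and `reg` values here are illustrative assumptions), the Gram matrices are centered, and the canonical correlation is found from a generalized symmetric eigenproblem in the dual variables.

```python
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B;
    # any Mercer kernel could be substituted here.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def center_gram(K):
    # Double-center the Gram matrix: removes the feature-space mean
    # without ever forming the nonlinear mapping explicitly.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_cca(X, Y, gamma=1.0, reg=1e-3):
    """First canonical correlation between views X and Y in feature space.

    Solves the regularized dual problem
        [0      Kx Ky] [a]         [Kx Kx + reg*I   0            ] [a]
        [Ky Kx  0    ] [b] = rho * [0               Ky Ky + reg*I] [b]
    (a generic regularized kernel-CCA formulation, assumed for illustration).
    """
    n = X.shape[0]
    Kx = center_gram(rbf_kernel(X, X, gamma))
    Ky = center_gram(rbf_kernel(Y, Y, gamma))
    Z = np.zeros((n, n))
    A = np.block([[Z, Kx @ Ky], [Ky @ Kx, Z]])
    B = np.block([[Kx @ Kx + reg * np.eye(n), Z],
                  [Z, Ky @ Ky + reg * np.eye(n)]])
    vals, vecs = eigh(A, B)          # eigenvalues in ascending order
    rho = vals[-1]                   # largest = first canonical correlation
    alpha, beta = vecs[:n, -1], vecs[n:, -1]
    return rho, alpha, beta
```

Because the problem is expressed entirely in Gram matrices `Kx`, `Ky`, only kernel evaluations are needed; the feature map itself stays implicit, exactly the mechanism the abstract attributes to Mercer's condition.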
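For the LS-SVM classifier the abstract links to, a minimal sketch may also help: in the standard LS-SVM formulation the dual problem reduces to a single linear system rather than a quadratic program. The kernel choice and hyperparameters below (`C`, `gamma`) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    # RBF kernel matrix between the rows of A and B (illustrative choice).
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def lssvm_train(X, y, C=10.0, gamma=0.5):
    """Train an LS-SVM classifier with labels y in {-1, +1}.

    The dual conditions give one linear system:
        [0    y^T          ] [b]      [0]
        [y    Omega + I / C] [alpha] = [1]
    with Omega_ij = y_i y_j K(x_i, x_j).
    """
    n = len(y)
    Omega = np.outer(y, y) * rbf(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / C
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]           # alpha, bias b

def lssvm_predict(Xtr, ytr, alpha, b, Xte, gamma=0.5):
    # Decision function: sign(sum_i alpha_i y_i K(x, x_i) + b)
    return np.sign(rbf(Xte, Xtr, gamma) @ (alpha * ytr) + b)
```

Replacing the inequality constraints of a standard SVM by equality constraints is what turns training into this single solve; that equality-constrained, least-squares structure is also what makes the connection to (kernel) CCA natural.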