A support vector machine formulation to PCA analysis and its kernel version

  • Authors:
  • J.A.K. Suykens; T. Van Gestel; J. Vandewalle; B. De Moor

  • Affiliations:
  • Dept. of Electr. Eng., Katholieke Univ. Leuven, Heverlee, Belgium

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2003

Abstract

In this paper, we present a simple and straightforward primal-dual support vector machine formulation of the problem of principal component analysis (PCA) in dual variables. By considering a mapping to a high-dimensional feature space and applying the kernel trick (Mercer theorem), kernel PCA is obtained as introduced by Schölkopf et al. (1998). While least squares support vector machine classifiers have a natural link with kernel Fisher discriminant analysis (minimizing the within-class scatter around targets +1 and -1), for PCA one can take the interpretation of a one-class modeling problem with a zero target value around which the variance is maximized. The score variables are interpreted as error variables within the problem formulation. In this way, primal-dual constrained optimization interpretations of linear and kernel PCA are obtained in a style similar to that of least squares support vector machine classifiers.
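To make the dual-variable view described in the abstract concrete, the sketch below shows kernel PCA computed from the eigendecomposition of the centered Gram matrix, with the projections of the training points playing the role of the score (error) variables. This is a minimal illustration, not the authors' code: the RBF kernel, the bandwidth `sigma`, and all function and variable names are assumptions chosen for the example, and only NumPy is used.

```python
# Minimal sketch of dual (kernel) PCA, assuming an RBF kernel.
# Names such as rbf_kernel and kernel_pca_scores are illustrative only.
import numpy as np


def rbf_kernel(X, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))


def kernel_pca_scores(X, n_components=2, sigma=1.0):
    """Kernel PCA in the dual: eigendecompose the centered Gram matrix.

    The returned projections correspond to the score variables that the
    abstract interprets as error variables around a zero target.
    """
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    # Centering in feature space: K_c = (I - 1/n) K (I - 1/n)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    # Dual eigenvalue problem: K_c alpha = lambda * alpha
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    lam, alpha = eigvals[idx], eigvecs[:, idx]
    # Scale eigenvectors so the principal directions have unit norm
    # in feature space (alpha_k -> alpha_k / sqrt(lambda_k)).
    alpha = alpha / np.sqrt(np.maximum(lam, 1e-12))
    # Score variables (projections) for the training points
    return Kc @ alpha


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    Z = kernel_pca_scores(X, n_components=2, sigma=2.0)
    print(Z.shape)  # (100, 2)
```

With a linear kernel this reduces to ordinary PCA on the centered data, which is one way to read the paper's claim that linear and kernel PCA admit the same primal-dual interpretation.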