A fast incremental kernel principal component analysis for online feature extraction

  • Authors:
  • Seiichi Ozawa; Yohei Takeuchi; Shigeo Abe

  • Affiliations:
  • Graduate School of Engineering, Kobe University, Kobe, Japan (all authors)

  • Venue:
  • PRICAI'10: Proceedings of the 11th Pacific Rim International Conference on Trends in Artificial Intelligence
  • Year:
  • 2010


Abstract

In this paper, we present a modified version of Incremental Kernel Principal Component Analysis (IKPCA), which was originally proposed by Takeuchi et al. as an online nonlinear feature extraction method. The proposed IKPCA learns a high-dimensional feature space incrementally by solving an eigenvalue problem whose matrix size is determined by the number of independent data. In the proposed IKPCA, independent data are used for calculating eigenvectors in a feature space, but they are selected in a low-dimensional eigen-feature space. Hence, the size of the eigenvalue problem is usually small, and this allows IKPCA to learn eigen-feature spaces very quickly even though the eigenvalue decomposition has to be carried out at every learning stage. The proposed IKPCA consists of two learning phases: an initial learning phase and an incremental learning phase. In the former, some parameters are optimized and an initial eigen-feature space is computed by applying conventional KPCA. In the latter, the eigen-feature space is incrementally updated whenever a new data point is given. In the experiments, we evaluate the learning time and the approximation accuracy of the eigenvectors and eigenvalues. The experimental results demonstrate that the proposed IKPCA learns eigen-feature spaces very quickly with good approximation accuracy.
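The initial learning phase described above amounts to conventional batch KPCA: build a kernel matrix on the initial data, center it in feature space, and solve the resulting eigenvalue problem. The following is a minimal sketch of that phase only (the incremental update and the independent-data selection are not shown); the RBF kernel choice, the `gamma` parameter, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y
    # (an assumed kernel choice for illustration).
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def kpca_initial(X, n_components=2, gamma=1.0):
    """Initial learning phase: conventional (batch) KPCA on the initial data."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # Center the kernel matrix, i.e. center the data in feature space.
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Solve the eigenvalue problem and keep the leading components.
    vals, vecs = np.linalg.eigh(Kc)
    order = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[order], vecs[:, order]
    # Scale the coefficient vectors so the feature-space eigenvectors
    # have unit norm (alpha_k / sqrt(lambda_k)).
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    return alphas, vals

# Toy usage: an initial eigen-feature space from 50 random 3-D points.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
alphas, vals = kpca_initial(X, n_components=2)
```

In the full method, the incremental phase would then update `alphas` and `vals` for each new data point instead of recomputing the decomposition on all data seen so far.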