Incremental nonlinear PCA for classification

  • Authors:
  • Byung Joo Kim; Il Kon Kim

  • Affiliations:
  • Youngsan University School of Network and Information Engineering, Korea; Kyungpook National University Department of Computer Science, Korea

  • Venue:
  • PKDD '04: Proceedings of the 8th European Conference on Principles and Practice of Knowledge Discovery in Databases

  • Year:
  • 2004

Abstract

The purpose of this study is to propose a new online nonlinear PCA (OL-NPCA) method for feature extraction from incremental data. Kernel PCA (KPCA) is widely used for nonlinear feature extraction; however, it suffers from two problems. First, applying KPCA requires storing and finding the eigenvectors of an N x N kernel matrix, which is infeasible for a large number of data points N. Second, updating the eigenvectors with an additional datum requires recomputing the whole eigenspace. OL-NPCA overcomes these problems with an incremental eigenspace update method combined with a feature mapping function. Experimental results from applying OL-NPCA to a toy problem and a large data problem show the following advantages. First, OL-NPCA requires less memory than KPCA. Second, OL-NPCA is comparable in performance to KPCA; furthermore, its performance can easily be improved by re-learning the data. For classification, the extracted features are used as input to a least squares support vector machine. Our experiments show that the proposed feature extraction method is comparable in performance to KPCA, and that the proposed classification system achieves high classification accuracy on UCI benchmark data and the NIST handwritten digit data set.
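To make the memory problem the abstract refers to concrete, the following is a minimal sketch of standard batch KPCA with an RBF kernel (not the authors' OL-NPCA algorithm). It materializes and eigendecomposes the full N x N kernel matrix, which is exactly the step that scales poorly with N and that an incremental eigenspace update avoids; the function name, kernel choice, and parameters are illustrative assumptions.

```python
import numpy as np

def batch_kernel_pca(X, n_components, gamma=1.0):
    """Batch KPCA with an RBF kernel (illustrative, not OL-NPCA).

    Storing K and calling eigh below cost O(N^2) memory and
    roughly O(N^3) time -- the bottleneck the abstract criticizes.
    """
    N = X.shape[0]
    # Full N x N RBF (Gaussian) kernel matrix via pairwise squared distances.
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    # Center the kernel matrix in feature space.
    one_n = np.ones((N, N)) / N
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition of the centered kernel matrix.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    # Keep the top components (eigh returns eigenvalues in ascending order).
    idx = np.argsort(eigvals)[::-1][:n_components]
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    # Project the training data onto the nonlinear principal components.
    return Kc @ alphas

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Z = batch_kernel_pca(X, n_components=3)
print(Z.shape)  # (200, 3)
```

Adding one more point to the training set would change K and force a full recomputation here, which is the second problem OL-NPCA's incremental eigenspace update addresses.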