Nyström approximations for scalable face recognition: a comparative study

  • Authors:
  • Jeong-Min Yun, Seungjin Choi

  • Affiliations:
  • Department of Computer Science, Pohang University of Science and Technology, Nam-gu, Pohang, Korea (both authors)

  • Venue:
  • ICONIP'11: Proceedings of the 18th International Conference on Neural Information Processing - Volume Part II
  • Year:
  • 2011


Abstract

Kernel principal component analysis (KPCA) is a widely used statistical method for representation learning, in which PCA is performed in a reproducing kernel Hilbert space (RKHS) to extract nonlinear features from a set of training examples. Despite its success in various applications, including face recognition, KPCA does not scale well with the sample size: as in other kernel methods, it requires the eigen-decomposition of an n × n Gram matrix, which takes ${\mathcal{O}}(n^3)$ time. The Nyström method is an approximation technique in which only a subset of size m ≪ n is used to approximate the eigenvectors of the n × n Gram matrix. In this paper we consider the Nyström method and a few of its modifications, such as 'Nyström KPCA ensemble' and 'Nyström + randomized SVD', to improve the scalability of KPCA. We compare the performance of these methods on the task of learning face descriptors for face recognition.
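The abstract does not include code, but the basic Nyström eigenvector approximation it describes can be sketched as follows. This is an illustrative NumPy sketch, not the authors' implementation: the RBF kernel, the `gamma` value, and uniform sampling of the m landmarks are all assumptions made here for concreteness.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y
    # (kernel choice and gamma are illustrative assumptions)
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

def nystrom_eig(X, m, rng):
    """Approximate the top eigenpairs of the n x n Gram matrix
    using only an m x m eigendecomposition (standard Nystrom method)."""
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)   # landmark subset, m << n
    C = rbf_kernel(X, X[idx])                    # n x m cross-kernel block
    W = C[idx]                                   # m x m landmark kernel
    lam, U = np.linalg.eigh(W)                   # O(m^3) instead of O(n^3)
    lam, U = lam[::-1], U[:, ::-1]               # sort descending
    keep = lam > 1e-10                           # drop numerically zero modes
    lam, U = lam[keep], U[:, keep]
    # Nystrom extension: eigenvalues scale by n/m; approximate
    # eigenvectors are C @ U / lam, rescaled by sqrt(m/n)
    lam_approx = (n / m) * lam
    U_approx = np.sqrt(m / n) * (C @ U) / lam
    return lam_approx, U_approx
```

With m = n the landmark set covers all points and the approximation recovers the exact eigendecomposition; the interesting regime is m ≪ n, where the cost drops from ${\mathcal{O}}(n^3)$ to roughly ${\mathcal{O}}(nm^2 + m^3)$ at the price of approximation error that depends on how well the landmarks summarize the data.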