Kernel principal component analysis (KPCA) is a widely used statistical method for representation learning, in which PCA is performed in a reproducing kernel Hilbert space (RKHS) to extract nonlinear features from a set of training examples. Despite its success in various applications, including face recognition, KPCA does not scale well with the sample size: as in other kernel methods, it requires the eigendecomposition of an n × n Gram matrix, which takes ${\mathcal{O}}(n^3)$ time. The Nyström method is an approximation technique in which only a subset of size m ≪ n is used to approximate the eigenvectors of the n × n Gram matrix. In this paper we consider the Nyström method and several of its modifications, such as 'Nyström KPCA ensemble' and 'Nyström + randomized SVD', to improve the scalability of KPCA. We compare the performance of these methods on the task of learning face descriptors for face recognition.
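To make the idea concrete, here is a minimal sketch of the standard Nyström approximation to KPCA, not the paper's exact algorithm: sample m landmark points, eigendecompose the small m × m kernel matrix among the landmarks (${\mathcal{O}}(m^3)$ instead of ${\mathcal{O}}(n^3)$), and extend its eigenvectors to all n points via the n × m cross-kernel matrix. The RBF kernel, `gamma`, and the uniform landmark sampling are illustrative assumptions, and feature-space centering is omitted for brevity.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Pairwise RBF kernel between rows of X and rows of Y (illustrative choice).
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

def nystrom_kpca(X, m, n_components, gamma=0.5, seed=0):
    """Approximate the top KPCA eigenpairs using m << n landmarks (Nystrom)."""
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    idx = rng.choice(n, size=m, replace=False)   # uniform landmark sampling (assumed)
    landmarks = X[idx]
    W = rbf_kernel(landmarks, landmarks, gamma)  # m x m kernel among landmarks
    C = rbf_kernel(X, landmarks, gamma)          # n x m cross-kernel matrix
    # Eigendecompose only the small m x m matrix: O(m^3) instead of O(n^3).
    evals, evecs = np.linalg.eigh(W)
    # eigh returns ascending order; keep the top components.
    evals = evals[::-1][:n_components]
    evecs = evecs[:, ::-1][:, :n_components]
    # Nystrom extension: approximate eigenvectors of the full n x n Gram matrix.
    U = np.sqrt(m / n) * (C @ evecs) / evals     # n x k approximate eigenvectors
    lam = (n / m) * evals                        # approximate eigenvalues
    return U, lam

# Toy usage on synthetic data: n = 500 samples, m = 50 landmarks.
X = np.random.default_rng(1).normal(size=(500, 10))
U, lam = nystrom_kpca(X, m=50, n_components=5)
```

Projecting the data onto the columns of `U` then yields approximate nonlinear (KPCA) features at a fraction of the cost of the exact eigendecomposition.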