An efficient algorithm for Kernel two-dimensional principal component analysis

  • Authors:
  • Ning Sun;Hai-xian Wang;Zhen-hai Ji;Cai-rong Zou;Li Zhao

  • Affiliations:
  • Ning Sun, Zhen-hai Ji, Li Zhao: Research Center of Learning Science and Department of Radio Engineering, Southeast University, Nanjing 210096, China
  • Hai-xian Wang: Research Center of Learning Science, Southeast University, Nanjing 210096, China
  • Cai-rong Zou: Department of Radio Engineering, Southeast University, Nanjing 210096, China

  • Venue:
  • Neural Computing and Applications

  • Year:
  • 2007

Abstract

Recently, a new approach called two-dimensional principal component analysis (2DPCA) has been proposed for face representation and recognition. The essence of 2DPCA is that it computes the eigenvectors of the so-called image covariance matrix without any matrix-to-vector conversion. Kernel principal component analysis (KPCA) is a non-linear generalization of the popular principal component analysis via the kernel trick. Similarly, kernelizing 2DPCA can help capture non-linear structures in the input data. However, standard kernel 2DPCA (K2DPCA) suffers from a heavy computational burden because it operates on the image matrices directly. In this paper, we propose an efficient algorithm to speed up the training procedure of K2DPCA. The results of experiments on face recognition show that the proposed algorithm achieves much higher computational efficiency and remarkably reduces memory consumption compared with standard K2DPCA.
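
For reference, the linear 2DPCA baseline that the abstract builds on can be sketched in a few lines of NumPy. The sketch below is illustrative only and is not the paper's accelerated K2DPCA training algorithm; the function name two_d_pca, the use of numpy.einsum, and the synthetic data in the usage lines are our own assumptions. It forms the width-by-width image covariance matrix directly from the image matrices, with no matrix-to-vector conversion, and projects each image's rows onto the leading eigenvectors.

    import numpy as np

    def two_d_pca(images, num_components):
        # images: (n_samples, height, width) stack of image matrices.
        mean_image = images.mean(axis=0)
        centered = images - mean_image
        # Image covariance matrix G (width x width):
        #   G = (1/n) * sum_i (A_i - mean)^T (A_i - mean),
        # built straight from the matrices, with no matrix-to-vector conversion.
        g = np.einsum('nhw,nhv->wv', centered, centered) / len(images)
        # eigh returns eigenvalues in ascending order; keep the largest ones.
        eigvals, eigvecs = np.linalg.eigh(g)
        projection = eigvecs[:, ::-1][:, :num_components]   # (width, d)
        # Each image maps to a (height, d) feature matrix.
        return centered @ projection, projection, mean_image

    # Hypothetical usage with synthetic stand-ins for face images:
    rng = np.random.default_rng(0)
    faces = rng.standard_normal((100, 32, 32))
    features, projection, mean_image = two_d_pca(faces, num_components=5)
    print(features.shape)   # (100, 32, 5)

As the abstract suggests, standard K2DPCA applies the kernel map to the image rows themselves, so the resulting kernel matrix scales with the total number of training rows rather than with the number of images; that growth in computation and memory is the bottleneck the proposed training algorithm targets.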