Principal component analysis based on non-parametric maximum entropy

  • Authors:
  • Ran He; Baogang Hu; Xiao-Tong Yuan; Wei-Shi Zheng

  • Affiliations:
  • School of Electronic and Information Engineering, Dalian University of Technology, Dalian 116024, People's Republic of China; National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, People's Republic of China; Department of Electrical and Computer Engineering, National University of Singapore, Singapore; Department of Computer Science, Queen Mary University of London, London, UK

  • Venue:
  • Neurocomputing
  • Year:
  • 2010

Abstract

In this paper, we propose an improved principal component analysis based on maximum entropy (MaxEnt) preservation, called MaxEnt-PCA, which is derived from a Parzen window estimate of Rényi's quadratic entropy. Instead of minimizing a reconstruction error based on either the L2-norm or the L1-norm, MaxEnt-PCA attempts to preserve as much of the data's uncertainty, measured by entropy, as possible. The optimal solution of MaxEnt-PCA consists of the eigenvectors of a Laplacian probability matrix corresponding to the MaxEnt distribution. MaxEnt-PCA (1) is rotation invariant, (2) is free from any distribution assumption, and (3) is robust to outliers. Extensive experiments on real-world datasets demonstrate the effectiveness of the proposed linear method compared with other related robust PCA methods.
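
The abstract outlines the core computation: a Parzen (Gaussian) window estimate of Rényi's quadratic entropy of the projected data is maximized over an orthonormal projection, and the stationary condition reduces to an eigenproblem involving a Laplacian built from the kernel affinities. Below is a minimal sketch of one such fixed-point iteration in Python/NumPy; the function name maxent_pca, the bandwidth sigma, the iteration count, and the specific update rule are illustrative assumptions, not the authors' published algorithm.

    import numpy as np

    def maxent_pca(X, n_components=2, sigma=1.0, n_iter=20, seed=0):
        """Hypothetical sketch of a MaxEnt-PCA-style fixed-point iteration.

        X: (n_samples, n_features) data matrix.
        The projection W is updated so that, at a stationary point, it spans
        the top eigenvectors of Xc.T @ L @ Xc, where L is the graph Laplacian
        of Gaussian (Parzen window) affinities computed in the projected space.
        """
        rng = np.random.default_rng(seed)
        Xc = X - X.mean(axis=0)                      # center the data
        n, d = Xc.shape
        W, _ = np.linalg.qr(rng.standard_normal((d, n_components)))  # random orthonormal start

        for _ in range(n_iter):
            Y = Xc @ W                               # current low-dimensional projection
            sq = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
            G = np.exp(-sq / (2.0 * sigma ** 2))     # Parzen (Gaussian) affinities
            L = np.diag(G.sum(axis=1)) - G           # Laplacian of the affinity matrix
            S = Xc.T @ L @ Xc                        # entropy-weighted scatter matrix
            vals, vecs = np.linalg.eigh(S)           # eigenvalues in ascending order
            W = vecs[:, -n_components:]              # keep the top eigenvectors
        return W

    # usage sketch: W = maxent_pca(np.random.randn(200, 10), n_components=2)

The eigendecomposition step mirrors the abstract's claim that the solution consists of eigenvectors of a Laplacian probability matrix; the alternation between re-estimating affinities and re-solving the eigenproblem is one plausible way to reach such a fixed point.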