Discriminant subspace learning constrained by locally statistical uncorrelation for face recognition

  • Authors:
  • Yu Chen; Wei-Shi Zheng; Xiao-Hong Xu; Jian-Huang Lai

  • Affiliations:
  • Department of Applied Mathematics, South China Agricultural University, Guangzhou, Guangdong, 510642, China and School of Mathematics and Computational Science, Sun Yat-sen University, Guangzhou, G ...
  • School of Information Science and Technology, Sun Yat-sen University, Guangzhou, Guangdong, 510275, China and Guangdong Province Key Laboratory of Computational Science, China
  • Department of Applied Mathematics, South China Agricultural University, Guangzhou, Guangdong, 510642, China
  • School of Information Science and Technology, Sun Yat-sen University, Guangzhou, Guangdong, 510275, China

  • Venue:
  • Neural Networks
  • Year:
  • 2013


Abstract

High data dimensionality and the small sample size problem are two significant limitations on applying the subspace methods favored in face recognition. In this paper, a new linear dimension reduction method called locally uncorrelated discriminant projections (LUDP) is proposed, which addresses both problems from a new perspective. More specifically, we propose a locally uncorrelated criterion, which aims to decorrelate the learned discriminant factors over data locally rather than globally. The statistical uncorrelation criterion has been shown to be an important property both for reducing dimension and for learning robust discriminant projections. However, data are always locally distributed, so it is more important to explore locally statistically uncorrelated discriminant information over the data. We impose this new constraint on a graph-based maximum margin analysis, so that LUDP also characterizes the local scatter as well as the nonlocal scatter, seeking a projection that maximizes the difference, rather than the ratio, between the nonlocal scatter and the local scatter. Experiments on the ORL, Yale, Extended Yale B, and FERET face databases demonstrate the effectiveness of the proposed method.
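To make the maximum-margin difference criterion concrete, the following is a minimal sketch of the graph-based part of such a method: build a k-nearest-neighbor graph, form local scatter from neighbor pairs and nonlocal scatter from the remaining pairs, and take the leading eigenvectors of their difference. This is an illustrative simplification only — it omits the paper's locally uncorrelated constraint, and the function name, parameter choices, and scatter normalization are assumptions, not the authors' implementation.

```python
import numpy as np

def max_margin_projection_sketch(X, n_components=2, k=5):
    """Hypothetical sketch: project onto directions maximizing the
    difference between nonlocal and local scatter (no uncorrelation
    constraint, unlike the LUDP method described in the abstract).

    X: (n_samples, n_features) data matrix.
    Returns a (n_features, n_components) projection matrix.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    # k-nearest-neighbor adjacency, symmetrized (exclude self at column 0).
    idx = np.argsort(D, axis=1)[:, 1:k + 1]
    W = np.zeros((n, n))
    for i in range(n):
        W[i, idx[i]] = 1.0
    W = np.maximum(W, W.T)
    # Pairwise difference vectors: (n, n, d).
    diffs = X[:, None, :] - X[None, :, :]
    # Local scatter over neighbor pairs; nonlocal = total - local.
    S_local = np.einsum('ij,ijk,ijl->kl', W, diffs, diffs) / (2.0 * n * n)
    S_total = np.einsum('ijk,ijl->kl', diffs, diffs) / (2.0 * n * n)
    S_nonlocal = S_total - S_local
    # Eigenvectors of the (symmetric) difference, largest eigenvalues first.
    vals, vecs = np.linalg.eigh(S_nonlocal - S_local)
    return vecs[:, ::-1][:, :n_components]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 10))
    P = max_margin_projection_sketch(X, n_components=2, k=5)
    print(P.shape)  # (10, 2); columns are orthonormal eigenvectors
```

Maximizing the difference of scatters (rather than the Fisher-style ratio) avoids inverting the local scatter matrix, which is singular in the small sample size regime the abstract highlights.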