Classification Probability Analysis of Principal Component Null Space Analysis

  • Authors:
  • Namrata Vaswani; Rama Chellappa

  • Affiliations:
  • University of Maryland, College Park; University of Maryland, College Park

  • Venue:
  • ICPR '04: Proceedings of the 17th International Conference on Pattern Recognition (ICPR'04), Volume 1
  • Year:
  • 2004

Abstract

In a previous paper [A linear classifier for Gaussian class conditional distributions with unequal covariance matrices], we presented a new linear classification algorithm, Principal Component Null Space Analysis (PCNSA), which is designed for problems like object recognition where different classes have unequal and non-white noise covariance matrices. PCNSA first obtains a principal components space (PCA space) for the entire data; in this PCA space, it finds for each class 'i' an M_i-dimensional subspace along which the class's intra-class variance is smallest. We call this subspace an Approximate Null Space (ANS), since the lowest variance is usually "much smaller" than the highest. A query is classified into class 'i' if its distance from the class's mean, measured in the class's ANS, is minimum. In this paper, we discuss the PCNSA algorithm more precisely and derive tight upper bounds on its classification error probability. We use these expressions to compare the classification performance of PCNSA with that of Subspace Linear Discriminant Analysis (SLDA) [Subspace linear discriminant analysis for face recognition].
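The classification rule described above can be sketched in a few lines of NumPy. This is only an illustrative reading of the abstract, not the authors' implementation: function names (`pcnsa_fit`, `pcnsa_predict`) and the choice of a single shared `ans_dim` for all classes (the paper allows a per-class M_i) are assumptions made here for brevity.

```python
import numpy as np

def pcnsa_fit(X, y, pca_dim, ans_dim):
    """Sketch of PCNSA training (assumed reading of the abstract):
    global PCA, then a per-class Approximate Null Space (ANS)
    spanned by the lowest-variance eigenvectors of the class
    covariance in PCA space."""
    # Global PCA on all training data
    mu = X.mean(axis=0)
    Xc = X - mu
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:pca_dim].T                       # (d, pca_dim) PCA basis
    Z = Xc @ P                               # data in PCA space

    model = {"mu": mu, "P": P, "classes": {}}
    for c in np.unique(y):
        Zc = Z[y == c]
        mean_c = Zc.mean(axis=0)
        cov_c = np.cov(Zc, rowvar=False)
        evals, evecs = np.linalg.eigh(cov_c)  # eigenvalues ascending
        W = evecs[:, :ans_dim]                # ANS: smallest-variance directions
        model["classes"][c] = (mean_c, W)
    return model

def pcnsa_predict(model, x):
    """Classify a query by minimum distance to each class mean,
    measured inside that class's ANS."""
    z = (x - model["mu"]) @ model["P"]
    best, best_d = None, np.inf
    for c, (mean_c, W) in model["classes"].items():
        d = np.linalg.norm(W.T @ (z - mean_c))
        if d < best_d:
            best, best_d = c, d
    return best
```

The key design point the abstract emphasizes is that each class keeps its *own* low-variance subspace, so unequal class covariances are handled directly, unlike a single shared discriminant projection.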