Where Are Linear Feature Extraction Methods Applicable?
IEEE Transactions on Pattern Analysis and Machine Intelligence
In a previous paper [A linear classifier for Gaussian class conditional distributions with unequal covariance matrices], we presented a new linear classification algorithm, Principal Component Null Space Analysis (PCNSA), which is designed for problems like object recognition where different classes have unequal and non-white noise covariance matrices. PCNSA first obtains a principal components space (PCA space) for the entire data; in this PCA space, it finds for each class 'i' an M{i}-dimensional subspace along which the class's intra-class variance is smallest. We call this subspace an Approximate Null Space (ANS) since the lowest variance is usually "much smaller" than the highest. A query is classified into class 'i' if its distance from the class's mean in the class's ANS is minimal. In this paper, we discuss the PCNSA algorithm more precisely and derive tight upper bounds on its classification error probability. We use these expressions to compare the classification performance of PCNSA with that of Subspace Linear Discriminant Analysis (SLDA) [Subspace linear discriminant analysis for face recognition].
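The two-stage procedure described in the abstract (a global PCA projection followed by a per-class approximate null space, with classification by distance to the class mean inside each ANS) can be sketched as follows. This is a minimal illustrative implementation, not the authors' code; the function names, the dictionary-based model layout, and the dimension parameters `n_pca` and `n_ans` are assumptions chosen for clarity rather than notation from the paper.

```python
import numpy as np

def fit_pcnsa(X_by_class, n_pca, n_ans):
    """Sketch of PCNSA training (illustrative, not the authors' code).

    X_by_class: dict mapping class label -> (n_samples, n_features) array.
    n_pca:      dimension of the global PCA space.
    n_ans:      dimension of each class's approximate null space (ANS).
    """
    X_all = np.vstack(list(X_by_class.values()))
    mu_all = X_all.mean(axis=0)
    # Global PCA: top-n_pca principal directions of the pooled data.
    U, _, _ = np.linalg.svd((X_all - mu_all).T, full_matrices=False)
    W_pca = U[:, :n_pca]
    model = {"mu_all": mu_all, "W_pca": W_pca, "classes": {}}
    for c, X in X_by_class.items():
        Y = (X - mu_all) @ W_pca              # class data in PCA space
        mu_c = Y.mean(axis=0)
        evals, evecs = np.linalg.eigh(np.cov(Y, rowvar=False))
        # eigh returns eigenvalues in ascending order, so the first
        # n_ans eigenvectors span the lowest-variance directions: the ANS.
        model["classes"][c] = (mu_c, evecs[:, :n_ans])
    return model

def classify_pcnsa(model, x):
    """Assign x to the class minimizing the ANS-projected distance
    between x and that class's mean (both taken in the PCA space)."""
    y = (x - model["mu_all"]) @ model["W_pca"]
    return min(model["classes"],
               key=lambda c: np.linalg.norm(
                   (y - model["classes"][c][0]) @ model["classes"][c][1]))
```

One design point worth noting from the abstract: because each class's covariance is estimated separately, PCNSA can exploit unequal, non-white class covariances, which is exactly the regime where a single shared discriminant projection (as in SLDA) loses information.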