The PDF projection theorem and the class-specific method

  • Authors: P.M. Baggenstoss
  • Affiliation: Naval Undersea Warfare Center, Newport, RI, USA
  • Venue: IEEE Transactions on Signal Processing
  • Year: 2003

Abstract

We present the theoretical foundation for optimal classification using class-specific features and provide examples of its use. A new probability density function (PDF) projection theorem makes it possible to project probability density functions from a low-dimensional feature space back to the raw data space. An M-ary classifier is constructed by estimating the PDFs of class-specific features, then transforming each PDF back to the raw data space, where the projected PDFs can be fairly compared. Although statistical sufficiency is not a requirement, the classifier thus constructed becomes equivalent to the optimal Bayes classifier if the features meet sufficiency requirements individually for each class. This classifier is completely modular and avoids the curse of dimensionality associated with large, complex problems. By recursive application of the projection theorem, it is possible to analyze complex signal processing chains. We apply the method to several feature sets, including linear functions of independent random variables, the cepstrum, and the Mel cepstrum. In addition, we demonstrate how the feature and model selection process can be automated by direct comparison of log-likelihood values on the common raw data domain.
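The projection step described in the abstract can be sketched numerically. In the class-specific literature, the projected PDF of raw data x given a feature z = T(x) and a reference hypothesis H0 takes the form log p_proj(x) = log p(x|H0) - log p(z|H0) + log p̂(z), where p̂ is the PDF estimated in feature space. The sketch below is an illustration under assumed choices not taken from this abstract: H0 is i.i.d. standard Gaussian data, and the feature is the energy statistic z = Σ xᵢ², which is chi-squared with N degrees of freedom under H0, so both reference densities are known in closed form. The function name `projected_logpdf` is hypothetical.

```python
import numpy as np
from scipy.stats import chi2, norm

def projected_logpdf(x, feature_logpdf):
    """Sketch of the PDF projection step (illustrative assumptions).

    Reference hypothesis H0: x ~ N(0, I_N) (assumed here).
    Feature: z = sum(x_i^2), chi-squared with N dof under H0.

    Returns log p_proj(x) = log p(x|H0) - log p(z|H0) + log p_hat(z),
    where p_hat is the feature-space PDF supplied by the caller.
    """
    N = len(x)
    z = np.sum(x ** 2)
    log_px_h0 = np.sum(norm.logpdf(x))   # log p(x | H0), i.i.d. Gaussian
    log_pz_h0 = chi2.logpdf(z, df=N)     # log p(z | H0), exact chi-squared
    return log_px_h0 - log_pz_h0 + feature_logpdf(z)

# Sanity check: if the estimated feature PDF equals the H0 feature PDF,
# the projected PDF must reduce to p(x | H0) itself.
x = np.array([0.3, -1.2, 0.8, 2.0])
lp = projected_logpdf(x, lambda z: chi2.logpdf(z, df=len(x)))
assert np.isclose(lp, np.sum(norm.logpdf(x)))
```

In an M-ary classifier built this way, each class would use its own feature map and estimated feature PDF, and the resulting projected log-likelihoods, all defined on the common raw data domain, would be compared directly, which is also the basis for the automated feature and model selection mentioned above.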