Asymptotically optimal discriminant functions for pattern classification

  • Authors:
  • C. Wolverton; T. Wagner

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 1969

Abstract

The two-category classification problem is treated. No a priori knowledge of the statistics of the classes is assumed. A sequence of labeled samples from the two classes is used to construct a sequence of approximations to a discriminant function that is optimum in the sense of minimizing the probability of misclassification but requires knowledge of all the statistics of the classes. Depending on the assumptions made about the probability densities corresponding to the two classes, the integrated square error of the approximations converges to 0 in probability or with probability 1. The approximations are nonparametric and recursive at each fixed point of the domain. Rates of convergence are given. The approximations are used to define a decision procedure for classifying unlabeled samples. It is shown that as the number of labeled samples used to construct the approximations increases, the resulting sequence of discriminant functions is asymptotically optimal: the probability of misclassification obtained by using the approximations in the decision procedure converges, in probability or with probability 1 depending on the assumptions made, to the probability of misclassification of the optimum discriminant function. The results extend easily to the multicategory problem and to arbitrary loss functions, that is, to the case where the costs of misclassification are not necessarily equal to 1.
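
The abstract describes a recursive, nonparametric approximation of the optimal discriminant that is updated one labeled sample at a time and then used to classify unlabeled points by its sign. The sketch below illustrates one way such a scheme can be organized; it is not the authors' construction. The Gaussian kernel, the shrinking bandwidth sequence h_n = n^(-1/(d+4)), and all names (RecursiveDiscriminant, update, classify) are illustrative assumptions for this sketch only.

```python
import numpy as np

def gaussian_kernel(u):
    """Product Gaussian kernel evaluated at each row of u (shape n x d)."""
    d = u.shape[1]
    return np.exp(-0.5 * np.sum(u * u, axis=1)) / (2.0 * np.pi) ** (d / 2.0)

class RecursiveDiscriminant:
    """Running approximation of a discriminant of the form
    g(x) = P(class 1) f1(x) - P(class 2) f2(x), maintained at a fixed set of
    evaluation points and updated one labeled sample at a time."""

    def __init__(self, eval_points):
        self.x = np.atleast_2d(np.asarray(eval_points, dtype=float))  # fixed points of the domain
        self.g = np.zeros(len(self.x))   # current discriminant estimate at those points
        self.n = 0                       # number of labeled samples processed

    def update(self, sample, label):
        """Incorporate one labeled sample; label is +1 (class 1) or -1 (class 2)."""
        self.n += 1
        d = self.x.shape[1]
        h = self.n ** (-1.0 / (d + 4))   # illustrative shrinking bandwidth, h_n -> 0
        k = gaussian_kernel((self.x - np.asarray(sample, dtype=float)) / h) / h ** d
        # Recursive running-average update: only the previous estimate and the
        # newest labeled sample are needed, never the full history of samples.
        self.g += (label * k - self.g) / self.n

    def classify(self):
        """Decide class 1 where the estimated discriminant is nonnegative."""
        return np.where(self.g >= 0.0, 1, -1)

# Example: two overlapping 1-D Gaussian classes, classified on a small grid.
rng = np.random.default_rng(0)
grid = np.linspace(-4.0, 4.0, 9).reshape(-1, 1)
clf = RecursiveDiscriminant(grid)
for _ in range(2000):
    if rng.random() < 0.5:
        clf.update([rng.normal(-1.0, 1.0)], +1)   # labeled sample from class 1
    else:
        clf.update([rng.normal(+1.0, 1.0)], -1)   # labeled sample from class 2
print(clf.classify())   # roughly +1 left of the class midpoint, -1 to the right
```

In this sketch the signed labels fold the class priors into the running average, so the estimate at each fixed point tracks a quantity of the form P(class 1) f1(x) - P(class 2) f2(x) as labeled samples accumulate, and classification reduces to taking the sign of the estimate.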