Presupervised and post-supervised prototype classifier design

  • Authors:
  • L. I. Kuncheva; J. C. Bezdek

  • Affiliations:
  • Sch. of Math., Univ. of Wales, Bangor; -

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 1999

Abstract

We extend the nearest prototype classifier to a generalized nearest prototype classifier (GNPC). The GNPC uses “soft” labeling of the prototypes in the classes, thereby encompassing a variety of classifiers. Based on how the prototypes are found, we distinguish between presupervised and post-supervised GNPC designs. We derive the conditions for optimality of two designs where the prototypes represent: 1) the components of the class-conditional mixture densities (presupervised design); or 2) the components of the unconditional mixture density (post-supervised design). An artificial data set and the “satimage” data set from the ELENA database are used to study the two approaches experimentally. A radial basis function network is used as a representative of each GNPC type. Neither the theoretical nor the experimental results indicate clear reasons to prefer one approach over the other. The post-supervised GNPC design tends to be more robust but less accurate than the presupervised one.
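
To make the GNPC idea concrete: each prototype carries a soft class label rather than a crisp class assignment, and an input is classified by aggregating those labels weighted by its similarity to each prototype. The abstract leaves the similarity measure and aggregation rule general, so the sketch below is only illustrative, assuming a Gaussian similarity kernel and a similarity-weighted sum as the aggregation; the function name `gnpc_predict` and the parameter `sigma` are hypothetical, not from the paper.

```python
import numpy as np

def gnpc_predict(x, prototypes, soft_labels, sigma=1.0):
    """Classify a sample with a generalized nearest prototype classifier.

    x          : (n,) feature vector to classify
    prototypes : (p, n) array of prototype locations
    soft_labels: (p, c) array; row i is the soft class label of prototype i
    sigma      : width of the (assumed) Gaussian similarity kernel
    """
    # Similarity of x to each prototype (illustrative Gaussian kernel;
    # the paper treats the similarity and aggregation choices generally).
    sq_dists = np.sum((prototypes - x) ** 2, axis=1)
    similarities = np.exp(-sq_dists / (2.0 * sigma ** 2))

    # Aggregate the soft labels, weighted by similarity, into a
    # per-class support vector of shape (c,), then pick the largest.
    support = similarities @ soft_labels
    return int(np.argmax(support))
```

With crisp (one-hot) labels and a kernel width shrinking toward zero, the nearest prototype dominates the weighted sum and the rule reduces to the ordinary nearest prototype classifier, which is the sense in which the GNPC generalizes it.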