Neural classifiers and statistical pattern recognition: applications for currently established links

  • Authors:
  • H. Osman; M. M. Fahmy

  • Affiliations:
  • Dept. of Electr. & Comput. Eng., Queen's Univ., Kingston, Ont.

  • Venue:
  • IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
  • Year:
  • 1997

Abstract

Recent research has linked backpropagation (BP) and radial basis function (RBF) network classifiers, trained by minimizing the standard mean square error (MSE), to two main topics in statistical pattern recognition (SPR), namely Bayes decision theory and discriminant analysis. So far, however, establishing these links has yielded only a few practical applications for training, using, and evaluating these classifiers. The paper aims at providing more such applications. It first illustrates that, when training a linear-output BP network, explicitly exploiting the network's discriminant capability improves its classification performance. Then, for linear-output BP and RBF networks, the paper defines a new generalization measure that indicates how close the network's classification performance is to the optimal performance. The estimation procedure for this measure is described, and its use as an efficient criterion for terminating the learning algorithm and choosing the network topology is explained. The paper finally proposes an upper bound on the number of hidden units an RBF network classifier needs to achieve an arbitrary value of the minimized MSE. Experimental results are presented to validate all proposed applications.
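
The sketch below is not the authors' code; it is a minimal illustration of the established link the abstract builds on: a linear-output network trained by minimizing the MSE against 1-of-C target vectors approximates the Bayes posterior probabilities P(class | x). The data distribution, architecture, and hyperparameters are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (assumed setup, not from the paper): a linear-output network
# trained with the standard MSE on 1-of-C targets approximates Bayes posteriors.
import numpy as np

rng = np.random.default_rng(0)

# Two 1-D Gaussian classes (means -1 and +1, unit variance, equal priors),
# so the true Bayes posterior is known in closed form for comparison.
n = 2000
y = rng.integers(0, 2, size=n)                       # class labels 0 / 1
x = rng.normal(loc=np.where(y == 0, -1.0, 1.0), scale=1.0)[:, None]
T = np.eye(2)[y]                                     # 1-of-C target vectors

# Linear-output network with one tanh hidden layer, trained by full-batch
# gradient descent on the mean square error.
H = 16
W1 = rng.normal(0, 0.5, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 2)); b2 = np.zeros(2)
lr = 0.05
for _ in range(3000):
    h = np.tanh(x @ W1 + b1)
    out = h @ W2 + b2                                # linear outputs (no softmax)
    err = out - T                                    # dMSE/dout, up to a constant
    gW2 = h.T @ err / n; gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = x.T @ dh / n; gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Compare the trained network's outputs with the true posterior P(y=1 | x).
xs = np.linspace(-4, 4, 9)[:, None]
net = np.tanh(xs @ W1 + b1) @ W2 + b2
p1 = 1.0 / (1.0 + np.exp(-2.0 * xs))                 # closed-form Bayes posterior
for xi, o, p in zip(xs.ravel(), net[:, 1], p1.ravel()):
    print(f"x={xi:+.1f}  net output={o:.3f}  Bayes posterior={p:.3f}")
```

On this toy problem the second network output tracks the true posterior closely over the region where data is dense, which is the property the paper's generalization measure and stopping criterion exploit; the paper's own measure and the RBF hidden-unit bound are not reproduced here because the abstract does not give their formulas.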