Training of radial basis function classifiers with resilient propagation and variational Bayesian inference

  • Authors:
  • Dominik Fisch; Bernhard Sick

  • Affiliations:
  • Faculty of Computer Science and Mathematics, University of Passau, Germany (both authors)

  • Venue:
  • IJCNN'09: Proceedings of the 2009 International Joint Conference on Neural Networks
  • Year:
  • 2009

Abstract

For classification tasks, generative classifiers sometimes have advantages over purely discriminative ones because, for instance, loss functions can be incorporated or rejection criteria defined more easily. We show how a radial basis function (RBF) network with multivariate (elliptical) Gaussian basis functions can be trained in two different ways to obtain a classifier with either a more generative or a more discriminative behavior. Our generative classifier allows a probabilistic interpretation of the external outputs (posterior probability of class membership) and of the hidden neurons' activations (posterior probability of a component of the model). For that purpose a variational Bayesian inference approach is applied, which also finds an appropriate number of hidden neurons (i.e., components) "on the fly". A discriminative classifier is obtained using the resilient propagation (Rprop) training technique. We investigate the properties of the two training techniques in detail by introducing a measure for the generative properties of the trained classifiers and by comparing these classifiers on various data sets.
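
To make the architecture described in the abstract concrete, the following sketch (not the authors' implementation; all variable names, hyperparameters, and the toy data are illustrative assumptions) builds an RBF classifier whose hidden activations are multivariate Gaussian component responsibilities and trains the output weights discriminatively with a basic Rprop-style update. The variational Bayesian step that the paper uses to fit the components and choose their number is not reproduced here; the component parameters below are simply fixed.

```python
# Minimal sketch (assumed names and data): RBF classifier with multivariate
# Gaussian basis functions. Hidden activations are component responsibilities
# p(j | x) under equal component priors; output weights are trained with a
# simple sign-based Rprop update (iRprop- variant, no weight backtracking).
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class, 2-D data (assumption, for illustration only).
X = np.vstack([rng.normal([-2.0, 0.0], 1.0, (100, 2)),
               rng.normal([+2.0, 0.0], 1.0, (100, 2))])
y = np.repeat([0, 1], 100)
T = np.eye(2)[y]                       # one-hot class targets

# Fixed Gaussian components; in the paper these would come from
# variational Bayesian inference, which also selects their number.
means = np.array([[-2.0, 0.0], [2.0, 0.0], [0.0, 0.0]])
covs = np.array([np.eye(2)] * 3)

def responsibilities(X):
    """Hidden activations: posterior probability of each component given x."""
    dens = []
    for m, S in zip(means, covs):
        d = X - m
        Sinv = np.linalg.inv(S)
        norm = 1.0 / np.sqrt(((2 * np.pi) ** 2) * np.linalg.det(S))
        dens.append(norm * np.exp(-0.5 * np.einsum('ni,ij,nj->n', d, Sinv, d)))
    dens = np.array(dens).T            # shape (N, number_of_components)
    return dens / dens.sum(axis=1, keepdims=True)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Output weights trained with Rprop: per-weight step sizes adapted from the
# sign of the gradient in successive epochs.
W = rng.normal(0.0, 0.1, (3, 2))
step = np.full_like(W, 0.1)
prev_grad = np.zeros_like(W)
eta_plus, eta_minus, step_max, step_min = 1.2, 0.5, 1.0, 1e-6

H = responsibilities(X)
for epoch in range(200):
    P = softmax(H @ W)                 # posterior class probabilities
    grad = H.T @ (P - T) / len(X)      # gradient of the cross-entropy loss
    same_sign = np.sign(grad) * np.sign(prev_grad)
    step = np.where(same_sign > 0, np.minimum(step * eta_plus, step_max),
           np.where(same_sign < 0, np.maximum(step * eta_minus, step_min), step))
    grad = np.where(same_sign < 0, 0.0, grad)   # skip update after a sign flip
    W -= np.sign(grad) * step
    prev_grad = grad

print("training accuracy:", (softmax(H @ W).argmax(axis=1) == y).mean())
```

Normalizing the Gaussian densities into responsibilities is what gives the hidden layer its probabilistic reading; replacing the fixed components with ones fitted by variational Bayesian inference, or training all parameters with Rprop, would shift the classifier toward the more generative or more discriminative behavior the abstract contrasts.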