For classification tasks, generative classifiers sometimes offer advantages over purely discriminative ones: loss functions can be incorporated more naturally, for instance, and rejection criteria are easier to define. We show how a radial basis function (RBF) network with multivariate (elliptical) Gaussian basis functions can be trained in two different ways to obtain a classifier with either a more generative or a more discriminative behavior. The generative classifier admits a probabilistic interpretation of both the external outputs (posterior probabilities of class membership) and the hidden neurons' activations (posterior probabilities of the model's components). For this purpose, a variational Bayesian inference approach is applied, which also determines an appropriate number of hidden neurons (i.e., components) "on the fly". The discriminative classifier is obtained with the resilient propagation (RProp) training technique. We investigate the properties of the two training techniques in detail by introducing a measure of the trained classifiers' generative properties and by comparing the classifiers on various data sets.
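As a rough illustration of the generative reading described above, the following sketch (function and variable names are hypothetical, not taken from the paper) computes the forward pass of an RBF network with multivariate Gaussian basis functions: the normalized hidden activations can be interpreted as posterior probabilities of the mixture components, and the normalized outputs as posterior probabilities of class membership.

```python
import numpy as np

def gaussian_density(x, mean, cov):
    """Density of one multivariate (elliptical) Gaussian component at x."""
    d = x - mean
    norm = np.sqrt((2 * np.pi) ** len(x) * np.linalg.det(cov))
    return np.exp(-0.5 * d @ np.linalg.inv(cov) @ d) / norm

def rbf_forward(x, means, covs, priors, weights):
    """Hidden layer: component posteriors p(j | x); output: class posteriors.

    `weights` maps components to classes (e.g., mixing proportions per class).
    """
    dens = np.array([p * gaussian_density(x, m, c)
                     for p, m, c in zip(priors, means, covs)])
    hidden = dens / dens.sum()      # posterior p(component j | x)
    out = weights @ hidden          # unnormalized class scores
    return hidden, out / out.sum()  # posterior p(class k | x)
```

Both the hidden activations and the outputs sum to one by construction, which is what makes the probabilistic interpretation possible; the discriminative (RProp-trained) variant would adjust the same parameters by gradient information instead.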