Hierarchical mixtures of experts and the EM algorithm. Neural Computation.
Maximum conditional likelihood via bound maximization and the CEM algorithm. Advances in Neural Information Processing Systems 11 (NIPS 1998).
An entropic estimator for structure discovery. Advances in Neural Information Processing Systems 11 (NIPS 1998).
Neural Networks for Pattern Recognition.
Discriminative training of Gaussian mixture models for large vocabulary speech recognition systems. Proceedings of ICASSP '96, IEEE International Conference on Acoustics, Speech, and Signal Processing, Volume 2.
Shared kernel models for class conditional density estimation. IEEE Transactions on Neural Networks.
Real-Time Gesture Recognition by Learning and Selective Control of Visual Interest Points. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Proceedings of the 2009 International Joint Conference on Neural Networks (IJCNN'09).
Efficiently explaining decisions of probabilistic RBF classification networks. Proceedings of the 10th International Conference on Adaptive and Natural Computing Algorithms (ICANNGA'11), Part I.
Active learning with the probabilistic RBF classifier. Proceedings of the 16th International Conference on Artificial Neural Networks (ICANN'06), Part I.
Quality of classification explanations with PRBF. Neurocomputing.
We propose a generative mixture model classifier that allows the class-conditional densities to be represented by mixtures in which certain subsets of the components are shared among classes. We argue that, when the total number of mixture components is kept fixed, the most efficient classification model is obtained by appropriately determining the sharing of components among the class-conditional densities. To discover such an efficient model, we derive a training method based on the EM algorithm that automatically adjusts component sharing. We present experimental results demonstrating good classification performance.
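As an illustrative sketch only (not the authors' exact formulation), the fully shared limiting case — a single common pool of Gaussian components for all classes, with only the mixing weights being class specific — can be trained with EM roughly as follows. All function and variable names here are hypothetical, and 1-D Gaussians are used for brevity:

```python
import numpy as np

def em_shared_mixture(X, y, n_components=3, n_iter=50):
    """Illustrative EM for class-conditional densities of the form
    p(x | c) = sum_j w[c, j] * N(x; mu[j], var[j]),
    where the Gaussian components (mu, var) are shared by all classes and
    only the mixing weights w[c, j] are class specific.
    X: 1-D data array; y: integer labels 0..C-1."""
    classes = np.unique(y)
    C, J = len(classes), n_components
    # Spread the shared components over the data range; moderate initial variance.
    mu = np.quantile(X, np.linspace(0.1, 0.9, J))
    var = np.full(J, X.var() / J + 1e-6)
    w = np.full((C, J), 1.0 / J)
    for _ in range(n_iter):
        # E-step: responsibilities, using each point's own class weights.
        dens = np.exp(-0.5 * (X[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = w[y] * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: mixing weights are re-estimated per class, but the means and
        # variances are pooled over ALL classes -- this pooling is what
        # component sharing amounts to.
        for c in classes:
            w[c] = r[y == c].mean(axis=0)
        Nj = r.sum(axis=0)
        mu = (r * X[:, None]).sum(axis=0) / Nj
        var = (r * (X[:, None] - mu) ** 2).sum(axis=0) / Nj + 1e-6
    return mu, var, w

def classify(X, mu, var, w, priors):
    """Bayes rule: pick argmax_c p(c) * p(x | c)."""
    dens = np.exp(-0.5 * (X[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    return (priors * (dens @ w.T)).argmax(axis=1)
```

The paper's contribution lies between the two extremes this sketch brackets: rather than sharing every component (as above) or giving each class its own mixture, the EM-based procedure adjusts which subsets of components are shared among the class-conditional densities.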