Variational Bayesian Learning of Probabilistic Discriminative Models With Latent Softmax Variables
IEEE Transactions on Signal Processing
Discriminative subclass models can provide good estimates of complex 'continuous to discrete' conditional probabilities for hybrid Bayesian network models. However, the conventional approach of specifying deterministic 'hard' subclasses via unsupervised clustering can lead to inaccurate models. The multimodal softmax (MMS) model is presented as a new probabilistic discriminative subclass model that overcomes this limitation. By invoking fully probabilistic latent 'soft' subclasses, MMS permits learning via standard statistical methods without requiring explicit clustering or relabeling of data. MMS is also shown to be closely related to the mixture of experts model and the generative Gaussian mixture classifier. Synthetic and benchmark classification results demonstrate the MMS model's correctness and usefulness for hybrid probabilistic modeling.
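The core prediction step the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the standard multimodal-softmax form in which one softmax is taken over all subclass logits, and the class-conditional probability is obtained by marginalizing (summing) the soft subclass probabilities belonging to each class. The function and variable names (`mms_class_probs`, `subclass_to_class`, etc.) are illustrative.

```python
import numpy as np

def mms_class_probs(X, W, b, subclass_to_class, n_classes):
    """Multimodal softmax sketch: one logit per latent subclass,
    softmax over all subclasses, then sum within each class.

    X: (n, d) inputs; W: (S, d) subclass weights; b: (S,) biases;
    subclass_to_class: length-S list mapping subclass index -> class index.
    """
    logits = X @ W.T + b                          # (n, S) subclass logits
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(logits)
    p_sub = e / e.sum(axis=1, keepdims=True)      # soft P(subclass | x)
    p_class = np.zeros((X.shape[0], n_classes))
    for s, c in enumerate(subclass_to_class):
        p_class[:, c] += p_sub[:, s]              # marginalize latent subclass
    return p_class

# Toy usage: 2 classes; class 0 gets two subclasses, class 1 gets one.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
W = rng.normal(size=(3, 3))
b = np.zeros(3)
P = mms_class_probs(X, W, b, subclass_to_class=[0, 0, 1], n_classes=2)
```

Because the subclass assignments are soft, the resulting class posteriors are multimodal in the input space while remaining proper probabilities (each row of `P` sums to one), which is what allows standard maximum-likelihood or variational training without a separate clustering/relabeling step.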