We introduce the quadratically gated mixture of experts (QGME), a statistical model for multi-class nonlinear classification. The QGME is formulated in the setting of incomplete data, where feature values may be only partially observed. We show that the missing values entail joint estimation of the data manifold and the classifier, which allows adaptive imputation during classifier learning. The expectation-maximization (EM) algorithm is derived for joint likelihood maximization, with adaptive imputation performed analytically in the E-step. The performance of the QGME is evaluated on three benchmark data sets, and the results show that the QGME yields significant improvements over competing methods.
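To illustrate the idea of analytic imputation inside the E-step, the following is a minimal sketch in the spirit of the abstract: a plain Gaussian mixture stands in for the full QGME, and all function names, the two-component setup, and the synthetic data are illustrative assumptions, not the paper's actual algorithm. Missing entries (NaNs) are imputed per component via the conditional Gaussian mean given the observed coordinates, and the M-step updates use these completions.

```python
# Sketch of EM with analytic E-step imputation (Gaussian mixture stand-in
# for QGME; names and settings are assumptions for illustration).
import numpy as np

def em_missing(X, K=2, iters=30, seed=0):
    """Fit a K-component Gaussian mixture to X (NaN = missing),
    imputing missing entries from each component's conditional Gaussian."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = np.nanmean(X, axis=0) + 0.1 * rng.standard_normal((K, d))
    Sigma = np.stack([np.eye(d)] * K)
    pi = np.full(K, 1.0 / K)
    for _ in range(iters):
        R = np.zeros((n, K))        # responsibilities
        Ximp = np.zeros((n, K, d))  # per-component completions
        for i in range(n):
            o = ~np.isnan(X[i])     # observed mask
            m = ~o                  # missing mask
            for k in range(K):
                So = Sigma[k][np.ix_(o, o)]
                xo = X[i, o] - mu[k, o]
                # component log-posterior from the observed marginal
                _, logdet = np.linalg.slogdet(So)
                R[i, k] = np.log(pi[k]) - 0.5 * (logdet + xo @ np.linalg.solve(So, xo))
                # analytic imputation: conditional mean of missing | observed
                xi = X[i].copy()
                if m.any():
                    Smo = Sigma[k][np.ix_(m, o)]
                    xi[m] = mu[k, m] + Smo @ np.linalg.solve(So, xo)
                Ximp[i, k] = xi
            R[i] = np.exp(R[i] - R[i].max())
            R[i] /= R[i].sum()
        # M-step on the imputed completions (the conditional-covariance
        # correction term is omitted here for brevity)
        Nk = R.sum(axis=0)
        pi = Nk / n
        for k in range(K):
            mu[k] = (R[:, k][:, None] * Ximp[:, k]).sum(axis=0) / Nk[k]
            diff = Ximp[:, k] - mu[k]
            Sigma[k] = (R[:, k][:, None, None] *
                        np.einsum('ni,nj->nij', diff, diff)).sum(axis=0) / Nk[k]
            Sigma[k] += 1e-6 * np.eye(d)  # ridge for numerical stability
    return pi, mu, Sigma, R
```

Because the imputation is recomputed from the current parameters at every E-step, the completions adapt as the model improves, which is the mechanism the abstract refers to as adaptive imputation during learning.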