This paper addresses the statistical properties of the likelihood ratio test statistic (LRTS) for mixture-of-experts models. The question is essential when estimating the number of experts in the model. Our purpose is to extend the existing results for simple mixture models (Liu and Shao, 2003 [8]) and mixtures of multilayer perceptrons (Olteanu and Rynkiewicz, 2008 [9]). We first study a simple example that embodies all the difficulties arising in such models. We find that in the most general case the LRTS diverges, but that, under additional assumptions, the behavior of such models can be fully characterized.
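As a rough illustration of the quantity under study (a minimal sketch, not the paper's method): the LRTS compares the maximized log-likelihood of a model with more experts against one with fewer. Below, the "experts" are plain Gaussian components, the two-component fit uses a tiny hand-rolled EM loop, and all names and settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def loglik_one(x):
    # Closed-form maximized log-likelihood of a single Gaussian (MLE mean/variance).
    var = x.var()
    return -0.5 * len(x) * (np.log(2.0 * np.pi * var) + 1.0)

def loglik_two(x, n_iter=200):
    # Tiny EM for a two-component Gaussian mixture; illustrative, not robust.
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: component densities and responsibilities.
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and variances.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)
    return np.log(dens.sum(axis=1)).sum()

x = rng.normal(size=500)                      # data drawn from one component, so H0 holds
lrts = 2.0 * (loglik_two(x) - loglik_one(x))  # LRTS for H0: one expert vs H1: two experts
print("LRTS:", lrts)
```

Because the one-component model sits on the boundary of the two-component family (equal means, or a zero mixing weight), the LRTS here does not follow the classical chi-squared asymptotics, which is precisely the difficulty the abstract refers to.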