Asymptotic properties of mixture-of-experts models

  • Authors:
  • M. Olteanu; J. Rynkiewicz

  • Affiliations:
  • SAMM, EA 4543, Université Paris 1, 90 Rue de Tolbiac, 75013 Paris, France (both authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2011

Abstract

This paper addresses the statistical properties of the likelihood ratio test statistic (LRTS) for mixture-of-experts models. This question is essential when estimating the number of experts in the model. Our purpose is to extend the existing results for simple mixture models (Liu and Shao, 2003 [8]) and mixtures of multilayer perceptrons (Olteanu and Rynkiewicz, 2008 [9]). We first study a simple example that embodies all the difficulties arising in such models. We find that in the most general case the LRTS diverges, but that, under additional assumptions, its behavior can be fully characterized.
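The abstract concerns using the LRTS to choose the number of components in a mixture. As a hedged illustration only (not the authors' method or their mixture-of-experts setting), the sketch below computes the LRTS between a one-component Gaussian model and a two-component Gaussian mixture fitted by EM, i.e. the simple-mixture situation studied by Liu and Shao [8]. All function names, the EM initialization, and the variance floor are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=500)  # data drawn under the null (one component)

def normal_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def loglik_one(x):
    # Closed-form MLE for a single Gaussian: sample mean and variance.
    mu, var = x.mean(), x.var()
    return np.sum(np.log(normal_pdf(x, mu, var)))

def loglik_two(x, n_iter=200):
    # Plain EM for a two-component Gaussian mixture,
    # initialized from the data quantiles (an arbitrary illustrative choice).
    w = 0.5
    mu1, mu2 = np.quantile(x, 0.25), np.quantile(x, 0.75)
    v1 = v2 = x.var()
    for _ in range(n_iter):
        p1 = w * normal_pdf(x, mu1, v1)
        p2 = (1 - w) * normal_pdf(x, mu2, v2)
        r = p1 / (p1 + p2)                        # responsibilities of component 1
        w = r.mean()
        mu1 = np.sum(r * x) / np.sum(r)
        mu2 = np.sum((1 - r) * x) / np.sum(1 - r)
        v1 = np.sum(r * (x - mu1) ** 2) / np.sum(r) + 1e-6        # variance floor
        v2 = np.sum((1 - r) * (x - mu2) ** 2) / np.sum(1 - r) + 1e-6
    return np.sum(np.log(w * normal_pdf(x, mu1, v1)
                         + (1 - w) * normal_pdf(x, mu2, v2)))

# LRTS = 2 * (log-likelihood of larger model - log-likelihood of smaller model).
lrts = 2 * (loglik_two(x) - loglik_one(x))
print(lrts)
```

Because the models are nested, the LRTS is non-negative up to EM convergence error; the point made in the abstract is that, under general conditions, its asymptotic distribution is non-standard and it may even diverge as the sample size grows.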