Estimating the number of components in a mixture of multilayer perceptrons

  • Authors:
  • M. Olteanu; J. Rynkiewicz

  • Affiliations:
  • SAMOS-MATISSE-CES, Université Paris 1, UMR 8174, 90 Rue de Tolbiac, 75013 Paris, France (both authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2008


Abstract

The Bayesian information criterion (BIC) is widely used by the neural-network community for model selection tasks, although its convergence properties are not always theoretically established. In this paper we focus on estimating the number of components in a mixture of multilayer perceptrons and on proving the convergence of the BIC criterion in this framework. The penalized marginal-likelihood approach for mixture models and hidden Markov models introduced by Keribin [Consistent estimation of the order of mixture models, Sankhya Indian J. Stat. 62 (2000) 49-66] and, respectively, Gassiat [Likelihood ratio inequalities with applications to various mixtures, Ann. Inst. Henri Poincaré 38 (2002) 897-906] is extended to mixtures of multilayer perceptrons, for which a penalized-likelihood criterion is proposed. We prove its convergence under hypotheses that essentially involve the bracketing entropy of the generalized score-function class, and we illustrate the result with numerical examples.
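To make the model-selection principle concrete, the sketch below applies BIC (−2 log-likelihood plus a log(n)-scaled penalty on the number of free parameters) to choose the number of components of a toy one-dimensional Gaussian mixture fitted by EM. This is only an illustrative analogue of the paper's setting: the paper treats mixtures of multilayer perceptrons, whereas here the components, the EM fitting routine, and the parameter count `3k - 1` (k means, k standard deviations, k − 1 free weights) are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: two well-separated Gaussian components.
x = np.concatenate([rng.normal(-3.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])
n = len(x)

def fit_gmm(x, k, iters=200):
    """Fit a 1-D Gaussian mixture with k components by EM; return the max log-likelihood."""
    init = np.random.default_rng(1)
    mu = init.choice(x, k, replace=False)          # means initialized at data points
    sigma = np.full(k, x.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: per-point, per-component weighted densities and responsibilities.
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
               / (sigma * np.sqrt(2.0 * np.pi))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means, standard deviations.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        sigma = np.maximum(sigma, 0.1)             # floor to avoid degenerate components
    dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
           / (sigma * np.sqrt(2.0 * np.pi))
    return np.log(dens.sum(axis=1)).sum()

def bic(loglik, n_params, n):
    # Lower BIC is better: fit term plus complexity penalty.
    return -2.0 * loglik + n_params * np.log(n)

scores = {k: bic(fit_gmm(x, k), 3 * k - 1, n) for k in range(1, 5)}
best_k = min(scores, key=scores.get)
print(best_k)
```

Because the two clusters are well separated, the drop in −2 log-likelihood from one to two components dwarfs the log(n) penalty, while further components buy too little fit to justify their extra parameters, so BIC selects two components.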