Asymptotic Law of Likelihood Ratio for Multilayer Perceptron Models

  • Authors:
  • Joseph Rynkiewicz

  • Affiliations:
  • CES-SAMOS-MATISSE, Université de Paris 1, 75013 Paris

  • Venue:
  • ISNN '08: Proceedings of the 5th International Symposium on Neural Networks: Advances in Neural Networks
  • Year:
  • 2008

Abstract

We consider regression models involving multilayer perceptrons (MLPs) with one hidden layer and Gaussian noise. The data are assumed to be generated by a true MLP model, and the parameters of the MLP are estimated by maximizing the likelihood of the model. When the number of hidden units of the model is over-estimated, the Fisher information matrix of the model is singular, and the asymptotic behavior of the likelihood ratio (LR) statistic is unknown, or can be divergent if the set of possible parameters is too large. This paper deals with this case and gives the exact asymptotic law of the LR statistic. Namely, if the parameters of the MLP lie in a suitable compact set, we show that the LR statistic converges to the maximum of the square of a Gaussian process indexed by a class of limit score functions.
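The setup described in the abstract can be illustrated with a minimal numerical sketch (not from the paper): data are simulated from a true one-hidden-unit MLP with Gaussian noise, then both a correctly sized and an over-parameterized MLP are fitted by maximum likelihood, and the LR statistic is formed. All function names, the data-generating parameters, and the use of `scipy.optimize` are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical data from a true one-hidden-unit MLP plus Gaussian noise.
n = 200
x = rng.uniform(-2.0, 2.0, n)
y = 1.5 * np.tanh(2.0 * x + 0.5) + 0.3 * rng.normal(size=n)

def mlp(params, x, k):
    # One-hidden-layer MLP with k tanh units: sum_i a_i * tanh(b_i * x + c_i).
    # Parameter layout: params = [a_1..a_k, b_1..b_k, c_1..c_k].
    a, b, c = params[:k], params[k:2 * k], params[2 * k:3 * k]
    return np.tanh(np.outer(x, b) + c) @ a

def neg_log_lik(params, x, y, k):
    # Gaussian negative log-likelihood profiled over the noise variance:
    # up to additive constants, (n/2) * log(RSS / n).
    rss = np.sum((y - mlp(params, x, k)) ** 2)
    return 0.5 * len(y) * np.log(rss / len(y))

def fit(k, inits):
    # Multi-start local optimization; keep the best local minimum found.
    best = None
    for p0 in inits:
        res = minimize(neg_log_lik, p0, args=(x, y, k), method="BFGS")
        if best is None or res.fun < best.fun:
            best = res
    return best

# Fit the correctly sized model (k = 1).
res1 = fit(1, [rng.normal(size=3) for _ in range(10)])

# Fit the over-parameterized model (k = 2), seeding one restart with the
# k = 1 solution padded by a zeroed extra unit so the smaller model is nested.
a1, b1, c1 = res1.x
nested = np.array([a1, 0.0, b1, 0.0, c1, 0.0])
res2 = fit(2, [nested] + [rng.normal(size=6) for _ in range(9)])

# LR statistic; nonnegative by construction of the nested restart.
lr = 2.0 * (res1.fun - res2.fun)
print(f"LR statistic: {lr:.3f}")
```

Because the noise variance is profiled out, the LR statistic here reduces to `n * log(RSS_small / RSS_large)`; the paper's result characterizes its limiting distribution in the singular, over-estimated regime.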