Bayesian nonlinear model selection and neural networks: a conjugate prior approach

  • Authors:
  • J.-P. Vila; V. Wagner; P. Neveu

  • Affiliations:
  • Lab. d'Analyse des Syst. et de Biométrie, INRA-ENSAM, Montpellier

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2000

Abstract

In order to select the best predictive neural-network architecture among a set of candidate networks, we propose a general Bayesian nonlinear regression model comparison procedure based on the maximization of an expected utility criterion. This criterion selects the model under which the training set achieves the highest level of internal consistency, through the predictive probability distribution of each model. The density of this distribution is computed as the model posterior predictive density and is asymptotically approximated from the assumed Gaussian likelihood of the data set and the associated conjugate prior density of the parameters. The use of such a conjugate prior allows the analytic calculation of the parameter posterior and predictive posterior densities, in an empirical-Bayes-like approach. This Bayesian selection procedure makes it possible to compare general nonlinear regression models, and in particular feedforward neural networks, in addition to the embedded (nested) models to which the usual asymptotic comparison tests are restricted.
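
To make the conjugate-prior mechanics concrete, the sketch below is an illustrative assumption, not the authors' procedure: it ranks candidate regression models by the analytic log marginal likelihood obtained when a Gaussian likelihood is paired with a normal-inverse-gamma conjugate prior. Each candidate network is replaced here by a linear-in-parameters design matrix, whereas the paper treats genuinely nonlinear feedforward networks through an asymptotic approximation. The function log_marginal_likelihood, the hyperparameters m0, V0, a0, b0 and the two toy candidates are hypothetical names chosen for this example.

    import numpy as np
    from scipy.special import gammaln

    def log_marginal_likelihood(X, y, m0=None, V0=None, a0=1.0, b0=1.0):
        # Analytic log prior-predictive (marginal) density of y under a Gaussian
        # linear model y = X beta + eps, with the conjugate normal-inverse-gamma
        # prior beta | sigma^2 ~ N(m0, sigma^2 V0), sigma^2 ~ InvGamma(a0, b0).
        n, p = X.shape
        m0 = np.zeros(p) if m0 is None else m0
        V0 = np.eye(p) if V0 is None else V0
        V0_inv = np.linalg.inv(V0)
        Vn_inv = V0_inv + X.T @ X              # posterior precision scale
        Vn = np.linalg.inv(Vn_inv)
        mn = Vn @ (V0_inv @ m0 + X.T @ y)      # posterior mean of beta
        an = a0 + 0.5 * n
        bn = b0 + 0.5 * (y @ y + m0 @ V0_inv @ m0 - mn @ Vn_inv @ mn)
        _, logdet_Vn = np.linalg.slogdet(Vn)
        _, logdet_V0 = np.linalg.slogdet(V0)
        return (-0.5 * n * np.log(2.0 * np.pi)
                + 0.5 * (logdet_Vn - logdet_V0)
                + a0 * np.log(b0) - an * np.log(bn)
                + gammaln(an) - gammaln(a0))

    # Toy comparison: each candidate "architecture" is stood in for by a
    # linear-in-parameters basis expansion (design matrix) of the same data.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, size=100)
    y = np.tanh(2.0 * x) + 0.1 * rng.standard_normal(100)
    candidates = {
        "linear": np.column_stack([np.ones_like(x), x]),
        "cubic":  np.column_stack([np.ones_like(x), x, x**2, x**3]),
    }
    scores = {name: log_marginal_likelihood(Xc, y) for name, Xc in candidates.items()}
    print(scores, "selected:", max(scores, key=scores.get))

The closed-form score is what the conjugate prior buys in this simplified setting: the parameter posterior and the predictive density are available analytically, so candidates can be evaluated and ranked without sampling or numerical integration.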