Mixture of neural networks: some experiments with the multilayer feedforward architecture

  • Authors:
  • Joaquín Torres-Sospedra, Carlos Hernández-Espinosa, Mercedes Fernández-Redondo

  • Affiliations:
  • Departamento de Ingenieria y Ciencia de los Computadores, Universitat Jaume I, Castellon, Spain

  • Venue:
  • ICONIP'06 Proceedings of the 13th International Conference on Neural Information Processing - Volume Part I
  • Year:
  • 2006


Abstract

A Modular Multi-Net System consists of several networks, each of which partially solves a problem: the original problem is decomposed into subproblems and each network focuses on solving one of them. The Mixture of Neural Networks (MixNN) consists of a set of expert networks, which solve the subproblems, and a gating network, which weights the outputs of the expert networks. The expert networks and the gating network are trained together in order to reduce the correlation among the networks and minimize the error of the system. In this paper we present the Mixture of Multilayer Feedforward (MixMF), a method based on MixNN which uses Multilayer Feedforward networks at the expert level. Finally, we perform a comparison among Simple Ensemble, MixNN and MixMF, and the results show that MixMF is the best performing method.
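The gated combination the abstract describes can be sketched in a few lines of numpy. This is a minimal, illustrative forward pass only, not the paper's implementation: each expert is a single-hidden-layer feedforward network, the gating network is a softmax over a linear map of the input, and the system output is the gate-weighted sum of the expert outputs. All names, sizes, and the random initialization below are assumptions for the sketch; the joint training of experts and gate by backpropagation through the mixture loss is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    """One expert: multilayer feedforward net with a tanh hidden layer."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

def softmax(z):
    """Row-wise softmax, shifted for numerical stability."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mixture_forward(x, experts, Wg, bg):
    """Mixture output: sum_k g_k(x) * expert_k(x), with gates g from a softmax."""
    g = softmax(x @ Wg + bg)                                        # (n, K) gating weights
    outs = np.stack([mlp_forward(x, *p) for p in experts], axis=1)  # (n, K, n_out)
    return (g[..., None] * outs).sum(axis=1), g

# Toy setup (illustrative sizes): 2-D inputs, K=3 experts, scalar output.
n_in, n_hid, n_out, K = 2, 5, 1, 3
experts = [(rng.normal(size=(n_in, n_hid)), np.zeros(n_hid),
            rng.normal(size=(n_hid, n_out)), np.zeros(n_out)) for _ in range(K)]
Wg, bg = rng.normal(size=(n_in, K)), np.zeros(K)

x = rng.normal(size=(8, n_in))
y, g = mixture_forward(x, experts, Wg, bg)
```

Because the gates come from a softmax, each system output is a convex combination of the expert outputs; during joint training the gating network learns which expert (subproblem specialist) to trust for each input region.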