A study of ensemble of hybrid networks with strong regularization

  • Authors:
  • Shimon Cohen; Nathan Intrator

  • Affiliations:
  • School of Computer Science, Tel Aviv University, Ramat Aviv, Israel; School of Computer Science, Tel Aviv University, Ramat Aviv, Israel, and Institute for Brain and Neural Systems, Brown University, Providence, RI

  • Venue:
  • MCS'03: Proceedings of the 4th International Conference on Multiple Classifier Systems
  • Year:
  • 2003


Abstract

We study various ensemble methods for hybrid neural networks. The hybrid networks are composed of radial and projection units and are trained with a deterministic algorithm that completely defines the parameters of the network for a given data set; there is thus no random selection of the initial (or final) parameters as in other training algorithms. Network independence is achieved through bootstrap and boosting methods as well as random input sub-space sampling. The fusion methods are evaluated on several classification benchmark data sets. A novel MDL-based fusion method appears to reduce the variance of the classification scheme and is sometimes superior in its overall performance.
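
The sketch below illustrates, under stated assumptions, the ensemble construction the abstract describes: bootstrap resampling combined with random input sub-space sampling to make the members independent. A generic scikit-learn MLP stands in for the paper's hybrid radial/projection network, and the fusion shown is plain probability averaging, not the MDL-based method the paper proposes; all names and parameters here are illustrative.

```python
# Hypothetical sketch: bagging + random input sub-space sampling.
# A generic MLP replaces the paper's hybrid radial/projection network,
# and averaging replaces the paper's MDL-based fusion.
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_ensemble(X, y, n_members=10, subspace_frac=0.7, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    k = max(1, int(subspace_frac * d))
    members = []
    for _ in range(n_members):
        rows = rng.integers(0, n, size=n)            # bootstrap sample of the training set
        cols = rng.choice(d, size=k, replace=False)  # random input sub-space
        clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500, random_state=seed)
        clf.fit(X[rows][:, cols], y[rows])
        members.append((clf, cols))
    return members

def predict_fused(members, X):
    # Simple fusion by averaging class probabilities; assumes every bootstrap
    # sample contains all classes so the predict_proba columns align.
    probs = np.mean([clf.predict_proba(X[:, cols]) for clf, cols in members], axis=0)
    return probs.argmax(axis=1)
```

Bootstrap resampling varies the training examples each member sees, while sub-space sampling varies the input features; together they decorrelate members even though the paper's base training algorithm is deterministic.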