Diverse Evolutionary Neural Networks Based on Information Theory

  • Authors:
  • Kyung-Joong Kim; Sung-Bae Cho

  • Affiliations:
  • Department of Computer Science, Yonsei University, Seoul, South Korea 120-749

  • Venue:
  • Neural Information Processing
  • Year:
  • 2008

Abstract

There is no consensus on how to measure the distance between two different neural network architectures. Two kinds of methods are used for this purpose: structural and behavioral distance measures. In this paper, we focus on the latter, which compares networks based on their output responses to the same input. A neural network's output can be interpreted as a probability distribution over classes given the input signals if it is normalized to sum to 1, and information-theoretic distance measures are widely used to quantify differences between probability distributions. We adopt information-theoretic distance measures in the framework of evolving diverse neural networks to improve its performance. Experimental results on UCI benchmark datasets show the promise of the approach.
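The abstract does not specify which information-theoretic measure the authors use. As a minimal sketch of the idea, the following Python snippet computes a behavioral distance between two networks as the symmetrized Kullback-Leibler divergence between their normalized output distributions, averaged over a shared set of inputs; the networks here are stand-ins represented only by their (hypothetical) softmax outputs.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between two discrete
    probability distributions given as lists summing to 1.
    A small eps guards against log(0) and division by zero."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def behavioral_distance(outputs_a, outputs_b):
    """Behavioral distance between two networks: the symmetrized KL
    divergence between their output distributions, averaged over the
    same set of inputs. Each element of outputs_a / outputs_b is the
    normalized output vector of one network for one input."""
    total = 0.0
    for p, q in zip(outputs_a, outputs_b):
        total += 0.5 * (kl_divergence(p, q) + kl_divergence(q, p))
    return total / len(outputs_a)

# Hypothetical softmax-normalized outputs of two networks on three inputs
net_a = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4]]
net_b = [[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.3, 0.4, 0.3]]
print(behavioral_distance(net_a, net_b))
```

The symmetrized form is used because plain KL divergence is asymmetric; identical behavior yields a distance of zero, so such a measure could serve directly as a diversity term in an evolutionary fitness function.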