There is no consensus on how to measure the distance between two different neural network architectures. Two families of methods are used for this purpose: structural and behavioral distance measures. In this paper, we focus on the latter, which compares networks by their output responses to the same inputs. A neural network's output can be interpreted as a probability distribution conditioned on the input signals if it is normalized to sum to 1, and information-theoretic distance measures are widely used to quantify the difference between two probability distributions. Within the framework of evolving diverse neural networks, we adopt information-theoretic distance measures to improve performance. Experimental results on UCI benchmark datasets show the promise of the approach.
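To illustrate the behavioral distance described above, the following is a minimal Python sketch, not the paper's implementation: it computes a symmetrized Kullback-Leibler divergence between the normalized (softmax) outputs of two networks evaluated on a shared set of inputs. The abstract does not specify which information-theoretic measure is used, so the symmetrized KL choice, the `eps` smoothing constant, and the example arrays `outputs_a`/`outputs_b` are all illustrative assumptions.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) per row, for rows that are discrete distributions.

    eps clipping is an assumed smoothing step to avoid log(0).
    """
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return np.sum(p * np.log(p / q), axis=-1)

def behavioral_distance(outputs_a, outputs_b):
    """Symmetrized KL distance between two networks' output
    distributions, averaged over a shared set of inputs."""
    return np.mean(kl_divergence(outputs_a, outputs_b)
                   + kl_divergence(outputs_b, outputs_a)) / 2.0

# Hypothetical softmax outputs of two 3-class networks on 4 inputs;
# each row sums to 1, as the normalization assumption requires.
outputs_a = np.array([[0.7, 0.2, 0.1],
                      [0.1, 0.8, 0.1],
                      [0.3, 0.3, 0.4],
                      [0.5, 0.4, 0.1]])
outputs_b = np.array([[0.6, 0.3, 0.1],
                      [0.2, 0.7, 0.1],
                      [0.2, 0.2, 0.6],
                      [0.4, 0.5, 0.1]])

print(behavioral_distance(outputs_a, outputs_b))
```

Symmetrizing the divergence makes the measure order-independent, which is convenient when it serves as a pairwise diversity score within an evolving population; plain KL divergence is asymmetric and would rank the pair (A, B) differently from (B, A).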