Multilayer perceptrons can be trained with several algorithms and with different cost functions relating the expected output to the achieved state. The most common such quantity is the mean square error, but information-theoretic quantities have also been applied with success. One alternative to the commonly applied backpropagation algorithm is to train multilayer perceptrons with evolutionary computation. In this contribution we evaluate the performance of multilayer perceptrons as classifiers when they are trained with genetic algorithms guided by the mutual information between the label produced by the network and the expected class. We propose a classification algorithm in which each input variable is first replaced by a function of it chosen so that the mutual information between the new function and the label is maximized. These transformed variables are then fed to a multilayer perceptron that learns the classification map, trained with genetic algorithms and guided by mutual information.
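The core training loop described above — an MLP whose weights are evolved by a genetic algorithm, with the mutual information between predicted and true labels as the fitness — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the plug-in (histogram) mutual-information estimator, the network size, the GA operators (truncation selection, uniform crossover, Gaussian mutation), and all function names are assumptions made for the example.

```python
import numpy as np

def discrete_mi(a, b):
    # Plug-in estimate of I(a; b) in nats for two discrete label arrays,
    # computed from their joint frequency table.
    _, ia = np.unique(a, return_inverse=True)
    _, ib = np.unique(b, return_inverse=True)
    joint = np.zeros((ia.max() + 1, ib.max() + 1))
    np.add.at(joint, (ia, ib), 1)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px * py)[nz])).sum())

def mlp_predict(w, X, hidden, classes):
    # One-hidden-layer MLP; w is a flat weight vector (GA individuals
    # are flat vectors, so the network is unpacked on each evaluation).
    d = X.shape[1]
    W1 = w[:d * hidden].reshape(d, hidden)
    b1 = w[d * hidden:d * hidden + hidden]
    o = d * hidden + hidden
    W2 = w[o:o + hidden * classes].reshape(hidden, classes)
    b2 = w[o + hidden * classes:]
    h = np.tanh(X @ W1 + b1)
    return np.argmax(h @ W2 + b2, axis=1)

def ga_train_mlp(X, y, hidden=4, pop=30, gens=40, seed=0):
    # Evolve MLP weight vectors; fitness is the mutual information
    # between the network's output labels and the expected classes.
    rng = np.random.default_rng(seed)
    d, c = X.shape[1], len(np.unique(y))
    n = d * hidden + hidden + hidden * c + c
    P = rng.normal(scale=0.5, size=(pop, n))
    for _ in range(gens):
        fit = np.array([discrete_mi(mlp_predict(w, X, hidden, c), y) for w in P])
        elite = P[np.argsort(fit)[::-1][:pop // 2]]  # truncation selection
        # Uniform crossover of two random elite parents + Gaussian mutation.
        pa = elite[rng.integers(len(elite), size=pop - len(elite))]
        pb = elite[rng.integers(len(elite), size=pop - len(elite))]
        mask = rng.random(pa.shape) < 0.5
        kids = np.where(mask, pa, pb) + rng.normal(scale=0.1, size=pa.shape)
        P = np.vstack([elite, kids])
    fit = np.array([discrete_mi(mlp_predict(w, X, hidden, c), y) for w in P])
    return P[np.argmax(fit)]
```

Note that mutual information is invariant to a permutation of the output labels, so a trained network may separate the classes perfectly while assigning them swapped names; a final mapping from network outputs to class labels is needed before reporting accuracy.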