Neural Computation
We exhibit a duality between two perceptrons that allows us to compare the theoretical analysis of supervised and unsupervised learning tasks. The first perceptron has one output and is asked to learn a classification of p patterns. The second (dual) perceptron has p outputs and is asked to transmit as much information as possible about a distribution of inputs. We show in particular that the maximum information that can be stored in the couplings for the supervised learning task is equal to the maximum information that can be transmitted by the dual perceptron.
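The first perceptron in the abstract is the classic single-output unit asked to store a classification of p patterns. As an illustrative sketch only (the paper's analysis is statistical, not algorithmic), the standard perceptron learning rule that realizes such a stored classification, assuming linearly separable patterns with ±1 labels, looks like this:

```python
def train_perceptron(patterns, labels, epochs=100, lr=1.0):
    """Perceptron learning rule: a single-output unit storing a
    classification of p input patterns (illustrative sketch only)."""
    n = len(patterns[0])
    w = [0.0] * n          # couplings
    b = 0.0                # threshold
    for _ in range(epochs):
        errors = 0
        for x, y in zip(patterns, labels):   # y in {-1, +1}
            s = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1 if s > 0 else -1
            if pred != y:                    # update only on mistakes
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
                errors += 1
        if errors == 0:                      # all p patterns stored
            break
    return w, b

# p = 4 linearly separable patterns in N = 2 dimensions (made-up data)
patterns = [(0.0, 1.0), (1.0, 2.0), (2.0, -1.0), (3.0, 0.0)]
labels = [1, 1, -1, -1]
w, b = train_perceptron(patterns, labels)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
         for x in patterns]
```

For separable data the rule converges to couplings that reproduce all p labels; the abstract's result concerns how much information such couplings can store as p grows, compared with the information a p-output dual perceptron can transmit.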