To determine whether a particular sensory event is a reliable predictor of reward or punishment it is necessary to know the prior probability of that event. If the variables of a sensory representation normally occur independently of each other, then it is possible to derive the prior probability of any logical function of the variables from the prior probabilities of the individual variables, without any additional knowledge; hence such a representation enormously enlarges the scope of definable events that can be searched for reliable predictors. Finding a Minimum Entropy Code is a possible method of forming such a representation, and methods for doing this are explored in this paper. The main results are (1) to show how to find such a code when the probabilities of the input states form a geometric progression, as is shown to be nearly true for keyboard characters in normal text; (2) to show how a Minimum Entropy Code can be approximated by repeatedly recoding pairs, triples, etc. of an original 7-bit code for keyboard characters; (3) to prove that in some cases enlarging the capacity of the output channel can lower the entropy.
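The objective sketched in the abstract, minimizing the sum of the entropies of the individual output bits rather than the joint entropy, can be made concrete with a small numerical experiment. The sketch below (an illustration, not the paper's own construction) takes eight input states whose probabilities form a geometric progression, scores a code assignment by the summed binary entropies of its output bits, and brute-forces all one-to-one 3-bit codeword assignments to find the cheapest one for comparison with the naive binary code:

```python
import math
from itertools import permutations

def bitwise_entropy(probs, codewords, n_bits):
    """Sum of binary entropies of the individual output bits of a code."""
    p1 = [0.0] * n_bits  # marginal probability that each bit is 1
    for p, cw in zip(probs, codewords):
        for i in range(n_bits):
            if (cw >> i) & 1:
                p1[i] += p
    def h(p):
        if p <= 0.0 or p >= 1.0:
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return sum(h(p) for p in p1)

# Eight input states whose probabilities form a geometric progression
# (ratio r = 1/2 is an arbitrary illustrative choice).
r = 0.5
w = [r ** k for k in range(8)]
probs = [x / sum(w) for x in w]

# Naive code: state k is represented by the binary expansion of k.
naive = tuple(range(8))
naive_cost = bitwise_entropy(probs, naive, 3)

# Exhaustive search over all one-to-one codeword assignments (8! = 40320).
best_cost, best_code = min(
    (bitwise_entropy(probs, code, 3), code)
    for code in permutations(range(8))
)
print(f"naive: {naive_cost:.4f} bits, best: {best_cost:.4f} bits")
```

The summed bit entropies of any invertible code are bounded below by the joint entropy of the input distribution, so the search result can be sanity-checked against that bound; the gap that remains at the optimum illustrates why, as point (3) of the abstract notes, enlarging the output channel can sometimes lower the achievable entropy further.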