Learning algorithms generally require the ability to compare several streams of information. Neural learning architectures therefore need a comparator: a unit able to compare several inputs encoding either internal or external information, for instance predictions and sensory readings. Without the possibility of comparing predicted values to actual sensory inputs, reward evaluation and supervised learning would not be possible. Comparators are usually not implemented explicitly; the necessary comparisons are commonly performed by matching the respective activities one-to-one. This implies that the characteristics of the two input streams, such as their size and encoding, must be fixed at the time the system is designed. It is, however, plausible that biological comparators emerge from self-organizing, genetically encoded principles, which allow the system to adapt to changes in the input and in the organism. We propose an unsupervised neural circuit in which the function of input comparison emerges via self-organization purely from the interaction of the system with its inputs, without external influence or supervision. The proposed neural comparator adapts in an unsupervised fashion according to the correlations present in the input streams. The system consists of a multilayer feedforward neural network that follows an anti-Hebbian rule of local output minimization for the adaptation of its synaptic weights. This local output minimization allows the circuit to autonomously acquire the capability of comparing neural activities received from different neural populations, which may differ in population size and in the neural encoding used. The comparator can compare objects never before encountered in the sensory input streams and can evaluate a measure of their similarity, even when the objects are encoded differently.
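The mechanism described in the abstract can be sketched in a few lines of code. The following is a minimal, hypothetical illustration, not the paper's actual architecture: a small two-layer feedforward network is exposed to pairs of matching input streams, and each layer's weights follow a local anti-Hebbian rule (weight change proportional to minus the product of pre- and post-synaptic activity), which drives the output toward zero for the correlations present in the input. All layer sizes, the learning rate, and the tanh transfer function are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_comparator(n_in=8, n_hidden=12, epochs=3000, eta=0.02):
    """Expose the network to matching pairs (x, x) of input streams.

    Each layer adapts with a local anti-Hebbian rule, dw = -eta * post * pre,
    so the output is minimized for correlated (here: identical) streams.
    """
    W1 = rng.normal(scale=0.5, size=(n_hidden, 2 * n_in))  # hidden layer
    w2 = rng.normal(scale=0.5, size=n_hidden)              # output neuron
    for _ in range(epochs):
        x = rng.uniform(-1.0, 1.0, n_in)
        v = np.concatenate([x, x])       # two correlated input streams
        h = np.tanh(W1 @ v)              # hidden activity
        y = np.tanh(w2 @ h)              # comparator output
        W1 -= eta * np.outer(h, v)       # anti-Hebbian: -eta * post * pre
        w2 -= eta * y * h
    return W1, w2

def compare(W1, w2, a, b):
    """|output|: small for similar input pairs, larger for dissimilar ones."""
    h = np.tanh(W1 @ np.concatenate([a, b]))
    return abs(np.tanh(w2 @ h))
```

In this toy setting, training suppresses the output for matching pairs, while novel non-matching pairs, never seen during training, still elicit a clearly larger output, which serves as a graded dissimilarity signal.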