We consider how a simple network with delayed feedback can exhibit the complex yet desirable dynamical behaviors required for memory storage and retrieval. We discuss the simplicity-capacity dilemma that arises from demanding both large storage capacity and easy implementation in additive networks. We then propose a novel approach based on signal-processing delay and show that the interaction of delay, feedback, and refractoriness in a simple inhibitory network of three neurons can generate mathematically tractable coexisting periodic patterns. Consequently, a small and simple network with delayed feedback can process a large amount of information, and time lags in biological or artificial neural networks can be useful for information processing. How the connection topology of a large network enhances its capacity for memory storage and retrieval remains an interesting open problem.
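To illustrate the kind of dynamics described here, the following is a minimal sketch (not the authors' model) of a three-neuron inhibitory network with a discrete transmission delay, integrated by forward Euler. The tanh firing-rate nonlinearity, the coupling weight `w`, the delay `tau`, and the constant initial history are all illustrative assumptions, and the refractory mechanism is omitted for brevity.

```python
import numpy as np

# Sketch of x_i'(t) = -x_i(t) - w * sum_{j != i} f(x_j(t - tau)), f = tanh.
# All parameters below are assumptions for illustration only.
def simulate(tau=2.0, w=1.5, dt=0.01, t_end=200.0, x0=(0.4, -0.2, 0.1)):
    n_steps = int(t_end / dt)
    delay_steps = int(tau / dt)
    f = np.tanh                                   # assumed smooth firing-rate function
    # constant initial history equal to the initial state
    hist = np.tile(np.asarray(x0, dtype=float), (delay_steps + 1, 1))
    traj = np.empty((n_steps, 3))
    x = hist[-1].copy()
    for k in range(n_steps):
        x_delayed = hist[0]                       # state at time t - tau
        # delayed inhibition each neuron receives from the other two
        inhibition = f(x_delayed).sum() - f(x_delayed)
        x = x + dt * (-x - w * inhibition)        # leak plus delayed inhibitory feedback
        hist = np.vstack([hist[1:], x])           # shift the delay buffer
        traj[k] = x
    return traj

if __name__ == "__main__":
    traj = simulate()
    print(traj[-5:])
```

In this kind of setup, different initial histories can settle onto different periodic patterns; the coexistence of such attractors is the sense in which a small delayed-feedback network can store and retrieve multiple memories.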