We propose that the critical function of sleep is to prevent uncontrolled neuronal feedback while allowing rapid responses and prolonged retention of short-term memories. Through learning, the brain is tuned to react optimally to environmental challenges. Optimal behavior often requires rapid responses and the prolonged retention of short-term memories. At a neuronal level, these correspond to recurrent activity in local networks. Unfortunately, when a network exhibits recurrent activity, small changes in the parameters or conditions can lead to runaway oscillations. Thus, the very changes that improve the processing performance of the network can put it at risk of runaway oscillation. To prevent this, stimulus-dependent network changes should be permitted only when there is a margin of safety around the current network parameters. We propose that the essential role of sleep is to establish this margin by exposing the network to a variety of inputs, monitoring for erratic behavior, and adjusting the parameters. When sleep is not possible, an emergency mechanism must come into play, preventing runaway behavior at the expense of processing efficiency. This is tiredness.
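The mechanism above can be illustrated with a minimal sketch. Assume a linear recurrent network x_{t+1} = W x_t, where runaway feedback corresponds to the spectral radius of W exceeding 1; the `sleep` function below is a hypothetical stand-in for the proposed margin-restoring phase, and the random weight drift is a stand-in for waking plasticity — all names and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def spectral_radius(W):
    # For a linear recurrent net x_{t+1} = W x_t, activity runs away
    # exactly when the largest eigenvalue magnitude of W exceeds 1.
    return np.max(np.abs(np.linalg.eigvals(W)))

def sleep(W, margin=0.1):
    """Hypothetical 'sleep' phase: check whether learning has eroded the
    safety margin around the stability boundary, and if so rescale the
    recurrent weights to restore it."""
    safe = 1.0 - margin
    rho = spectral_radius(W)
    if rho > safe:
        W = W * (safe / rho)  # pull parameters back inside the safe region
    return W

# Waking 'learning' (modeled here as random weight drift) gradually
# pushes the network toward the runaway regime...
rng = np.random.default_rng(0)
W = rng.normal(scale=0.05, size=(50, 50))
for _ in range(200):
    W += 0.01 * rng.normal(size=(50, 50))

# ...and the sleep phase restores the margin before runaway sets in.
W = sleep(W, margin=0.1)
assert spectral_radius(W) <= 0.9 + 1e-9
```

The "tiredness" fallback in the proposal would correspond to clamping learning (freezing W) when this consolidation step cannot run, trading processing efficiency for stability.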