Neural networks that are trained to perform specific tasks must be developed through a supervised learning procedure, which normally takes the form of direct supervision of synaptic plasticity. We explore the idea that supervision instead takes place through the modulation of neuronal excitability. Such supervision can be delivered through conventional synaptic feedback pathways rather than requiring the hypothetical actions of unknown modulatory agents. During task learning, supervised modulation of neuronal responses guides Hebbian synaptic plasticity indirectly by establishing appropriate patterns of correlated network activity. This yields robust learning of function approximation tasks, even when multiple output units representing different functions share large amounts of common input. Reward-based supervision is also studied, and a number of potential advantages of neuronal response modulation are identified.
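The mechanism described above can be illustrated with a minimal toy sketch. This is a hypothetical simplification, not the paper's actual model: it assumes linear output units driven by a fixed radial-basis input layer, treats the supervisory signal as nudging each output's response toward its target ("response modulation"), and applies a Hebbian update driven by the supervised change in activity. Under these assumptions the modulated Hebbian rule reduces to a delta-rule-like update, and two output units sharing the same inputs learn two different target functions. All names and parameter values here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: two linear output units share a common input layer,
# each learning to approximate a different function of a scalar stimulus s.
n_in, n_samples = 20, 500
s = rng.uniform(-np.pi, np.pi, n_samples)
centers = np.linspace(-np.pi, np.pi, n_in)
X = np.exp(-0.5 * (s[:, None] - centers[None, :]) ** 2)  # shared radial-basis inputs
T = np.column_stack([np.sin(s), np.cos(s)])              # two different target functions

W = np.zeros((n_in, 2))
eta, gain = 0.05, 1.0
for epoch in range(200):
    for x, t in zip(X, T):
        y = x @ W                    # unmodulated network response
        y_mod = y + gain * (t - y)   # supervisor modulates excitability toward the target
        # Hebbian plasticity driven by the supervised change in correlated activity:
        W += eta * np.outer(x, y_mod - y)

err = np.mean((X @ W - T) ** 2)      # mean-squared approximation error after learning
```

Because the update uses the difference between modulated and unmodulated responses, each output unit learns its own function despite the fully shared input, which is the property the abstract highlights.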