The error backpropagation learning algorithm (BP) is generally considered biologically implausible because it does not use locally available, activation-based variables. A version of BP that can be computed locally using bidirectional activation recirculation (Hinton and McClelland 1988) instead of backpropagated error derivatives is more biologically plausible. This paper presents a generalized version of the recirculation algorithm (GeneRec), which overcomes several limitations of the earlier algorithm by using a generic recurrent network with sigmoidal units that can learn arbitrary input/output mappings. However, the contrastive Hebbian learning algorithm (CHL, also known as DBM or mean field learning) also uses local variables to perform error-driven learning in a sigmoidal recurrent network. CHL was derived in a stochastic framework (the Boltzmann machine), but has been extended to the deterministic case in various ways, all of which rely on problematic approximations and assumptions, leading some to conclude that it is fundamentally flawed. This paper shows that CHL can instead be derived from within the BP framework via the GeneRec algorithm. CHL is a symmetry-preserving version of GeneRec that uses a simple approximation to the midpoint or second-order accurate Runge-Kutta method of numerical integration, which explains the generally faster learning speed of CHL compared to BP. Thus, all known fully general error-driven learning algorithms that use local activation-based variables in deterministic networks can be considered variations of the GeneRec algorithm (and indirectly, of the backpropagation algorithm). GeneRec therefore provides a promising framework for thinking about how the brain might perform error-driven learning. To further this goal, an explicit biological mechanism is proposed that would be capable of implementing GeneRec-style learning. This mechanism is consistent with available evidence regarding synaptic modification in neurons in the neocortex and hippocampus, and makes further predictions.
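To make the relationship between the two rules concrete, here is a minimal NumPy sketch of the basic GeneRec and CHL weight updates for one projection of a bidirectionally connected network. The function names, variable names (x/y for sending/receiving activations, minus/plus for the expectation and outcome phases), and the learning rate are illustrative assumptions, not code from the paper; only the update equations themselves follow the algorithms named above.

import numpy as np

def generec_update(x_minus, y_minus, y_plus, lrate=0.1):
    # Basic GeneRec: dW[i, j] = lrate * x_i^- * (y_j^+ - y_j^-).
    # x_minus: sending activations in the minus (expectation) phase;
    # y_minus, y_plus: receiving activations in the minus and plus phases.
    return lrate * np.outer(x_minus, y_plus - y_minus)

def chl_update(x_minus, x_plus, y_minus, y_plus, lrate=0.1):
    # CHL: dW[i, j] = lrate * (x_i^+ * y_j^+ - x_i^- * y_j^-).
    return lrate * (np.outer(x_plus, y_plus) - np.outer(x_minus, y_minus))

# CHL falls out of GeneRec by (1) using the midpoint (second-order
# Runge-Kutta) approximation, i.e. replacing the minus-phase sending
# activation with the average of the two phases, and (2) preserving
# weight symmetry by combining the updates for the two directions of
# the reciprocal connection:
rng = np.random.default_rng(0)
x_minus, x_plus = rng.random(4), rng.random(4)
y_minus, y_plus = rng.random(3), rng.random(3)
mid_fwd = np.outer((x_minus + x_plus) / 2, y_plus - y_minus)    # midpoint GeneRec for W
mid_bwd = np.outer(x_plus - x_minus, (y_minus + y_plus) / 2)    # midpoint GeneRec for the reciprocal weights, in W's orientation
assert np.allclose(0.1 * (mid_fwd + mid_bwd),
                   chl_update(x_minus, x_plus, y_minus, y_plus, 0.1))

Summing rather than averaging the two directional updates only rescales the learning rate; with that constant absorbed, the symmetric, midpoint form of GeneRec reproduces CHL exactly, which is the equivalence the abstract describes.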