Marr's theory of the neocortex as a self-organizing neural network
Neural Computation
Fixed-point attractor analysis for a class of neurodynamics
Neural Computation
Axonal processes and neural plasticity: a reply
Neural Computation
A Mathematical Analysis of a Correlation Based Model for the Orientation Map Formation
ICANN '01 Proceedings of the International Conference on Artificial Neural Networks
A Role of Constraint in Self-Organization
RANDOM '98 Proceedings of the Second International Workshop on Randomization and Approximation Techniques in Computer Science
An analysis of synaptic normalization in a general class of Hebbian models
Neural Computation
Synaptic weight normalization effects for topographic mapping formation
Neural Networks - 2004 Special issue: New developments in self-organizing systems
Nonlinear Complex-Valued Extensions of Hebbian Learning: An Essay
Neural Computation
Effective Neuronal Learning with Ineffective Hebbian Learning Rules
Neural Computation
Intrinsic Stabilization of Output Rates by Spike-Based Hebbian Learning
Neural Computation
Tilt Aftereffects in a Self-Organizing Model of the Primary Visual Cortex
Neural Computation
Factor Analysis Using Delta-Rule Wake-Sleep Learning
Neural Computation
On neurodynamics with limiter function and Linsker's developmental model
Neural Computation
Reduced representation by neural networks with restricted receptive fields
Neural Computation
Influence function analysis of PCA and BCM learning
Neural Computation
Spiking neuron model for temporal sequence recognition
Neural Computation
Flexible and multistable pattern generation by evolving constrained plastic neurocontrollers
Adaptive Behavior - Animals, Animats, Software Agents, Robots, Adaptive Systems
Cross-talk induces bifurcations in nonlinear models of synaptic plasticity
Neural Computation
Investigating STDP and LTP in a spiking neural network
SAB'06 Proceedings of the 9th international conference on From Animals to Animats: simulation of Adaptive Behavior
Hebbian learning of recurrent connections: A geometrical perspective
Neural Computation
Computational Intelligence and Neuroscience
Models of unsupervised, correlation-based (Hebbian) synaptic plasticity are typically unstable: either all synapses grow until each reaches the maximum allowed strength, or all synapses decay to zero strength. A common method of avoiding these outcomes is to use a constraint that conserves or limits the total synaptic strength over a cell. We study the dynamic effects of such constraints. Two methods of enforcing a constraint are distinguished: multiplicative and subtractive. For otherwise linear learning rules, multiplicative enforcement of a constraint results in dynamics that converge to the principal eigenvector of the operator determining unconstrained synaptic development. Subtractive enforcement, in contrast, typically leads to a final state in which almost all synaptic strengths reach either the maximum or minimum allowed value. This final state is often dominated by weight configurations other than the principal eigenvector of the unconstrained operator. Multiplicative enforcement yields a "graded" receptive field in which most mutually correlated inputs are represented, whereas subtractive enforcement yields a receptive field that is "sharpened" to a subset of maximally correlated inputs. If two equivalent input populations (e.g., two eyes) innervate a common target, multiplicative enforcement prevents their segregation (ocular dominance segregation) when the two populations are weakly correlated, whereas subtractive enforcement allows segregation under these circumstances. These results may be used to understand constraints both over output cells and over input cells. A variety of rules that can implement constrained dynamics are discussed.
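The contrast described in the abstract can be illustrated numerically. The sketch below is a minimal, hypothetical implementation (the correlation matrix, learning rate, and bounds are assumptions, not taken from the paper): a linear Hebbian rule dw = lr * C @ w is run twice, once with a multiplicative constraint (weights rescaled to conserve total strength) and once with a subtractive constraint (mean growth subtracted from every synapse), with weights clipped to [0, w_max].

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical correlation matrix for 5 mutually correlated inputs.
n = 5
C = 0.3 * np.ones((n, n)) + 0.7 * np.eye(n)
w_max = 1.0

def hebbian_multiplicative(C, steps=2000, lr=0.01):
    """Linear Hebbian growth with multiplicative normalization:
    after each update, weights are rescaled so the total strength
    stays at its initial value."""
    w = rng.uniform(0.1, 0.9, C.shape[0])
    total = w.sum()
    for _ in range(steps):
        w += lr * C @ w
        w *= total / w.sum()          # multiplicative constraint
        w = np.clip(w, 0.0, w_max)
    return w

def hebbian_subtractive(C, steps=2000, lr=0.01):
    """Same Hebbian growth, but the constraint subtracts the mean
    growth from every synapse, conserving total strength additively."""
    w = rng.uniform(0.1, 0.9, C.shape[0])
    for _ in range(steps):
        dw = lr * C @ w
        w += dw - dw.mean()           # subtractive constraint
        w = np.clip(w, 0.0, w_max)
    return w

w_mult = hebbian_multiplicative(C)
w_sub = hebbian_subtractive(C)

# Multiplicative: a graded weight vector aligned with the principal
# (uniform) eigenvector of C. Subtractive: weights driven to the
# bounds, a "sharpened" all-or-none receptive field.
print("multiplicative:", np.round(w_mult, 3))
print("subtractive:  ", np.round(w_sub, 3))
```

With this C, the principal eigenvector is uniform, so the multiplicative run converges to nearly equal weights, while the subtractive run amplifies deviations from the mean until most weights saturate at 0 or w_max, mirroring the graded-versus-sharpened distinction the abstract draws.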