A few distinct cortical operations have been postulated over the past few years, suggested by experimental data on nonlinear neural responses across different areas of the cortex. Among these, the energy model proposes the summation of squared responses of quadrature pairs to explain the phase invariance of complex V1 cells. The divisive normalization model assumes a gain-controlling, divisive inhibition to explain sigmoid-like response profiles within a pool of neurons. A Gaussian-like operation hypothesizes a bell-shaped response tuned to a specific, optimal pattern of activation of the presynaptic inputs. A max-like operation assumes the selection and transmission of the most active response among a set of neural inputs. We propose that these distinct neural operations can be computed by the same canonical circuitry, involving divisive normalization and polynomial nonlinearities, for different parameter values within the circuit. Hence, this canonical circuit may provide a unifying framework for several circuit models, such as the divisive normalization and energy models. As a case in point, we consider a feedforward hierarchical model of the ventral pathway of the primate visual cortex, built on a combination of the Gaussian-like and max-like operations. We show that when the two operations are approximated by the circuit proposed here, the model generates selective and invariant neural responses and performs object recognition, in good agreement with neurophysiological data.
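The idea that one circuit can realize both operations can be sketched numerically. The snippet below is a minimal illustration, not the authors' implementation: it assumes the circuit takes the common form of a weighted sum of polynomially transformed inputs divided by a normalization pool, y = (Σ_j w_j x_j^p) / (k + (Σ_j x_j^q)^r), and shows how one parameter setting yields Gaussian-like tuning (a normalized dot product peaking at a preferred input pattern) while another yields a max-like, softmax-style selection of the strongest input. The function name and the specific exponent values are illustrative choices.

```python
import numpy as np

def canonical_circuit(x, w, p=1.0, q=2.0, r=0.5, k=1e-6):
    """Illustrative canonical circuit: a weighted sum of polynomially
    transformed inputs, divisively normalized by the pooled activity
    of the same inputs.

        y = sum_j w_j * x_j**p / (k + (sum_j x_j**q)**r)
    """
    x = np.asarray(x, dtype=float)
    return np.dot(w, x**p) / (k + np.sum(x**q)**r)

# Gaussian-like tuning: p=1, q=2, r=1/2 gives a normalized dot product,
# which peaks when the input pattern matches the weight vector's direction.
w = np.array([0.6, 0.8])
matched    = canonical_circuit([0.6, 0.8], w, p=1, q=2, r=0.5)   # preferred input
mismatched = canonical_circuit([0.8, 0.6], w, p=1, q=2, r=0.5)   # rotated input
assert matched > mismatched

# Max-like selection: large exponents with p = q + 1 and uniform weights
# approximate a softmax that transmits the most active input.
x = np.array([0.2, 0.9, 0.5])
y = canonical_circuit(x, np.ones_like(x), p=11, q=10, r=1.0)
# y approaches max(x) = 0.9 as the exponents grow
```

In this sketch, only the exponents (p, q, r) and the weights change between the two regimes; the circuit structure is identical, which is the unification the abstract argues for.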