In 1969 Barlow introduced the phrase “economy of impulses” to express the tendency for successive neural systems to use lower and lower levels of cell firing to produce equivalent encodings. From this viewpoint, the ultimate economy of impulses is a neural code of minimal redundancy. The hypothesis motivating our research is that energy expenditures, e.g., the metabolic cost of recovering from an action potential relative to the cost of inactivity, should also be factored into the economy of impulses. In fact, coding schemes with the largest representational capacity are not, in general, optimal when energy expenditures are taken into account. We show that for both binary and analog neurons, increased energy expenditure per neuron implies a decrease in average firing rate if energy-efficient information transmission is to be maintained.
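The qualitative claim above can be illustrated with a minimal sketch for a binary neuron. Assume an illustrative cost model (not necessarily the paper's exact one) in which each coding interval costs a baseline `rest_cost` plus `spike_cost` times the firing probability `p`, and define efficiency as entropy per unit energy; the function names and the grid search are hypothetical conveniences:

```python
import math

def entropy_bits(p):
    """Shannon entropy (bits) of a binary neuron that spikes with probability p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def optimal_firing_prob(spike_cost, rest_cost=1.0, grid=10000):
    """Grid-search the firing probability maximizing bits per unit energy.

    Energy per symbol is modeled as rest_cost + p * spike_cost, where
    spike_cost is the extra metabolic cost of an action potential
    relative to inactivity (an assumed, illustrative cost model).
    """
    best_p, best_eff = 0.0, 0.0
    for i in range(1, grid):
        p = i / grid
        eff = entropy_bits(p) / (rest_cost + p * spike_cost)
        if eff > best_eff:
            best_p, best_eff = p, eff
    return best_p

# As the relative cost of a spike grows, the efficient firing rate falls,
# and it stays below the capacity-maximizing rate p = 0.5.
for cost in (2.0, 5.0, 10.0, 20.0):
    print(cost, round(optimal_firing_prob(cost), 3))
```

Under this cost model the capacity-maximizing code (p = 0.5 for a binary neuron) is never the most energy-efficient one: because entropy is symmetric in p while energy grows with p, the optimum lies strictly below 0.5 and moves lower as spikes become relatively more expensive.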