Analysis of correlated spike trains is an active topic in computational neuroscience. A fully general probability model for spikes involves too many parameters to be useful for analyzing real data, so a simple yet expressive generative model of correlated spikes is needed. We developed a class of conditional mixture models that subsumes a number of existing models, and we analyzed its capabilities and limitations. We then applied the model to the dynamics of neuron pools. When Hebbian cell assemblies coexist in a pool of neurons, the state of the pool is specified by these assemblies, and the probability distribution of spikes is a mixture of the distributions of the component assemblies. The activation probabilities of the Hebbian assemblies change dynamically, and we used the model as the basis for a competitive model governing the states of the assemblies.
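To make the mixture idea concrete, the following is a minimal sketch (not the paper's actual model) of a generative process in which a latent Hebbian assembly is chosen in each time bin and neurons then fire conditionally independently given that choice; all parameter names and values here are illustrative assumptions. Mixing over assemblies induces positive spike correlations within an assembly and negative correlations across assemblies, even though spikes are independent given the latent state.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N neurons and two Hebbian assemblies, each defined by
# the set of neurons it drives and a common firing probability when active.
N = 8
assemblies = [
    {"members": {0, 1, 2, 3}, "p_high": 0.8},  # assembly A
    {"members": {4, 5, 6, 7}, "p_high": 0.8},  # assembly B
]
p_low = 0.05          # baseline firing probability outside the active assembly
weights = [0.5, 0.5]  # mixture weights: probability that assembly k is active

def sample_spikes(T):
    """Draw T time bins from the mixture: pick a latent assembly per bin,
    then fire each neuron independently given that choice."""
    spikes = np.zeros((T, N), dtype=int)
    for t in range(T):
        k = rng.choice(len(assemblies), p=weights)  # latent assembly label
        a = assemblies[k]
        for i in range(N):
            p = a["p_high"] if i in a["members"] else p_low
            spikes[t, i] = rng.random() < p
    return spikes

X = sample_spikes(5000)
# Marginalizing over the latent label correlates neurons within an assembly
# (they co-activate when their assembly is chosen) and anticorrelates
# neurons belonging to different assemblies.
C = np.corrcoef(X.T)
print(C[0, 1], C[0, 4])  # within-assembly vs. across-assembly correlation
```

Making the mixture weights time-dependent, as in the competitive dynamics described above, would let one assembly dominate the pool at a time while others are suppressed.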