Elements of information theory
Dynamic Programming and Optimal Control
Journal of Cognitive Neuroscience
A Method for Selecting the Bin Size of a Time Histogram
Neural Computation
Estimating entropy on m bins given fewer than m samples
IEEE Transactions on Information Theory
Bayesian bin distribution inference and mutual information
IEEE Transactions on Information Theory
Emulating human observers with Bayesian binning: Segmentation of action streams
ACM Transactions on Applied Perception (TAP)
Segmentation of action streams: human observers vs. Bayesian binning
KI'11: Proceedings of the 34th Annual German Conference on Advances in Artificial Intelligence
Optimizing time histograms for non-Poissonian spike trains
Neural Computation
Random bin for analyzing neuron spike trains
Computational Intelligence and Neuroscience - Special issue on Computational Intelligence in Biomedical Science and Engineering
An overview of Bayesian methods for neural spike train analysis
Computational Intelligence and Neuroscience - Special issue on Modeling and Analysis of Neural Spike Trains
The peristimulus time histogram (PSTH) and its more continuous cousin, the spike density function (SDF), are staples in the analytic toolkit of neurophysiologists. The former is usually obtained by binning spike trains, whereas the standard method for the latter is smoothing with a Gaussian kernel. Selection of a bin width or a kernel size is often done in a relatively arbitrary fashion, even though there have been recent attempts to remedy this situation (DiMatteo et al., Biometrika 88(4):1055-1071, 2001; Shimazaki and Shinomoto 2007a, Neural Comput 19(6):1503-1527, 2007b, c; Cunningham et al. 2008). We develop an exact Bayesian, generative model approach to estimating PSTHs. Advantages of our scheme include automatic complexity control and error bars on its predictions. We show how to perform feature extraction on spike trains in a principled way, exemplified through latency and firing rate posterior distribution evaluations on repeated and single trial data. We also demonstrate, using both simulated and real neuronal data, that our approach provides more accurate estimates of the PSTH and the latency than current competing methods. We employ the posterior distributions for an information-theoretic analysis of the neural code comprised of the latency and firing rate of neurons in high-level visual area STSa. A software implementation of our method is available at the machine learning open source software repository (www.mloss.org, project `binsdfc').
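For readers unfamiliar with the two baseline estimators the abstract contrasts against, the following is a minimal sketch of a binned PSTH and a Gaussian-kernel SDF. It assumes NumPy; the function names and the fixed bin width/kernel size are illustrative only — the paper's point is precisely that these hyperparameters are usually chosen arbitrarily, whereas its Bayesian method infers the complexity automatically.

```python
import numpy as np

def psth(spike_trains, t_start, t_stop, bin_width):
    """Binned PSTH: trial-averaged spike counts converted to a rate (spikes/s).

    spike_trains: list of 1-D arrays of spike times (seconds), one per trial.
    """
    edges = np.arange(t_start, t_stop + bin_width, bin_width)
    counts = np.zeros(len(edges) - 1)
    for train in spike_trains:
        c, _ = np.histogram(train, bins=edges)
        counts += c
    rate = counts / (len(spike_trains) * bin_width)
    return edges[:-1], rate

def sdf(spike_trains, t_start, t_stop, sigma, dt=0.001):
    """Spike density function: trial-averaged sum of Gaussian kernels
    (width sigma, in seconds) centred on each spike time."""
    t = np.arange(t_start, t_stop, dt)
    density = np.zeros_like(t)
    for train in spike_trains:
        for s in train:
            density += np.exp(-0.5 * ((t - s) / sigma) ** 2)
    # Normalise each kernel to unit area and average over trials.
    density /= len(spike_trains) * sigma * np.sqrt(2.0 * np.pi)
    return t, density
```

Both estimators integrate to the mean spike count per trial over the analysis window; the free parameter (`bin_width` or `sigma`) trades off temporal resolution against variance, which is the model-selection problem the paper addresses.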