Normalized Lempel-Ziv complexity, which measures the rate at which new patterns appear along a digital sequence, is closely related to important source properties such as entropy and compression ratio but, unlike these, is a property of individual sequences. In this article, we propose to exploit this concept to estimate (or, at least, to bound from below) the entropy of neural discharges (spike trains). The main advantages of this method are the fast convergence of the estimator (supported by numerical simulation) and the fact that no knowledge of the probability law of the process generating the signal is required. Furthermore, we present numerical and experimental comparisons of the new method against the standard method based on word frequencies, providing evidence that this approach is a viable alternative entropy estimator for binned spike trains.
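To make the idea concrete, the following is a minimal sketch (not the authors' code) of how a binned spike train can be reduced to a binary word and its normalized Lempel-Ziv complexity computed. The phrase-counting routine implements the classic LZ76 parsing via the well-known Kaspar-Schuster scanning procedure; the function names and the binning helper are illustrative assumptions, not part of the original article.

```python
import math

def lz76_complexity(seq):
    """Count the phrases in the Lempel-Ziv (1976) parsing of `seq`
    (a string of symbols), using the Kaspar-Schuster scan."""
    n = len(seq)
    if n < 2:
        return n
    i, k, l = 0, 1, 1          # i: search start, k: match length, l: phrase start
    c, k_max = 1, 1            # c: phrase count, k_max: longest match so far
    while True:
        if seq[i + k - 1] == seq[l + k - 1]:
            k += 1             # extend the current match
            if l + k > n:
                c += 1         # final (possibly incomplete) phrase
                break
        else:
            k_max = max(k, k_max)
            i += 1             # try the next earlier starting position
            if i == l:         # no earlier copy found: close the phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

def entropy_rate_estimate(seq):
    """Normalized LZ complexity, c(n) * log2(n) / n. For long sequences
    from a stationary ergodic source this approaches the entropy rate
    (bits per symbol), motivating its use as an entropy estimate."""
    n = len(seq)
    return lz76_complexity(seq) * math.log2(n) / n

def bin_spike_train(spike_times, t_start, t_end, bin_width):
    """Hypothetical binning helper: '1' if a time bin contains at
    least one spike, '0' otherwise."""
    n_bins = int((t_end - t_start) / bin_width)
    word = ['0'] * n_bins
    for t in spike_times:
        idx = int((t - t_start) / bin_width)
        if 0 <= idx < n_bins:
            word[idx] = '1'
    return ''.join(word)
```

As a sanity check of the estimator's behavior, a constant or periodic word yields very few phrases and hence a near-zero estimate, while an i.i.d. fair-coin sequence of length 10,000 gives a value close to 1 bit per symbol, consistent with the convergence property described above.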