In this work, we examine the existence and the computation of the Rényi divergence rate, lim_{n→∞} (1/n) D_α(p^(n) ‖ q^(n)), between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions, described by the probability distributions p^(n) and q^(n), respectively. This generalizes a result of Nemetz (1974), who assumed that the initial probabilities under p^(n) and q^(n) are strictly positive. The main tools used to obtain the Rényi divergence rate are the theory of nonnegative matrices and Perron-Frobenius theory. We also provide numerical examples and investigate the limits of the Rényi divergence rate as α → 1 and as α ↓ 0. Similarly, we provide a formula for the Rényi entropy rate, lim_{n→∞} (1/n) H_α(p^(n)), of Markov sources, and we examine its limits as α → 1 and as α ↓ 0. Finally, we briefly outline an application to source coding.
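To make the Perron-Frobenius connection concrete, the following is a minimal numerical sketch, not the paper's construction. It assumes first-order chains with strictly positive transition matrices (so the nonnegative matrix involved is irreducible and the spectral-radius formula applies cleanly): for α ≠ 1, the Rényi divergence rate is log λ(R) / (α − 1), where R has entries P_ij^α Q_ij^(1−α) and λ(·) denotes the Perron-Frobenius (largest-magnitude) eigenvalue; analogously, the Rényi entropy rate is log λ(P^∘α) / (1 − α), with P^∘α the elementwise α-th power. The function names and the example matrices are illustrative choices, not from the paper.

```python
import numpy as np

def renyi_divergence_rate(P, Q, alpha):
    """Rényi divergence rate between two Markov chains (alpha != 1).

    Assumes P and Q are row-stochastic with strictly positive entries.
    R_ij = P_ij^alpha * Q_ij^(1 - alpha); the rate is log(lambda(R)) / (alpha - 1),
    where lambda(R) is the Perron-Frobenius eigenvalue of the nonnegative matrix R.
    """
    R = P**alpha * Q**(1.0 - alpha)               # elementwise products/powers
    lam = np.max(np.abs(np.linalg.eigvals(R)))    # spectral radius
    return np.log(lam) / (alpha - 1.0)

def renyi_entropy_rate(P, alpha):
    """Rényi entropy rate of a Markov chain (alpha != 1).

    Rate is log(lambda(P^{elementwise alpha})) / (1 - alpha).
    """
    lam = np.max(np.abs(np.linalg.eigvals(P**alpha)))
    return np.log(lam) / (1.0 - alpha)

# Illustrative two-state chains (strictly positive transition probabilities)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
Q = np.array([[0.5, 0.5],
              [0.5, 0.5]])
```

As sanity checks under these assumptions: identical chains give a zero divergence rate (R reduces to the stochastic matrix P, whose Perron-Frobenius eigenvalue is 1), and the uniform i.i.d. chain has Rényi entropy rate log 2 for every α ≠ 1.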