Universal Estimation of Information Measures for Analog Sources
Foundations and Trends in Communications and Information Theory
Expressions for Shannon-type divergence-power inequalities (DPIs), analogues of the entropy-power inequality (EPI), are given in two cases of stationary random processes: time-discrete and time-continuous. The new expressions connect the divergence rate of the sum of independent processes, the individual divergence rate of each process, and their power spectral densities. All divergences are taken between a process and a Gaussian process with the same second-order statistics, and are assumed to be finite. A new proof of the Shannon EPI, based on the relationship between divergence and causal minimum mean-square error (CMMSE) in Gaussian channels at high signal-to-noise ratio, is also given.
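For reference, the classical Shannon EPI that the DPIs parallel can be stated as follows (a standard formulation, not quoted from the paper itself): for independent random vectors $X$ and $Y$ in $\mathbb{R}^n$ with finite differential entropies,

```latex
% Entropy power: N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}.
% Shannon EPI for independent X, Y with densities on R^n:
N(X+Y) \;\ge\; N(X) + N(Y),
\qquad\text{equivalently}\qquad
e^{2h(X+Y)/n} \;\ge\; e^{2h(X)/n} + e^{2h(Y)/n},
```

with equality if and only if $X$ and $Y$ are Gaussian with proportional covariance matrices.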