Approximate information content of a signal with bispectral constraints
IEEE Transactions on Signal Processing
The information content of signals subject to constraints, such as the total power at a set of frequencies, is well understood. This paper presents a general approach to computing the entropy of a constrained signal. Although general solutions are derived, exact solutions may be difficult to obtain, so three approximation techniques are discussed. The first is based on the cumulant expansion, whose lowest-order results take a simple form. The second uses an asymptotic expression for the signal entropy, applicable when the constraints leave the signal only a small number of available states. The third is direct numerical calculation. These results are examined in light of the problem of computing the entropy of a signal constrained by its moments. The lowest-order terms in the cumulant expansion of the entropy are then used to construct a generalized correlation filter for detecting signals subject to different constraints; the critical role that entropy plays in determining the sensitivity and specificity of such a detector is discussed. A further application is the computation of the entropy of a signal subject to polyspectral constraints, where it is shown that the entropy change is governed by the higher-order coherence. Applications of these findings to the detection of signals under polyspectral constraints are discussed, along with the limitations of the second-order cumulant expansion, and more general expressions for the entropy of such systems are derived.
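The abstract does not reproduce the lowest-order cumulant terms it refers to, but for a standardized signal the standard Edgeworth-type expansion gives the well-known approximation J ≈ κ₃²/12 + κ₄²/48 for the negentropy, so that h ≈ h_Gauss − J. The sketch below is a minimal illustration of that standard approximation, not the paper's own derivation; the function name `approx_entropy` and the sample-cumulant estimators are assumptions for illustration, and the approximation is only reliable for weakly non-Gaussian signals.

```python
import numpy as np

def approx_entropy(x):
    """Lowest-order cumulant-expansion estimate of differential entropy (nats).

    Standardizes x, estimates the third and fourth cumulants, and subtracts
    the Edgeworth negentropy approximation from the unit-variance Gaussian
    entropy. Valid only for weakly non-Gaussian signals.
    """
    x = np.asarray(x, dtype=float)
    x = (x - x.mean()) / x.std()          # zero mean, unit variance
    k3 = np.mean(x**3)                    # third cumulant (skewness)
    k4 = np.mean(x**4) - 3.0              # fourth cumulant (excess kurtosis)
    h_gauss = 0.5 * np.log(2.0 * np.pi * np.e)  # entropy of N(0, 1)
    negentropy = k3**2 / 12.0 + k4**2 / 48.0    # lowest-order terms
    return h_gauss - negentropy
```

For Gaussian samples both cumulants vanish in expectation, so the estimate approaches the Gaussian entropy (about 1.419 nats); any non-Gaussian constraint on the moments lowers the estimate, which is the mechanism the generalized correlation filter exploits.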