The entropic uncertainty principle is a cornerstone of information theory and plays an important role in signal processing. Based on the relation between a function and its fractional Fourier transform (FRFT), two novel entropic uncertainty principles in FRFT domains are derived: a Shannon entropy uncertainty principle and a Rényi entropy uncertainty principle, both with bounds that depend on the FRFT parameters. In addition, an extended Rényi entropy uncertainty principle for multiple functions and a discrete entropy uncertainty principle are established. These inequalities reveal how the bounds depend on the transform parameters and the sampling periods.
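The flavor of such a discrete entropy uncertainty principle can be illustrated numerically. The sketch below is not the paper's FRFT result; it checks the classical Maassen–Uffink bound for the unitary DFT (the FRFT with angle π/2), namely H_time + H_freq ≥ log N in nats, on a randomly generated unit-energy signal. The signal length and random seed are arbitrary choices for illustration.

```python
import numpy as np

def shannon_entropy(p):
    # Shannon entropy in nats; zero-probability bins contribute nothing.
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Arbitrary illustrative signal: a random complex vector of length N,
# normalized to unit energy so |x[n]|^2 is a probability distribution.
rng = np.random.default_rng(0)
N = 64
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)
x /= np.linalg.norm(x)

# Unitary DFT (the FRFT reduces to this at transform angle pi/2).
X = np.fft.fft(x) / np.sqrt(N)

H_time = shannon_entropy(np.abs(x) ** 2)
H_freq = shannon_entropy(np.abs(X) ** 2)

# Maassen-Uffink bound for the unitary DFT: H_time + H_freq >= log N.
print(H_time + H_freq >= np.log(N) - 1e-9)
```

For a discrete FRFT with a different angle, the mutual coherence between the two bases changes, and with it the additive constant on the right-hand side; the FRFT-domain bounds discussed above make that parameter dependence explicit.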