Edgeworth Approximation of Multivariate Differential Entropy
Neural Computation
Some Equivalences between Kernel Methods and Information Theoretic Methods
Journal of VLSI Signal Processing Systems
Blind source separation applied to spectral unmixing: comparing different measures of nongaussianity
KES'07/WIRN'07: Proceedings of the 11th International Conference on Knowledge-Based Intelligent Information and Engineering Systems (KES 2007) and the XVII Italian Workshop on Neural Networks (WIRN 2007), Part III
A general criterion for analog Tx-Rx beamforming under OFDM transmissions
IEEE Transactions on Signal Processing
6DOF entropy minimization SLAM for stereo-based wearable devices
Computer Vision and Image Understanding
Minimum entropy control for stochastic systems based on the wavelet neural networks
ISNN'06: Proceedings of the Third International Conference on Advances in Neural Networks - Volume Part II
Blind deconvolution of linear channels is a fundamental signal processing problem with immediate extensions to multichannel applications. In this paper, we investigate the suitability of a class of Parzen-window-based entropy estimates, namely Rényi's entropy, as a criterion for blind deconvolution of linear channels. Monte Carlo simulations compare maximum- and minimum-entropy approaches and examine how the entropy order, equalizer length, sample size, and measurement noise affect performance. The results indicate that this nonparametric entropy-estimation approach outperforms the standard Bell-Sejnowski and normalized-kurtosis algorithms in blind deconvolution. In addition, the solutions obtained with Shannon's entropy were optimal for neither super- nor sub-Gaussian source densities.
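To make the criterion concrete: a Parzen-window estimate of Rényi's quadratic entropy (order 2) has a closed form as the negative log of the mean pairwise Gaussian-kernel value, often called the information potential. The sketch below is a minimal illustration under assumed choices (a fixed Gaussian kernel width `sigma` and one-dimensional samples); it is not the paper's exact implementation, and the function name is hypothetical.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=0.5):
    """Parzen-window estimate of Renyi's quadratic entropy:
        H2(X) = -log( (1/N^2) * sum_i sum_j G(x_i - x_j; 2*sigma^2) ),
    where G is a zero-mean Gaussian kernel. The variance 2*sigma^2 arises
    because convolving two Gaussian kernels of width sigma adds their
    variances. Illustrative sketch only (assumed kernel width, 1-D data).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    diffs = x[:, None] - x[None, :]            # N x N pairwise differences
    var = 2.0 * sigma ** 2
    g = np.exp(-diffs ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    information_potential = g.sum() / n ** 2   # mean pairwise kernel value
    return -np.log(information_potential)

# Usage: a more spread-out sample yields a larger entropy estimate,
# which is why min/max entropy can serve as a deconvolution criterion.
rng = np.random.default_rng(0)
x = rng.standard_normal(200)
h_narrow = renyi_quadratic_entropy(x)
h_wide = renyi_quadratic_entropy(3.0 * x)
```

In a blind deconvolution setting, this estimate would be evaluated on the equalizer output and minimized or maximized with respect to the equalizer taps, depending on whether the source is super- or sub-Gaussian.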