Matrix analysis
A new entropy power inequality
IEEE Transactions on Information Theory
Mathematical aspects of the relative gain array (A ∘ A^{-T})
SIAM Journal on Algebraic and Discrete Methods
Topics in matrix analysis
Elements of information theory
Fundamentals of statistical signal processing: estimation theory
IEEE Transactions on Signal Processing
A short proof of the “concavity of entropy power”
IEEE Transactions on Information Theory
Second-order asymptotics of mutual information
IEEE Transactions on Information Theory
Mutual information and minimum mean-square error in Gaussian channels
IEEE Transactions on Information Theory
On mutual information, likelihood ratios, and estimation error for the additive Gaussian channel
IEEE Transactions on Information Theory
Gradient of mutual information in linear vector Gaussian channels
IEEE Transactions on Information Theory
Optimum power allocation for parallel Gaussian channels with arbitrary input distributions
IEEE Transactions on Information Theory
Representation of mutual information via input estimates
IEEE Transactions on Information Theory
A vector generalization of Costa's entropy-power inequality and applications
Proceedings of the 2009 IEEE International Symposium on Information Theory (ISIT 2009), Volume 1
Linear precoding for mutual information maximization in MIMO systems
Proceedings of the 6th International Symposium on Wireless Communication Systems (ISWCS 2009)
A vector generalization of Costa's entropy-power inequality with applications
IEEE Transactions on Information Theory
A unified treatment of optimum pilot overhead in multipath fading channels
IEEE Transactions on Communications
The geometry of fusion-inspired channel design
Signal Processing
Within the framework of linear vector Gaussian channels with arbitrary signaling, this paper computes the Jacobians of the minimum mean square error (MMSE) and Fisher information matrices with respect to arbitrary parameters of the system. Capitalizing on prior research in which the MMSE and Fisher information matrices were linked to information-theoretic quantities through differentiation, the Hessians of the mutual information and the differential entropy are derived. These expressions are then used to assess the concavity properties of mutual information and entropy under different channel conditions, and to derive a multivariate version of an entropy power inequality due to Costa.
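As background, the two differentiation results the abstract builds on can be sketched in their standard scalar forms; the paper itself establishes matrix-valued counterparts. The I-MMSE relation of Guo, Shamai, and Verdú states that, with mutual information measured in nats,

\[
\frac{\mathrm{d}}{\mathrm{d}\,\mathsf{snr}} \, I\bigl(X; \sqrt{\mathsf{snr}}\,X + N\bigr) = \frac{1}{2}\,\mathrm{mmse}(\mathsf{snr}), \qquad N \sim \mathcal{N}(0,1),
\]

where \(\mathrm{mmse}(\mathsf{snr})\) is the minimum mean square error in estimating \(X\) from the channel output. Costa's entropy power inequality, whose multivariate version is derived in the paper, asserts that the entropy power of \(X\) corrupted by added Gaussian noise,

\[
N\bigl(X + \sqrt{t}\,Z\bigr), \qquad N(Y) \triangleq \frac{1}{2\pi e}\, e^{2h(Y)}, \quad Z \sim \mathcal{N}(0,1) \text{ independent of } X,
\]

is a concave function of \(t \ge 0\).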