Three hybrid digital-analog (HDA) systems, denoted HDA-I, HDA*, and HDA-II, for the coding of a memoryless discrete-time Gaussian source over a discrete-time additive memoryless Gaussian channel under bandwidth compression are studied. The systems employ simple linear coding in their analog component and superimpose their analog and digital signals before channel transmission. Information-theoretic upper bounds on the asymptotically optimal mean squared error distortion of the systems are obtained under both matched and mismatched channel conditions. Allocation schemes for distributing the channel input power between the analog and digital signals are also examined. It is shown that systems HDA* and HDA-II can asymptotically achieve the optimal Shannon-limit performance under matched channel conditions. Low-complexity and low-delay versions of systems HDA-I and HDA-II are next designed and implemented without the use of error-correcting codes. The parameters of these HDA systems, which employ vector quantization in conjunction with binary phase-shift keying modulation in their digital part, are optimized via an iterative algorithm similar to the design algorithm for channel-optimized vector quantizers. Both systems have low complexity and low delay, and guarantee graceful performance improvements at high channel signal-to-noise ratios (CSNRs). For memoryless Gaussian sources, the designed HDA-II system is shown to be superior to the designed HDA-I system. When applied to a Gauss-Markov source under Karhunen-Loève processing, however, the HDA-I system is shown to provide considerably better performance.
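The Shannon-limit (OPTA) benchmark against which these HDA systems are measured follows from equating the Gaussian rate-distortion function with the AWGN channel capacity. A minimal sketch of that standard computation (the function name and parameter choices are illustrative, not from the paper):

```python
import math

def shannon_limit_distortion(csnr: float, bw_ratio: float, src_var: float = 1.0) -> float:
    """OPTA mean squared error distortion for a memoryless Gaussian source
    over an AWGN channel used `bw_ratio` times per source sample.

    From R(D) = 0.5*log2(src_var/D) and C = 0.5*log2(1 + csnr), setting
    R(D) = bw_ratio * C gives D = src_var * (1 + csnr)**(-bw_ratio).
    """
    return src_var * (1.0 + csnr) ** (-bw_ratio)

# Bandwidth compression example: two source samples per channel use
# (bw_ratio = 1/2) at a 10 dB channel SNR (linear value 10.0).
csnr = 10.0
d = shannon_limit_distortion(csnr, bw_ratio=0.5)
sdr_db = 10.0 * math.log10(1.0 / d)  # signal-to-distortion ratio in dB
```

Under matched channel conditions, systems HDA* and HDA-II asymptotically attain this distortion; plotting the designed systems' SDR against this curve over a range of CSNRs shows how gracefully each degrades or improves.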