Channel-optimized quantizer index assignment and maximum a posteriori (MAP) decoding have been extensively studied for error-resilient communications. An interesting and largely untreated problem is how to optimize the index assignment with respect to joint source-channel MAP decoding. In this paper we formulate the above problem as one of quadratic assignment and discuss its solutions, from the fully general case down to special cases. For highly correlated Gaussian Markov sources and Hamming distortion, we can construct the optimal index assignment analytically. For general cases, a simulated annealing algorithm is adopted to search for the optimal index assignment. Experimental results are presented to demonstrate the performance improvement of index assignments optimized for MAP decoding over those designed for hard-decision decoding (e.g., the Gray code). The reductions in symbol error rate and mean squared error can be as large as 40% and 50%, respectively, for highly correlated Gaussian Markov sources.
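To make the search procedure concrete, the sketch below shows how an index assignment can be optimized by simulated annealing under a generic quadratic-assignment cost. The `flow` and `dist` matrices are illustrative stand-ins (not the paper's exact cost terms): `flow[i][j]` plays the role of a source-dependent weight between quantizer cells, and `dist[a][b]` a channel-dependent penalty between channel indices (e.g., growing with Hamming distance). All function and parameter names are hypothetical.

```python
import math
import random

def qap_cost(perm, flow, dist):
    """Quadratic-assignment cost: sum of flow[i][j] * dist[perm[i]][perm[j]]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def anneal_index_assignment(flow, dist, steps=20000, t0=1.0, alpha=0.9995, seed=0):
    """Search for a low-cost index assignment (a permutation of the quantizer
    indices) by simulated annealing with a pairwise-swap neighborhood.
    This is a generic sketch, not the paper's tuned annealing schedule."""
    rng = random.Random(seed)
    n = len(flow)
    perm = list(range(n))                      # start from the identity mapping
    cost = qap_cost(perm, flow, dist)
    best_perm, best_cost = perm[:], cost
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)         # propose swapping two indices
        perm[i], perm[j] = perm[j], perm[i]
        new_cost = qap_cost(perm, flow, dist)
        # Metropolis rule: always accept improvements; accept uphill moves
        # with probability exp(-delta / t), which shrinks as t cools.
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
            if cost < best_cost:
                best_perm, best_cost = perm[:], cost
        else:
            perm[i], perm[j] = perm[j], perm[i]  # reject: undo the swap
        t *= alpha                             # geometric cooling schedule
    return best_perm, best_cost
```

For small codebooks the result can be checked against exhaustive search over all permutations; for realistic sizes the annealing search is the only practical option, which is why the paper adopts it for the general case.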