Tracking a Small Set of Experts by Mixing Past Posteriors
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
Tracking a Small Set of Experts by Mixing Past Posteriors
The Journal of Machine Learning Research
An optimal DNA segmentation based on the MDL principle
International Journal of Bioinformatics Research and Applications
Universal source controlled channel decoding with nonsystematic quick-look-in turbo codes
IEEE Transactions on Communications
IEEE Transactions on Signal Processing
Discrete denoising with shifts
IEEE Transactions on Information Theory
Universal randomized switching
IEEE Transactions on Signal Processing
Tracking the best level set in a level-crossing analog-to-digital converter
Digital Signal Processing
A closer look at adaptive regret
ALT '12 Proceedings of the 23rd International Conference on Algorithmic Learning Theory
Three strongly sequential, lossless compression schemes, one with linearly growing per-letter computational complexity and two with fixed per-letter complexity, are presented and analyzed for memoryless sources with abruptly changing statistics. The first method, which improves on Willems' (1994) weighting approach, asymptotically achieves a lower bound on the redundancy, and hence is optimal. The second scheme achieves redundancy of O(log N/N) when the transitions in the statistics are large, and O(log log N/log N) otherwise. The third approach always achieves redundancy of O(√log N/N). The two fixed-complexity approaches can easily be combined to achieve the better redundancy of the two. Simulation results support the analytical bounds derived for all the coding schemes.
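The claim that two sequential coders can be combined to get the better redundancy of the two follows from a standard mixture argument: an equal-weight Bayesian mixture of two probability assignments costs at most 1 bit (the prior) more than the better one. The sketch below illustrates this with a simple Krichevsky-Trofimov (KT) estimator and a restarted variant as stand-ins; these are illustrative assumptions, not the paper's actual schemes.

```python
import math

def kt_prob(counts, symbol):
    """Krichevsky-Trofimov estimate of the next binary symbol's probability."""
    return (counts[symbol] + 0.5) / (counts[0] + counts[1] + 1.0)

def codelength_bits(seq, restart=None):
    """Ideal code length (bits) of a sequential KT coder over a binary sequence.
    Optionally restart the counts every `restart` symbols, a crude stand-in
    for a scheme tuned to abruptly changing statistics."""
    counts = [0, 0]
    bits = 0.0
    for i, s in enumerate(seq):
        if restart and i % restart == 0:
            counts = [0, 0]
        bits += -math.log2(kt_prob(counts, s))
        counts[s] += 1
    return bits

def mixture_bits(l1, l2):
    """Code length of an equal-weight mixture of two coders with code
    lengths l1 and l2 bits: within 1 bit of the better of the two."""
    return -math.log2(0.5 * 2.0 ** (-l1) + 0.5 * 2.0 ** (-l2))
```

On a sequence whose statistics change abruptly at the midpoint (e.g. fifty 0s followed by fifty 1s), the restarted coder wins, and the mixture tracks it to within 1 bit without knowing in advance which coder is better.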