A universal data compression algorithm is described that compresses long strings generated by a "finitely generated" source at a near-optimum per-symbol code length, without prior knowledge of the source. This class of sources can be viewed as a generalization of Markov sources to random fields. Moreover, the algorithm does not require working storage much larger than that needed to describe the source's generating parameters.
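The core idea of such universal schemes is to learn the source's conditional symbol statistics adaptively while coding, so that the per-symbol code length approaches the source entropy. The sketch below is an illustrative stand-in, not the paper's exact algorithm: it measures the ideal code length (in bits) of a byte string under an adaptive fixed-order Markov model with a Laplace (add-one) estimator, which an arithmetic coder could achieve to within rounding.

```python
import math
from collections import defaultdict

def adaptive_code_length(data, order=1):
    """Ideal total code length in bits of `data` under an adaptive
    order-`order` Markov model with a Laplace (add-one) estimator over
    the 256-symbol byte alphabet. Illustrative sketch only: the paper's
    algorithm selects contexts adaptively rather than fixing the order."""
    counts = defaultdict(lambda: defaultdict(int))  # context -> symbol -> count
    totals = defaultdict(int)                       # context -> total count
    alphabet = 256
    bits = 0.0
    for i, sym in enumerate(data):
        ctx = data[max(0, i - order):i]             # preceding `order` bytes
        # Predictive probability of the next symbol given its context,
        # estimated from counts seen so far (sequential / adaptive coding).
        p = (counts[ctx][sym] + 1) / (totals[ctx] + alphabet)
        bits += -math.log2(p)                       # ideal code length for sym
        counts[ctx][sym] += 1                       # update model after coding
        totals[ctx] += 1
    return bits
```

On a highly regular input such as `b"ab" * 500`, the average code length per symbol drops well below the 8 bits of the raw representation as the model's conditional counts sharpen, illustrating the near-optimum per-symbol length the abstract refers to.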