We consider finite-alphabet and real-valued time series and the following four problems: i) estimation of the (limiting) probability P(x$_0$ … x$_s$) for every s and each sequence x$_0$ … x$_s$ of letters from the process alphabet (or, for real-valued time series, estimation of the density p(x$_0$, …, x$_s$)); ii) so-called on-line prediction, where the conditional probability P(x$_{t+1}$ ∣ x$_1$x$_2$ … x$_t$) (or the conditional density p(x$_{t+1}$ ∣ x$_1$x$_2$ … x$_t$)) must be estimated given x$_1$x$_2$ … x$_t$; iii) regression; and iv) classification (so-called problems with side information). We show that Kolmogorov complexity (KC) and universal codes (universal data compressors), whose codeword lengths can be regarded as estimates of KC, provide a basis for constructing asymptotically optimal methods for all four problems. (By definition, a universal code "compresses" any sequence generated by a stationary ergodic source asymptotically down to the Shannon entropy of the source.)
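The on-line prediction idea in ii) can be illustrated with a small sketch: since a universal code's codeword length serves as a computable stand-in for Kolmogorov complexity, the conditional probability of each candidate next letter can be scored by how little extra codeword length that continuation costs, then normalized. The sketch below is only an illustration of this principle, not the paper's construction; it assumes zlib as the compressor and a binary alphabet, both arbitrary choices.

```python
import zlib


def code_length(seq: bytes) -> int:
    # Codeword length of seq in bits under zlib at maximum compression;
    # a crude, hypothetical stand-in for Kolmogorov complexity.
    return 8 * len(zlib.compress(seq, 9))


def predict_next(history: bytes, alphabet: bytes = b"01") -> dict:
    # Score each continuation a by the compressor's codeword length for
    # history + a, and set P(a | history) proportional to 2^(-length).
    # Subtracting the minimum length before exponentiating avoids underflow.
    lengths = {a: code_length(history + bytes([a])) for a in alphabet}
    base = min(lengths.values())
    weights = {a: 2.0 ** (base - l) for a, l in lengths.items()}
    total = sum(weights.values())
    return {chr(a): w / total for a, w in weights.items()}


probs = predict_next(b"0101010101010101010101010101")
```

On short strings a general-purpose compressor's header overhead dominates, so estimates like these only become meaningful asymptotically, which matches the asymptotic character of the optimality results described above.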