We consider finite-alphabet and real-valued time series and the following four problems: i) estimation of the (limiting) probability P(x_0 ... x_s) for every s and each sequence x_0 ... x_s of letters from the process alphabet (or estimation of the density p(x_0, ..., x_s) for real-valued time series); ii) so-called on-line prediction, where the conditional probability P(x_{t+1} | x_1 x_2 ... x_t) (or the conditional density p(x_{t+1} | x_1 x_2 ... x_t)) should be estimated, given x_1 x_2 ... x_t; iii) regression; and iv) classification (so-called problems with side information). We show that Kolmogorov complexity (KC) and universal codes (or universal data compressors), whose codeword length can be considered an estimate of KC, can serve as a basis for constructing asymptotically optimal methods for the above problems. (By definition, a universal code can "compress" any sequence generated by a stationary and ergodic source asymptotically to the Shannon entropy of the source.)
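The on-line prediction idea can be sketched in a few lines of code: if C is a universal code, then |C(x a)| - |C(x)| approximates the conditional code length of symbol a given history x, so 2 raised to minus that length gives an (unnormalized) conditional probability estimate. The sketch below uses zlib as a crude stand-in for an ideal universal compressor; the function names and the choice of compressor are illustrative assumptions, not the paper's actual construction.

```python
import zlib

def code_length(s: bytes) -> int:
    # Length in bits of the zlib-compressed string -- a rough,
    # practical proxy for the codeword length of a universal code
    # (and hence for Kolmogorov complexity, up to a constant).
    return 8 * len(zlib.compress(s, 9))

def predict_next(history: bytes, alphabet: bytes) -> dict:
    # Estimate P(a | history) for each symbol a: append a to the
    # history, compress, and convert code lengths to probabilities
    # via 2^(-length), normalized over the alphabet.
    lengths = {a: code_length(history + bytes([a])) for a in alphabet}
    # Subtract the minimum length before exponentiating to avoid underflow.
    base = min(lengths.values())
    weights = {a: 2.0 ** (base - l) for a, l in lengths.items()}
    total = sum(weights.values())
    return {a: w / total for a, w in weights.items()}

history = b"ab" * 200          # a strongly periodic source realization
probs = predict_next(history, b"ab")
```

A continuation that the compressor finds more regular (here, continuing the period) receives a shorter code and thus a higher estimated probability; with an asymptotically optimal universal code in place of zlib, these estimates converge for stationary ergodic sources.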