COLT '90 Proceedings of the third annual workshop on Computational learning theory
Learning probabilistic prediction functions
COLT '88 Proceedings of the first annual workshop on Computational learning theory
The weighted majority algorithm
Information and Computation
Journal of the ACM (JACM)
COLT' 98 Proceedings of the eleventh annual conference on Computational learning theory
A game of prediction with expert advice
Journal of Computer and System Sciences - Special issue on the eighth annual workshop on computational learning theory, July 5–8, 1995
Linear relations between square-loss and Kolmogorov complexity
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
Probability Theory for the Brier Game
ALT '97 Proceedings of the 8th International Conference on Algorithmic Learning Theory
Tight Worst-Case Loss Bounds for Predicting with Expert Advice
Non-linear Inequalities between Predictive and Kolmogorov Complexities
ALT '01 Proceedings of the 12th International Conference on Algorithmic Learning Theory
Loss Functions, Complexities, and the Legendre Transformation
ALT '01 Proceedings of the 12th International Conference on Algorithmic Learning Theory
Predictive Complexity and Information
COLT '02 Proceedings of the 15th Annual Conference on Computational Learning Theory
In this paper we introduce a general method that allows us to prove tight linear inequalities between different types of predictive complexity, thus generalising our previous results. The method relies upon probabilistic considerations and allows us to describe, in geometrical terms, the sets of coefficients that correspond to true inequalities. We also apply this method to the square-loss and logarithmic complexities and describe the relations between them that were not covered by our previous research.
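For context, the linear inequalities in question typically take the following shape. This is a sketch under the assumption that $K_A$ and $K_B$ denote predictive complexities with respect to two loss functions $A$ and $B$ (e.g. square loss and logarithmic loss), with nonnegative coefficients $a$ and $b$; it illustrates the general form rather than a specific result from the paper:

```latex
% A linear inequality between two predictive complexities K_A and K_B,
% holding for every finite binary string x up to an additive constant:
K_B(x) \le a\,K_A(x) + b\,|x| + O(1)
```

The geometrical description mentioned in the abstract then concerns the set of pairs $(a, b)$ for which such an inequality holds.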