Generalised entropy and asymptotic complexities of languages

  • Authors:
  • Yuri Kalnishkan, Vladimir Vovk, and Michael V. Vyugin

  • Affiliations:
  • Department of Computer Science, Royal Holloway, University of London, Egham, Surrey, UK (all authors)

  • Venue:
  • COLT'07 Proceedings of the 20th annual conference on Learning theory
  • Year:
  • 2007

Abstract

This paper introduces the concept of asymptotic complexity of languages. The concept formalises the notion of learnability in a particular prediction environment and generalises Lutz and Fortnow's concepts of predictability and dimension. Asymptotic complexities in different prediction environments are then compared by describing the set of all pairs of asymptotic complexities with respect to two different environments. A geometric characterisation in terms of generalised entropies is obtained, which generalises the results of Lutz and Fortnow.
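As a concrete illustration (not taken from the paper itself), the generalised entropy of a loss function on binary outcomes can be sketched as the minimum expected loss achievable by a single fixed prediction; the function and loss names below are illustrative, and the grid-search minimisation is a simplification of the exact infimum.

```python
import math

def generalised_entropy(p, loss, grid=1000):
    """Approximate the generalised entropy of Bernoulli(p) w.r.t. a loss:
    the minimum over predictions gamma in [0, 1] of the expected loss."""
    best = float("inf")
    for i in range(grid + 1):
        gamma = i / grid
        expected = p * loss(1, gamma) + (1 - p) * loss(0, gamma)
        best = min(best, expected)
    return best

def square_loss(y, gamma):
    return (y - gamma) ** 2

def log_loss(y, gamma):
    # Clip to avoid log(0) at the grid endpoints.
    eps = 1e-12
    gamma = min(max(gamma, eps), 1 - eps)
    return -math.log(gamma) if y == 1 else -math.log(1 - gamma)

# For square loss the generalised entropy is p*(1-p); for log loss it
# coincides with the Shannon entropy -p*ln(p) - (1-p)*ln(1-p).
print(generalised_entropy(0.3, square_loss))  # ≈ 0.21
print(generalised_entropy(0.3, log_loss))     # ≈ 0.6109
```

Different losses thus induce different entropy curves over p, which is the kind of geometric object the paper's characterisation is phrased in terms of.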