Invariance and universality of complexity

  • Authors: Helmut Jürgensen
  • Affiliation: Department of Computer Science, The University of Western Ontario, London, Ontario, Canada
  • Venue: WTCS'12 Proceedings of the 2012 International Conference on Theoretical Computer Science: Computation, Physics and Beyond
  • Year: 2012

Abstract

The definition of descriptional complexity, or algorithmic information, in the sense of Kolmogorov or Chaitin rests on two important properties of computable functions: the existence of universal machines and the invariance of complexity under the choice of machine. Recently, a notion of descriptional complexity for finite-state computable functions was introduced by Calude et al. The latter theory cannot rely on the existence of universal machines; instead, its conclusions are based on an invariance theorem for finite transducers. This raises the question of which assumptions in algorithmic information theory are actually needed. We answer this question in a general setting, called an encoded function space. Without any assumptions about the encodings of functions and arguments, and without any assumptions about computability or computing models, we introduce the notion of complexity. On this basis alone, a general invariance theorem is proved, and sufficient conditions are stated for complexity to be computable. Next, universal functions are introduced, defined by means of pairing functions. It is shown that properties of the pairing functions, that is, of the joint encodings of functions and their inputs, determine the relation between the complexities measured with respect to different universal functions. In particular, without any further assumptions, for length-bounded or length-preserving pairing functions one can prove that complexity is independent of the choice of the universal function up to an additive constant. Some of the fundamental results of algorithmic information theory are obtained as corollaries.
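
For orientation, here is a minimal sketch of the classical definitions that the abstract refers to and generalizes. The notation (the machines U and V, the complexity C_U, the constant c_{U,V}) is standard in algorithmic information theory and is assumed here for illustration; it is not taken from the paper itself.

% Plain (Kolmogorov) complexity of a string x with respect to a machine U:
% the length of a shortest program p on which U outputs x.
\[
  C_U(x) \;=\; \min \{\, |p| \;:\; U(p) = x \,\}
\]
% Classical invariance theorem: if U is a universal machine, then for every
% machine V there is a constant c_{U,V}, independent of x, such that
\[
  C_U(x) \;\le\; C_V(x) + c_{U,V} \qquad \text{for all } x .
\]
% Consequently, any two universal machines assign complexities that agree up to
% an additive constant. The paper asks which of the underlying assumptions
% (universality, computability, properties of the encodings) are actually
% needed for results of this kind.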