In our former work [K. Tadaki, Local Proceedings of CiE 2008, pp. 425---434, 2008], we developed a statistical mechanical interpretation of algorithmic information theory by introducing the notion of thermodynamic quantities, such as free energy F(T), energy E(T), and statistical mechanical entropy S(T), into the theory. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate given by program-size complexity. Furthermore, we showed that this situation holds for the temperature itself as a thermodynamic quantity. Namely, the computability of the value of the partition function Z(T) gives a sufficient condition for T ∈ (0,1) to be a fixed point on partial randomness. In this paper, we show that the computability of each of the thermodynamic quantities above gives the same sufficient condition. Moreover, we show that the computability of F(T) gives completely different fixed points from the computability of Z(T).
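The compression rate underlying partial randomness is defined via program-size (Kolmogorov) complexity, which is uncomputable. As a rough, hedged illustration only (not the paper's formalism), the sketch below uses `zlib` as a computable stand-in for program-size complexity to compare the compression rates of a highly regular bit string and a pseudo-random one; the function name `compression_rate` is hypothetical and introduced here for illustration.

```python
import random
import zlib

def compression_rate(bits: str) -> float:
    """Estimate the compression rate of a bit string: compressed size
    divided by original size, with zlib standing in for the (uncomputable)
    program-size complexity K."""
    data = bits.encode("ascii")
    compressed = zlib.compress(data, level=9)
    return len(compressed) / len(data)

# A highly regular string compresses well (rate near 0),
# while a pseudo-random string resists compression (higher rate).
regular = "01" * 500
random.seed(0)
noisy = "".join(random.choice("01") for _ in range(1000))

print(compression_rate(regular) < compression_rate(noisy))
```

In the paper's setting, partial randomness plays the role such a rate would in the limit of prefixes of an infinite binary sequence, measured by the true program-size complexity rather than any practical compressor.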