Shannon, Turing and Hats: Information Theory Incompleteness - Inria - Institut national de recherche en sciences et technologies du numérique
Conference Paper, Year: 2017

Shannon, Turing and Hats: Information Theory Incompleteness

Abstract

When Claude Shannon introduced his theory of information in 1948, he was mainly targeting potentially infinite sequences of symbols, i.e. finite sequences whose length n tends to infinity. Parameters like information rate, error correction, compressibility and predictability are defined for finite n, with interesting properties when n tends to infinity. Here we consider sequences of events that are truly infinite (toward the past and the future). Manipulating truly infinite sequences is not exactly the same as manipulating potentially infinite ones. Prediction can be related to infinite hat puzzles: if we introduce the axiom of choice into information theory over infinite sequences, then the number of prediction errors on an infinite sequence is finite, which implies that the prediction error rate is actually zero. This is a rather surprising result, since the infinite sequence could have been generated by a memoryless source. Moreover, if the infinite sequence is computable toward the past, then we obtain the zero-error-rate result without the axiom of choice. This is also surprising, because a sequence that is computable toward the past is not necessarily computable toward the future: a Turing machine is not necessarily reversible.
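The hat-puzzle mechanism the abstract alludes to can be sketched as follows. This is the standard equivalence-class argument from infinite hat puzzles; the paper's exact construction may differ, and the notation here is illustrative, not taken from the paper.

```latex
% Declare two infinite sequences equivalent when they differ in at
% most finitely many positions:
\[
  x \sim y \quad\Longleftrightarrow\quad \{\, n : x_n \ne y_n \,\}
  \text{ is finite.}
\]
% By the axiom of choice, fix a representative $r(C)$ in every
% equivalence class $C$. A predictor that can identify the class
% $[x]$ of the actual sequence $x$ guesses
\[
  \hat{x}_n = r([x])_n \quad \text{for every } n .
\]
% Because $x \sim r([x])$, the guesses are wrong only on the finite
% set $\{\, n : x_n \ne r([x])_n \,\}$, so the error rate vanishes:
\[
  \lim_{N \to \infty} \frac{1}{N}\,
  \#\{\, n \le N : \hat{x}_n \ne x_n \,\} = 0 ,
\]
% even if $x$ was produced by a memoryless source.
```

Note that the choice function $r$ is non-constructive: the argument proves the error rate is zero without exhibiting any computable prediction strategy, which is why the computable-toward-the-past case discussed at the end of the abstract is a separate result.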
turing_vs_shannon.pdf (3.2 MB) Download the file

Dates and versions

hal-01675019, version 1 (04-01-2018)

Identifiers

  • HAL Id: hal-01675019, version 1

Cite

François Durand, Fabien Mathieu, Philippe Jacquet. Shannon, Turing and Hats: Information Theory Incompleteness. WITMSE 2017 - Tenth Workshop on Information Theoretic Methods in Science and Engineering, Sep 2017, Paris, France. ⟨hal-01675019⟩
302 Views
66 Downloads
