Building Sources of Zero Entropy: Rescaling and Inserting Delays
Abstract
Most of the natural sources that arise in Information Theory have positive entropy, and they are well studied. This paper aims to build, in an explicit way, natural instances of sources with zero entropy. Such instances are obtained by slowing down sources of positive entropy, through processes that rescale the source or insert delays. These two processes (rescaling and inserting delays) are essentially the same: they do not change the fundamental intervals of the source, only the "depth" at which these intervals are used, or the "speed" at which they are divided. They do, however, modify the entropy and lead to sources with zero entropy. The paper begins with a "starting" source of positive entropy and applies a natural class of rescalings of sublinear type. In this way, it builds a class of zero-entropy sources that are then further analysed. Since the starting source has well-understood probabilistic properties, and since the rescaling process does not change its fundamental intervals, the new sources keep the memory of important probabilistic features of the initial source. These new sources can thus be thoroughly analysed, and their main probabilistic properties precisely described. We focus in particular on two questions: exhibiting asymptotic normal behaviours à la Shannon-McMillan-Breiman, and analysing the depth of the tries built on these sources. In each case, we obtain a parameterized class of precise behaviours. The paper relies on the analytic combinatorics methodology and makes extensive use of generating series.
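As a rough illustration of why a sublinear rescaling forces the entropy to zero (the rescaling function $\varphi$ and the notation below are assumptions made for this sketch, not the paper's formalism), suppose the slowed-down source emits, among its first $n$ symbols, only the first $\varphi(n)$ symbols of a starting source of entropy $h > 0$, the remaining positions being deterministic delays that carry no information, with $\varphi(n) = o(n)$. Writing $H_n$ for the entropy of the first $n$ emitted symbols, one then expects
\[
  H_n^{\mathrm{new}} \;=\; H_{\varphi(n)}^{\mathrm{start}} \;\sim\; \varphi(n)\, h,
  \qquad\text{hence}\qquad
  \frac{H_n^{\mathrm{new}}}{n} \;\sim\; \frac{\varphi(n)\, h}{n} \;\longrightarrow\; 0,
\]
so the limit defining the entropy of the new source is zero, even though it is built from a source of positive entropy.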
Domains
Mathematics [math]