Universal Densities Exist for Every Finite Reference Measure

24 Sep 2022 · Łukasz Dębowski

As is known, universal codes, which estimate the entropy rate consistently, exist for stationary ergodic sources over finite alphabets but not over countably infinite ones. We generalize universal coding as the problem of universal densities with respect to a fixed reference measure on a countably generated measurable space. We show that universal densities, which estimate the differential entropy rate consistently, exist for finite reference measures. Thus finite alphabets are not necessary in this sense. To exhibit a universal density, we adapt the non-parametric differential (NPD) entropy rate estimator of Feutrill and Roughan. Our modification is analogous to Ryabko's modification of the prediction by partial matching (PPM) code of Cleary and Witten: whereas Ryabko considered a mixture over Markov orders, we consider a mixture over quantization levels. Moreover, we demonstrate that any universal density induces a strongly consistent Cesàro mean estimator of the conditional density given an infinite past. This yields a universal predictor under the $0$-$1$ loss for a countable alphabet. Finally, we specialize universal densities to processes on the natural numbers and on the real line, and derive sufficient conditions for consistent estimation of the entropy rate with respect to infinite reference measures in these domains.
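To make the mixture-over-quantization-levels idea concrete, the following is a minimal, hypothetical Python sketch, not the paper's NPD construction: for illustration it scores each quantization level of a real-valued sample with a memoryless Krichevsky-Trofimov coder in place of PPM, converts code probabilities to densities by dividing by bin volumes, and mixes the levels with assumed weights $2^{-k}$. All function names and the weight choice are illustrative assumptions.

```python
import numpy as np

def krichevsky_trofimov(symbols, alphabet_size):
    """Sequential Krichevsky-Trofimov probability assignment (a simple
    memoryless universal code); returns the log2-probability of the sequence."""
    counts = np.zeros(alphabet_size)
    log_prob = 0.0
    for s in symbols:
        log_prob += np.log2((counts[s] + 0.5) / (counts.sum() + alphabet_size / 2))
        counts[s] += 1
    return log_prob

def mixture_density_log2(x, max_level=8, lo=0.0, hi=1.0):
    """Hypothetical sketch: log2 of a mixture density over quantization
    levels k = 1..max_level for data x in [lo, hi).

    At level k the interval is split into 2**k equal bins; the sequence of
    bin indices is scored with a universal code (KT here), and the code
    probability is converted to a density by dividing by the bin volume.
    Mixture weights 2**-k sum to less than 1, which is harmless for a code
    (it defines a sub-probability)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    width = hi - lo
    level_logs = []
    for k in range(1, max_level + 1):
        m = 2 ** k
        bins = np.minimum((m * (x - lo) / width).astype(int), m - 1)
        # density = code probability / bin volume; each bin has width width/m
        log_density = krichevsky_trofimov(bins, m) + n * np.log2(m / width)
        level_logs.append(-k + log_density)  # log2 of weight 2**-k is -k
    # log-sum-exp in base 2 for numerical stability
    level_logs = np.array(level_logs)
    top = level_logs.max()
    return top + np.log2(np.sum(2.0 ** (level_logs - top)))

# Differential entropy rate estimate: -(1/n) log2 of the mixture density.
rng = np.random.default_rng(0)
x = rng.uniform(0.2, 0.8, size=5000)
print(-mixture_density_log2(x, lo=0.0, hi=1.0) / len(x))
```

Under these assumptions, the printed estimate for the i.i.d. uniform example should approach the true differential entropy rate $\log_2 0.6 \approx -0.737$ bits; the paper's actual estimator additionally captures memory via a PPM-style coder over the quantized symbols.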


Categories


Information Theory · Statistics Theory · 94A29 (Primary) · 62M20 (Secondary)
