Non-Stochastic Information Theory

Anshuka Rangi, Massimo Franceschetti

In an effort to develop the foundations for a non-stochastic theory of information, the notion of $\delta$-mutual information between uncertain variables is introduced as a generalization of Nair's non-stochastic information functional. Several properties of this new quantity are illustrated and used to prove a channel coding theorem in a non-stochastic setting. Namely, it is shown that the largest $\delta$-mutual information between received and transmitted codewords over $\epsilon$-noise channels equals the $(\epsilon, \delta)$-capacity. This notion of capacity generalizes the Kolmogorov $\epsilon$-capacity to packing sets of overlap at most $\delta$, and is a variation of a previous definition proposed by one of the authors. Results are then extended to more general noise models and to non-stochastic, memoryless, stationary channels. Finally, sufficient conditions are established for the factorization of the $\delta$-mutual information and for obtaining a single-letter capacity expression. Compared to previous non-stochastic approaches, the presented theory admits the possibility of decoding errors, as in Shannon's probabilistic setting, while retaining a worst-case, non-stochastic character.
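As a hedged sketch of the coding theorem summarized above, write $C_{\epsilon,\delta}$ for the $(\epsilon, \delta)$-capacity and $I_\delta(X; Y)$ for the $\delta$-mutual information between transmitted and received codewords (both symbols are assumed notation for this sketch, not quoted from the paper). The result then reads

$$C_{\epsilon,\delta} \;=\; \sup_{X} I_\delta(X; Y),$$

where the supremum is over the uncertain variables $X$ describing the admissible transmitted codewords, and $Y$ is the corresponding output of the $\epsilon$-noise channel. Setting $\delta = 0$ requires the packing sets to be disjoint, recovering the Kolmogorov $\epsilon$-capacity of the zero-error setting.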

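Likewise, one hedged reading of the final factorization result: for a non-stochastic, memoryless, stationary channel used over $n$ letters, the paper's sufficient conditions let the $n$-letter quantity split across coordinates, so that, in assumed notation with $X^n = (X_1, \ldots, X_n)$,

$$\frac{1}{n} \sup_{X^n} I_\delta(X^n; Y^n) \;=\; \sup_{X_1} I_\delta(X_1; Y_1),$$

which yields the single-letter capacity expression mentioned above. How the overlap parameter $\delta$ distributes across the individual letters is part of those sufficient conditions and is not specified in this sketch.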