The entropy of a finite probability space $X$ measures the observable cardinality of large independent products $X^{\otimes n}$ of the probability space. If two probability spaces $X$ and $Y$ have the same entropy, there is an almost measure-preserving bijection between large parts of $X^{\otimes n}$ and $Y^{\otimes n}$. In this way, $X$ and $Y$ are asymptotically equivalent. It turns out to be challenging to generalize this notion of asymptotic equivalence to configurations of probability spaces, which are collections of probability spaces with measure-preserving maps between some of them. In this article we introduce the intrinsic Kolmogorov-Sinai distance on the space of configurations of probability spaces. Concentrating on the large-scale geometry, we pass to the asymptotic Kolmogorov-Sinai distance. It induces an asymptotic equivalence relation on sequences of configurations of probability spaces. We will call the equivalence classes \emph{tropical probability spaces}. In this context we prove an Asymptotic Equipartition Property for configurations. It states that tropical configurations can always be approximated by homogeneous configurations. In addition, we show that the solutions to certain Information-Optimization problems are Lipschitz-continuous with respect to the asymptotic Kolmogorov-Sinai distance. It follows from these two statements that in order to solve an Information-Optimization problem, it suffices to consider homogeneous configurations. Finally, we show that spaces of trajectories of length $n$ of certain stochastic processes, in particular stationary Markov chains, have a tropical limit.