In Nonequilibrium Thermodynamics and Information Theory, the relative entropy (or Kullback--Leibler divergence) plays a central role. Consider a H\"older Jacobian $J$ and the Ruelle (transfer) operator $\mathcal{L}_{\log J}$. Two equilibrium probabilities $\mu_1$ and $\mu_2$ can interact via a discrete-time {\it Thermodynamic Operation}, described by the action {\it of the dual of the Ruelle operator} $\mathcal{L}_{\log J}^*$. We argue that the law $\mu \mapsto \mathcal{L}_{\log J}^*(\mu)$, which produces nonequilibrium, can be seen as a Thermodynamic Operation, after showing that it is a manifestation of the Second Law of Thermodynamics. We also show that the change of relative entropy satisfies $$ D_{KL} (\mu_1,\mu_2) - D_{KL} (\mathcal{L}_{\log J}^*(\mu_1),\mathcal{L}_{\log J}^*(\mu_2))= 0.$$ Furthermore, we describe sufficient conditions on $J$ and $\mu_1$ for obtaining $h(\mathcal{L}_{\log J}^*(\mu_1))\geq h(\mu_1)$, where $h$ denotes entropy. Recalling a natural Riemannian metric on the Banach manifold of H\"older equilibrium probabilities, we exhibit the second-order Taylor formula for an infinitesimal tangent change of the KL divergence, a crucial estimate in Information Geometry. We introduce concepts such as heat, work, volume, pressure, and internal energy, which play here the roles of their analogues in the Thermodynamics of gases. We briefly describe the MaxEnt method.
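The behavior of the KL divergence under the dual operator can be illustrated in a finite-state toy model, a minimal sketch in which the dual transfer operator is replaced by a row-stochastic matrix acting on probability vectors (an assumption for illustration only; the text works with H\"older Jacobians on shift spaces, and the matrix $P$ and the vectors below are hypothetical). In this generic Markov-channel setting, the data-processing inequality guarantees only that the divergence does not increase; the equality stated above is specific to the dual Ruelle operator.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D_KL(p, q) for strictly positive vectors."""
    return float(np.sum(p * np.log(p / q)))

# Row-stochastic matrix: a finite-state stand-in for the dual operator
# (hypothetical toy example, not the Holder/shift-space setting).
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

mu1 = np.array([0.5, 0.3, 0.2])
mu2 = np.array([0.2, 0.5, 0.3])

before = kl(mu1, mu2)
after = kl(mu1 @ P, mu2 @ P)  # both measures pushed forward by the channel

# Data-processing inequality: KL divergence cannot increase under a channel.
assert after <= before + 1e-12
```

The point of the sketch is the direction of the inequality: any stochastic kernel contracts (or preserves) the divergence between two measures, which is the finite-dimensional shadow of the Second Law statement in the text.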
