On Relations Between the Relative Entropy and $\chi^2$-Divergence, Generalizations and Applications

Tomohiro Nishiyama, Igal Sason

This paper studies integral relations between the relative entropy and the chi-squared divergence, two fundamental divergence measures in information theory and statistics, together with the implications of these relations, their information-theoretic applications, and some non-trivial generalizations to the rich class of $f$-divergences. The applications studied here concern lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the rate of convergence to stationarity of a class of discrete-time Markov chains.
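To make the kind of relation in question concrete, one representative integral identity of this type expresses the relative entropy as a weighted integral of chi-squared divergences along the mixture path between the two distributions: $D(P\|Q) = \int_0^1 \frac{1}{t}\,\chi^2\big(P \,\big\|\, (1-t)P + tQ\big)\,\mathrm{d}t$. The following is a minimal numerical sketch of this identity; the distributions $P$ and $Q$ below are hypothetical examples chosen only for illustration, and the paper itself should be consulted for the precise statements and their $f$-divergence generalizations.

```python
import numpy as np
from scipy.integrate import quad

# Hypothetical example distributions on a three-letter alphabet.
P = np.array([0.5, 0.3, 0.2])
Q = np.array([0.25, 0.25, 0.5])

def relative_entropy(p, q):
    """Relative entropy (KL divergence) D(p||q), in nats."""
    return float(np.sum(p * np.log(p / q)))

def chi2_divergence(p, q):
    """Chi-squared divergence chi^2(p||q) = sum (p - q)^2 / q."""
    return float(np.sum((p - q) ** 2 / q))

# Numerically verify the integral identity
#   D(P||Q) = int_0^1 (1/t) * chi^2(P || (1-t)*P + t*Q) dt.
# The integrand behaves like O(t) near t = 0, so the integral is proper.
integral, _ = quad(lambda t: chi2_divergence(P, (1 - t) * P + t * Q) / t, 0.0, 1.0)

print(f"integral of chi^2 terms: {integral:.10f}")
print(f"D(P||Q) directly:        {relative_entropy(P, Q):.10f}")
```

The $1/t$ weight is harmless at the lower endpoint: since the mixture $(1-t)P + tQ$ coincides with $P$ at $t = 0$, the chi-squared divergence scales as $t^2$ for small $t$, so the integrand vanishes linearly there.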
