On Relations Between the Relative Entropy and χ^2-Divergence, Generalizations and Applications
This paper studies integral relations between the relative entropy and the chi-squared divergence, two fundamental divergence measures in information theory and statistics, together with the implications of these relations, their information-theoretic applications, and some non-trivial generalizations to the rich class of f-divergences. The applications studied here include lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the rate of convergence to stationarity of a class of discrete-time Markov chains.
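The abstract does not reproduce the relations themselves, but one well-known identity of exactly this type expresses the relative entropy as an integral of chi-squared divergences taken along the line segment between P and Q: D(P||Q) = int_0^1 chi^2(P || (1-t)P + tQ) / t dt. The Python sketch below checks this identity numerically for illustrative distributions p and q (chosen here for the example, not taken from the paper), together with the classical bound D(P||Q) <= log(1 + chi^2(P||Q)); the identity shown is a standard one and not necessarily the paper's most general form.

import numpy as np

def kl_divergence(p, q):
    # Relative entropy D(P||Q) = sum_x p(x) log(p(x)/q(x)), natural log.
    return float(np.sum(p * np.log(p / q)))

def chi2_divergence(p, q):
    # Chi-squared divergence chi^2(P||Q) = sum_x (p(x) - q(x))^2 / q(x).
    return float(np.sum((p - q) ** 2 / q))

# Illustrative strictly positive distributions on a 3-letter alphabet.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

d = kl_divergence(p, q)

# Classical bound: D(P||Q) <= log(1 + chi^2(P||Q)).
assert d <= np.log1p(chi2_divergence(p, q))

# Integral identity: D(P||Q) = int_0^1 chi^2(P || (1-t)P + tQ) / t dt.
# Since chi^2(P || (1-t)P + tQ) = O(t^2) as t -> 0, the integrand
# vanishes at the origin, so the grid can start at a small epsilon.
ts = np.linspace(1e-6, 1.0, 100001)
vals = np.array([chi2_divergence(p, (1 - t) * p + t * q) / t for t in ts])
integral = float(np.sum((vals[:-1] + vals[1:]) / 2) * (ts[1] - ts[0]))

print(f"D(P||Q)           = {d:.6f}")
print(f"integral of chi^2 = {integral:.6f}")  # agrees to ~5 decimal places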