Relation between the Kantorovich-Wasserstein metric and the Kullback-Leibler divergence

08/24/2019
by   Roman V. Belavkin, et al.

We discuss a relation between the Kantorovich-Wasserstein (KW) metric and the Kullback-Leibler (KL) divergence. The former is defined using the optimal transport problem (OTP) in the Kantorovich formulation. The latter is used to define entropy and mutual information, which appear in variational problems for finding an optimal channel (OCP) in rate-distortion and value-of-information theories. We show that the OTP is equivalent to the OCP with one additional constraint fixing the output measure, and therefore the OCP with constraints on the KL-divergence gives a lower bound on the KW-metric. The dual formulation of the OTP allows us to explore the relation between the KL-divergence and the KW-metric using a decomposition of the former based on the law of cosines. In this way we show the link between the two divergences using variational and geometric principles.
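As a concrete illustration (not taken from the paper), the Kantorovich formulation of the OTP on a finite space is a linear program: minimise the expected cost 〈C, π〉 over couplings π whose marginals are the two given distributions. The sketch below, assuming SciPy, computes this discrete OT value alongside the KL-divergence of the same pair of distributions; the function and variable names are illustrative.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.special import rel_entr

def kantorovich_value(p, q, C):
    """Solve the discrete Kantorovich OTP as a linear program:
    minimise sum_ij C[i, j] * pi[i, j] over couplings pi >= 0
    with row sums p and column sums q."""
    n, m = len(p), len(q)
    c = C.reshape(-1)                     # flatten pi into n*m variables
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0  # row i of pi sums to p[i]
    for j in range(m):
        A_eq[n + j, j::m] = 1.0           # column j of pi sums to q[j]
    b_eq = np.concatenate([p, q])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    return res.fun

# Two distributions on the points {0, 1, 2} with squared-distance cost.
x = np.arange(3)
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])
C = (x[:, None] - x[None, :]) ** 2

W = kantorovich_value(p, q, C)   # optimal transport cost between p and q
KL = rel_entr(p, q).sum()        # KL-divergence KL(p || q)
```

Note the structural difference the abstract exploits: the LP constrains both marginals of the coupling, whereas in channel-optimisation problems the output measure is free, which is why fixing it recovers the OTP.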


