On f-divergences between Cauchy distributions

01/29/2021
by Frank Nielsen, et al.

We prove that the f-divergences between univariate Cauchy distributions are always symmetric and can be expressed as strictly increasing functions of the chi-squared divergence. We report the corresponding functions for the total variation distance, the Kullback-Leibler divergence, the LeCam-Vincze divergence, the squared Hellinger divergence, the Taneja divergence, and the Jensen-Shannon divergence. We then show that this symmetry property no longer holds for multivariate Cauchy distributions. Finally, we present several metrizations of f-divergences between univariate Cauchy distributions.
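To illustrate the relationship concretely, the following minimal Python sketch (not part of the paper, and with illustrative function names) uses the closed-form expressions reported in the literature for the chi-squared and Kullback-Leibler divergences between univariate Cauchy densities, namely chi2 = ((l1-l2)^2 + (s1-s2)^2) / (2 s1 s2) and KL = log(1 + chi2/2), and checks them numerically, along with the symmetry KL(p||q) = KL(q||p).

```python
# Minimal numerical sketch: closed forms assumed from the literature, names illustrative.
import numpy as np
from scipy.integrate import quad

def cauchy_pdf(x, loc, scale):
    """Density of the Cauchy distribution with location `loc` and scale `scale` > 0."""
    return scale / (np.pi * (scale**2 + (x - loc)**2))

def chi_squared_cauchy(l1, s1, l2, s2):
    """Closed-form chi-squared divergence between two univariate Cauchy densities (assumed formula)."""
    return ((l1 - l2)**2 + (s1 - s2)**2) / (2.0 * s1 * s2)

def kl_cauchy(l1, s1, l2, s2):
    """KL divergence written as a strictly increasing function of the chi-squared divergence."""
    return np.log1p(0.5 * chi_squared_cauchy(l1, s1, l2, s2))

# Numerical verification on an example pair of Cauchy distributions.
l1, s1, l2, s2 = 0.0, 1.0, 2.0, 3.0
kl_numeric, _ = quad(
    lambda x: cauchy_pdf(x, l1, s1) * np.log(cauchy_pdf(x, l1, s1) / cauchy_pdf(x, l2, s2)),
    -np.inf, np.inf)
print(kl_numeric, kl_cauchy(l1, s1, l2, s2), kl_cauchy(l2, s2, l1, s1))
# All three printed values should agree up to quadrature error,
# illustrating both the closed form and the symmetry of the divergence.
```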


