A family of statistical symmetric divergences based on Jensen's inequality

09/21/2010
by Frank Nielsen, et al.

We introduce a novel parametric family of symmetric information-theoretic distances based on Jensen's inequality for a convex functional generator. In particular, this family unifies the celebrated Jeffreys divergence with the Jensen-Shannon divergence when the Shannon entropy generator is chosen. We then design a generic algorithm to compute the unique centroid defined as the minimizer of the average divergence. This yields a smooth family of centroids linking the Jeffreys centroid to the Jensen-Shannon centroid. Finally, we report on our experimental results.
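The abstract does not spell out the family or the centroid algorithm, so the sketch below is only a numerical illustration under stated assumptions. It assumes the symmetric family takes the skewed-mixture form 0.5 [ KL(p : αp + (1-α)q) + KL(q : αq + (1-α)p) ] on discrete distributions, which reduces to the Jensen-Shannon divergence at α = 1/2 and to half the Jeffreys (symmetrized Kullback-Leibler) divergence at α = 0; the paper's exact parameterization and normalization may differ. The centroid routine is a naive finite-difference gradient scheme on the probability simplex, not the paper's generic algorithm, and the names skew_sym_div and centroid are hypothetical.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback-Leibler divergence KL(p : q) between discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def skew_sym_div(p, q, alpha):
    """Illustrative skewed symmetric divergence (assumed form, not necessarily the paper's):
        0.5 * [ KL(p : alpha*p + (1-alpha)*q) + KL(q : alpha*q + (1-alpha)*p) ]
    alpha = 1/2 recovers the Jensen-Shannon divergence;
    alpha = 0 recovers half the Jeffreys (symmetrized KL) divergence.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return 0.5 * (kl(p, alpha * p + (1 - alpha) * q)
                  + kl(q, alpha * q + (1 - alpha) * p))

def _project(v):
    """Clip to positive values and renormalize onto the probability simplex."""
    v = np.clip(v, 1e-12, None)
    return v / v.sum()

def centroid(points, alpha, iters=1000, lr=0.1, h=1e-6):
    """Naive numerical minimizer of the average divergence
        c* = argmin_c mean_i skew_sym_div(points[i], c, alpha),
    using finite-difference gradients of projected perturbations.
    Illustrative only; this is not the generic algorithm described in the paper.
    """
    points = [np.asarray(p, dtype=float) for p in points]
    c = _project(np.mean(points, axis=0))            # start at the arithmetic mean
    avg = lambda x: np.mean([skew_sym_div(p, x, alpha) for p in points])
    for _ in range(iters):
        grad = np.zeros_like(c)
        for j in range(len(c)):                      # approximate gradient on the simplex
            e = np.zeros_like(c)
            e[j] = h
            grad[j] = (avg(_project(c + e)) - avg(_project(c - e))) / (2 * h)
        c = _project(c - lr * grad)                  # gradient step, then project back
    return c

if __name__ == "__main__":
    p = np.array([0.7, 0.2, 0.1])
    q = np.array([0.1, 0.3, 0.6])
    for a in (0.0, 0.25, 0.5):
        print(f"alpha={a:.2f}  divergence={skew_sym_div(p, q, a):.4f}")
    print("alpha=0.5 centroid of {p, q}:", centroid([p, q], alpha=0.5))
```

Sweeping α between 0 and 1/2 in this sketch traces out intermediate symmetric divergences and, through the minimizer, intermediate centroids, mirroring the smooth family linking the Jeffreys and Jensen-Shannon centroids described in the abstract.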
