Independent Gaussian Distributions Minimize the Kullback-Leibler (KL) Divergence from Independent Gaussian Distributions

11/04/2020 ∙ by Song Fang, et al.
This short note concerns a property of the Kullback-Leibler (KL) divergence: independent Gaussian distributions minimize the KL divergence from given independent Gaussian distributions. The primary purpose of this note is to serve as a reference for papers that need to invoke this property, in whole or in part.
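The abstract states the property without a formula. As an informal illustration (a sketch, not the paper's formal statement or proof), the snippet below checks it numerically using the standard closed-form KL divergence between multivariate Gaussians: against an independent (diagonal-covariance) reference p, replacing a correlated Gaussian q with the product of its own marginals never increases the KL divergence. The function name and all the specific numbers here are illustrative, not taken from the paper.

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence D( N(mu0, cov0) || N(mu1, cov1) ), in nats."""
    k = len(mu0)
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(cov0)
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (np.trace(cov1_inv @ cov0) + diff @ cov1_inv @ diff
                  - k + logdet1 - logdet0)

# Reference p: an independent Gaussian (diagonal covariance).
mu_p = np.zeros(3)
cov_p = np.diag([1.0, 2.0, 0.5])

# A correlated Gaussian q ...
mu_q = np.array([0.3, -0.1, 0.2])
cov_q = np.array([[1.5, 0.4, 0.2],
                  [0.4, 1.0, 0.3],
                  [0.2, 0.3, 2.0]])

# ... and q_ind, the product of q's marginals: same means and variances,
# with the correlations zeroed out.
cov_q_ind = np.diag(np.diag(cov_q))

kl_corr = gaussian_kl(mu_q, cov_q, mu_p, cov_p)
kl_ind = gaussian_kl(mu_q, cov_q_ind, mu_p, cov_p)

print(f"D(q     || p) = {kl_corr:.4f}")
print(f"D(q_ind || p) = {kl_ind:.4f}")
assert kl_ind <= kl_corr + 1e-12  # independence never hurts against an independent p
```

In this Gaussian setting the inequality can be seen directly from the closed form: with a diagonal reference covariance, zeroing the correlations of q leaves the trace and mean terms unchanged, and only the log-determinant term moves, where Hadamard's inequality gives det Σ_q ≤ ∏_i (Σ_q)_{ii} for positive definite Σ_q.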


