Estimating differential entropy using recursive copula splitting

11/14/2019
by Gil Ariel, et al.

A method for estimating the Shannon differential entropy of multidimensional random variables from independent samples is described. The method is based on decomposing the distribution into the product of its marginal distributions and the joint dependency, also known as the copula. The entropy of the marginals is estimated using one-dimensional methods. The entropy of the copula, which always has compact support, is estimated recursively by splitting the data along statistically dependent dimensions. Numerical examples demonstrate that the method is accurate for distributions with both compact and non-compact supports, which is imperative when the support is not known in advance or is of mixed type (compact in some dimensions but not in others). In high dimensions (above 20), our method is not only more accurate but also significantly more efficient than existing approaches.
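The decomposition described in the abstract can be sketched in a few lines of Python. The fragment below is an illustrative reading of the abstract, not the authors' implementation: it writes H(X) as the sum of the one-dimensional marginal entropies (here a generic Vasicek spacing estimator stands in for whichever 1D method is used) plus the entropy of the copula obtained from normalized ranks, and it estimates the copula term by recursively splitting along a coordinate that a Spearman rank-correlation check flags as dependent. The splitting rule, the independence cutoff, and all function names (estimate_entropy, copula_entropy, marginal_entropy_1d, to_copula) are assumptions made for illustration.

```python
import numpy as np
from scipy.stats import spearmanr


def marginal_entropy_1d(x, m=None):
    """Vasicek spacing estimator of 1D differential entropy (a stand-in for
    whichever one-dimensional estimator is preferred)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    m = m or max(1, int(np.sqrt(n)))
    idx = np.arange(n)
    spacings = x[np.minimum(idx + m, n - 1)] - x[np.maximum(idx - m, 0)]
    return float(np.mean(np.log(n / (2.0 * m) * np.maximum(spacings, 1e-12))))


def to_copula(samples):
    """Map every coordinate to (0, 1) through its empirical CDF (normalized ranks)."""
    n = samples.shape[0]
    ranks = np.argsort(np.argsort(samples, axis=0), axis=0) + 1
    return ranks / (n + 1.0)


def copula_entropy(u, min_samples=200):
    """Recursively estimate the (non-positive) entropy of a copula sample on [0,1]^d."""
    n, d = u.shape
    if n < 2 * min_samples or d < 2:
        return 0.0                                    # too few points: treat as uniform
    # Find the coordinate involved in the strongest pairwise Spearman dependence.
    if d == 2:
        rho, _ = spearmanr(u[:, 0], u[:, 1])
        strongest, i = abs(rho), 0
    else:
        rho, _ = spearmanr(u)
        corr = np.abs(rho)
        np.fill_diagonal(corr, 0.0)
        i = int(np.unravel_index(np.argmax(corr), corr.shape)[0])
        strongest = float(corr.max())
    if strongest < 2.0 / np.sqrt(n):                  # rough independence cutoff (~2 sigma)
        return 0.0                                    # copula ~ uniform, entropy ~ 0
    # Split along coordinate i at 1/2 and recurse on the two halves.
    total = 0.0
    for mask in (u[:, i] < 0.5, u[:, i] >= 0.5):
        cell = u[mask].copy()
        w = max(mask.mean(), 1e-12)                   # empirical weight of the cell (~1/2)
        cell[:, i] = (cell[:, i] % 0.5) * 2.0         # rescale the split coordinate to (0, 1)
        # Entropy of the rescaled cell = its marginal entropies + its own copula entropy;
        # log(1/2) - log(w) corrects for the cell's volume and probability.
        h_marg = sum(marginal_entropy_1d(cell[:, k]) for k in range(d))
        total += w * (h_marg + copula_entropy(to_copula(cell), min_samples)
                      + np.log(0.5) - np.log(w))
    return total


def estimate_entropy(samples):
    """H(X) = sum of the marginal entropies + entropy of the copula."""
    samples = np.asarray(samples, dtype=float)
    h_marginals = sum(marginal_entropy_1d(samples[:, k]) for k in range(samples.shape[1]))
    return h_marginals + copula_entropy(to_copula(samples))


# Sanity check: a correlated 3D Gaussian, compared with its closed-form entropy.
rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.6, 0.0], [0.6, 1.0, 0.3], [0.0, 0.3, 1.0]])
x = rng.multivariate_normal(np.zeros(3), cov, size=5000)
h_exact = 0.5 * np.log(np.linalg.det(2.0 * np.pi * np.e * cov))
print(f"estimate: {estimate_entropy(x):.3f}   exact: {h_exact:.3f}")
```

Splitting at 1/2 keeps the empirical cell weights close to one half, so the volume/probability correction term log(1/2) - log(w) stays small; the paper's actual splitting criterion, dependence test, and one-dimensional estimators may differ from these illustrative choices.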
