Estimators for Multivariate Information Measures in General Probability Spaces

10/26/2018
by Arman Rahimzamani et al.

Information-theoretic quantities play an important role in various settings in machine learning, including causality testing, structure inference in graphical models, time-series problems, feature selection, and providing privacy guarantees. A key quantity of interest is the mutual information and its generalizations, including conditional mutual information, multivariate mutual information, total correlation, and directed information. While these information quantities are well defined in arbitrary probability spaces, existing estimators add or subtract entropies (we term them ΣH methods). Such methods work only in purely discrete or purely continuous spaces, since entropy (or differential entropy) is well defined only in those regimes. In this paper, we define a general graph divergence measure (GDM) as a measure of incompatibility between the observed distribution and a given graphical-model structure. This generalizes the aforementioned information measures, and we construct a novel estimator via a coupling trick that estimates these multivariate information measures directly through the Radon-Nikodym derivative. These estimators are proven to be consistent in a general setting that includes several cases where existing estimators fail, thus providing the only known estimators for the following settings: (1) the data has some discrete and some continuous-valued components; (2) some (or all) of the components are themselves discrete-continuous mixtures; (3) the data is real-valued but has no joint density on the entire space, being supported instead on a low-dimensional manifold. We show that our proposed estimators significantly outperform known estimators on synthetic and real datasets.
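The abstract only summarizes the construction, but the flavor of a coupled k-NN / Radon-Nikodym estimator that remains valid on discrete-continuous mixtures can be sketched concretely. The Python sketch below follows the style of mixed-space k-nearest-neighbor mutual information estimation that this line of work builds on: measure the distance to the k-th nearest neighbor in the joint space, and fall back to a plug-in neighbor count whenever that distance collapses to zero at a discrete atom. The function name `mixed_mi`, the default k = 5, and the max-norm metric are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np
from scipy.special import digamma
from scipy.spatial import cKDTree

def mixed_mi(x, y, k=5):
    """Sketch of a mixed-space estimator of I(X;Y) for samples whose
    components may be discrete, continuous, or discrete-continuous
    mixtures (illustrative, not the paper's reference code)."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    xy = np.hstack((x, y))
    tree_xy, tree_x, tree_y = cKDTree(xy), cKDTree(x), cKDTree(y)

    # Distance to the k-th nearest neighbor in the joint space
    # (max-norm; k+1 because the query point is its own neighbor).
    rho = tree_xy.query(xy, k=k + 1, p=np.inf)[0][:, -1]

    mi = 0.0
    for i in range(n):
        if rho[i] == 0.0:
            # Discrete atom: several samples coincide exactly, so
            # count the coincident points instead of using a ball.
            k_i = len(tree_xy.query_ball_point(xy[i], 0.0, p=np.inf)) - 1
            r = 0.0
        else:
            k_i = k
            r = rho[i]
        # Marginal neighbor counts within the same radius (self excluded).
        n_x = len(tree_x.query_ball_point(x[i], r, p=np.inf)) - 1
        n_y = len(tree_y.query_ball_point(y[i], r, p=np.inf)) - 1
        mi += digamma(k_i) + np.log(n) - np.log(n_x + 1) - np.log(n_y + 1)

    # Mutual information is nonnegative; clip small negative estimates.
    return max(mi / n, 0.0)
```

The point of the zero-radius branch is exactly the failure mode described above: on data with atoms, neither entropy nor differential entropy exists for a ΣH method to combine, whereas the neighbor-count form above stays well defined on such mixtures.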
