Diffeomorphic Information Neural Estimation

11/20/2022
by Bao Duong, et al.

Mutual Information (MI) and Conditional Mutual Information (CMI) are multi-purpose tools from information theory that naturally measure the statistical dependencies between random variables, so they are of central interest in several statistical and machine learning tasks, such as conditional independence testing and representation learning. However, estimating CMI, or even MI, is infamously challenging due to their intractable formulations. In this study, we introduce DINE (Diffeomorphic Information Neural Estimator), a novel approach for estimating the CMI of continuous random variables, inspired by the invariance of CMI under diffeomorphic maps. We show that the variables of interest can be replaced with appropriate surrogates that follow simpler distributions, allowing the CMI to be evaluated efficiently via analytical solutions. Additionally, we demonstrate the quality of the proposed estimator against state-of-the-art methods in three important tasks: estimating MI, estimating CMI, and applying CMI estimation to conditional independence testing. The empirical evaluations show that DINE consistently outperforms competitors in all tasks and adapts very well to complex and high-dimensional relationships.
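The invariance the abstract builds on can be checked numerically: applying a diffeomorphism (here `tanh`, a smooth, strictly increasing map) to each variable leaves their mutual information unchanged. The sketch below is not the DINE estimator itself, just an illustrative histogram plug-in estimate compared against the analytic MI of a bivariate Gaussian, -½ log(1 - ρ²); the sample size, bin count, and choice of map are arbitrary assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho = 200_000, 0.8
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# analytic MI of a bivariate Gaussian with correlation rho
mi_true = -0.5 * np.log(1.0 - rho**2)

def hist_mi(a, b, bins=40):
    """Plug-in MI estimate from a 2D histogram (consistent, mildly biased)."""
    p_ab, _, _ = np.histogram2d(a, b, bins=bins)
    p_ab = p_ab / p_ab.sum()                      # joint probabilities
    p_a = p_ab.sum(axis=1, keepdims=True)         # marginal of a
    p_b = p_ab.sum(axis=0, keepdims=True)         # marginal of b
    nz = p_ab > 0                                 # avoid log(0)
    return float((p_ab[nz] * np.log(p_ab[nz] / (p_a @ p_b)[nz])).sum())

mi_xy = hist_mi(x, y)                             # MI of the raw variables
mi_uv = hist_mi(np.tanh(x), np.tanh(y))           # MI after a diffeomorphism

print(f"analytic: {mi_true:.3f}, raw: {mi_xy:.3f}, transformed: {mi_uv:.3f}")
```

Both estimates should land close to the analytic value, illustrating why replacing variables with diffeomorphic surrogates that have simpler distributions does not change the quantity being estimated.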


