Information-Theoretic Approximation to Causal Models

07/29/2020
by Peter Gmeiner, et al.

Inferring the causal direction and causal effect between two discrete random variables X and Y from a finite sample is a crucial and challenging task. However, if we have access to both observational and interventional data, the task becomes tractable. If X causes Y, then it does not matter whether we observe an effect in Y after observing changes in X or after actively intervening on X. This invariance principle creates a link between observational and interventional distributions in a higher-dimensional probability space. We embed distributions that originate from samples of X and Y into that higher-dimensional space such that the embedded distribution is closest, with respect to relative entropy, to the distributions that satisfy the invariance principle. This allows us to calculate, for a given empirical distribution, the best information-theoretic approximation that follows an assumed underlying causal model. We show that this information-theoretic approximation to causal models (IACM) can be computed by solving a linear optimization problem. In particular, by approximating the empirical distribution with a monotonic causal model, we can calculate probabilities of causation. This approximation approach can also be used to solve causal discovery problems in the bivariate, discrete case. Experimental results on labeled synthetic and real-world data demonstrate that our approach outperforms other state-of-the-art approaches in the discrete case with low cardinality.
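The core step described in the abstract, finding the distribution closest to an empirical one with respect to relative entropy subject to constraints, can be illustrated with a generic information projection. The sketch below is a toy example using numerical optimization, not the paper's actual IACM linear program; the empirical distribution, the marginal constraint, and the helper names are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def kl(q, p):
    """Relative entropy D(q || p) for distributions on the same finite support."""
    q = np.clip(q, 1e-12, None)
    return float(np.sum(q * np.log(q / p)))

# Toy empirical joint distribution over the four states of binary (X, Y),
# ordered as (0,0), (0,1), (1,0), (1,1).
p = np.array([0.4, 0.1, 0.1, 0.4])

# Hypothetical linear constraint A q = b on the approximating distribution:
# here we require the marginal P(X = 0) to equal 0.3.
A = np.array([[1.0, 1.0, 0.0, 0.0]])
b = np.array([0.3])

constraints = [
    {"type": "eq", "fun": lambda q: q.sum() - 1.0},  # normalization
    {"type": "eq", "fun": lambda q: A @ q - b},      # linear constraint
]

# Minimize D(q || p) over the probability simplex subject to the constraints.
result = minimize(kl, x0=np.full(4, 0.25), args=(p,),
                  bounds=[(0.0, 1.0)] * 4, constraints=constraints)
q_star = result.x
```

For a single marginal constraint the projection has a closed form (rescale each half of the support to match the required marginal), which makes this sketch easy to sanity-check; the paper's contribution is that the analogous approximation to a full causal model reduces to a linear optimization problem.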
