Asymptotic Causal Inference

09/20/2021
by Sridhar Mahadevan, et al.

We investigate causal inference in the asymptotic regime, as the number of variables approaches infinity, using an information-theoretic framework. We define the structural entropy of a causal model in terms of its description complexity: the logarithmic growth rate, in bits, of the number of directed acyclic graphs (DAGs), parameterized by the edge density d. Structural entropy yields non-intuitive predictions. If we randomly sample a DAG from the space of all models in the range d ∈ (0, 1/8), almost surely the model is a two-layer DAG! Semantic entropy quantifies the reduction in entropy when edges are removed by causal intervention. Semantic causal entropy is defined as the f-divergence between the observational distribution P and the interventional distribution P', where a subset S of edges is intervened on to determine its causal influence. We compare the decomposability properties of semantic entropy for different choices of f-divergence, including the KL-divergence, the squared Hellinger distance, and the total variation distance. We apply our framework to generalize a recently popular bipartite experimental design for studying causal inference on large datasets, in which interventions are carried out on one set of variables (e.g., power plants, or items in an online store) but outcomes are measured on a disjoint set of variables (residents near the power plants, or shoppers). We generalize bipartite designs to k-partite designs and describe an optimization framework for finding the optimal k-level DAG architecture for any value of d ∈ (0, 1/2). As edge density increases, a sequence of phase transitions occurs over disjoint intervals of d, with deeper DAG architectures emerging for larger values of d. We also give a quantitative bound on the number of samples needed to reliably test for average causal influence in a k-partite design.
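To make the notion of description complexity concrete, here is a minimal sketch of the combinatorial quantity underlying structural entropy: the number of labeled DAGs on n nodes, computed via Robinson's classical recurrence, with its base-2 logarithm giving the description length in bits. This illustrates only the unconstrained count; the abstract's parameterization by edge density d is not reproduced here.

```python
from math import comb, log2

def count_dags(n, _memo={0: 1}):
    """Number of labeled DAGs on n nodes, via Robinson's recurrence:
    a(n) = sum_{k=1}^{n} (-1)^(k+1) * C(n, k) * 2^(k(n-k)) * a(n-k)."""
    if n not in _memo:
        _memo[n] = sum((-1) ** (k + 1) * comb(n, k)
                       * 2 ** (k * (n - k)) * count_dags(n - k)
                       for k in range(1, n + 1))
    return _memo[n]

# Description complexity in bits, log2(a(n)), grows rapidly with n:
for n in range(1, 6):
    print(n, count_dags(n), round(log2(count_dags(n)), 2))
```

For example, a(3) = 25 and a(4) = 543, so a four-node causal model already requires about 9.1 bits to index among all DAGs.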
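The three f-divergences named above have standard closed forms for discrete distributions. The following sketch computes each one for a hypothetical observational distribution P and post-intervention distribution P' (the specific numbers are illustrative, not taken from the paper):

```python
from math import log2, sqrt

def kl(p, q):
    """KL-divergence D(p || q) in bits; assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def sq_hellinger(p, q):
    """Squared Hellinger distance: 0.5 * sum_i (sqrt(p_i) - sqrt(q_i))^2."""
    return 0.5 * sum((sqrt(pi) - sqrt(qi)) ** 2 for pi, qi in zip(p, q))

def tv(p, q):
    """Total variation distance: 0.5 * sum_i |p_i - q_i|."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Hypothetical observational P vs. interventional P' over a binary outcome:
P       = [0.5, 0.5]
P_prime = [0.9, 0.1]
print(kl(P, P_prime), sq_hellinger(P, P_prime), tv(P, P_prime))
```

Note that KL-divergence is unbounded and asymmetric, while squared Hellinger and total variation are bounded in [0, 1], which affects how each decomposes over interventions.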


Related research

- Entropic Causal Inference: Identifiability and Finite Sample Results (01/10/2021)
- Structural Intervention Distance (SID) for Evaluating Causal Graphs (06/05/2013)
- Bayesian Nonparametric Causal Inference: Information Rates and Learning Algorithms (12/24/2017)
- Causal Inference with Bipartite Designs (10/05/2020)
- Quantifying causal contribution via structure preserving interventions (07/01/2020)
- Bayesian sample size determination for causal discovery (06/01/2022)
