Differentiable DAG Sampling

03/16/2022
by Bertrand Charpentier, et al.

We propose a new differentiable probabilistic model over DAGs (DP-DAG). DP-DAG allows fast and differentiable DAG sampling suited to continuous optimization. To this end, DP-DAG samples a DAG by successively (1) sampling a linear ordering of the nodes and (2) sampling edges consistent with the sampled linear ordering. We further propose VI-DP-DAG, a new method for DAG learning from observational data which combines DP-DAG with variational inference. Hence, VI-DP-DAG approximates the posterior probability over DAG edges given the observed data. VI-DP-DAG is guaranteed to output a valid DAG at any time during training and does not require any complex augmented Lagrangian optimization scheme, in contrast to existing differentiable DAG learning approaches. In our extensive experiments, we compare VI-DP-DAG to other differentiable DAG learning baselines on synthetic and real datasets. VI-DP-DAG significantly improves DAG structure and causal mechanism learning while training faster than competitors.
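The two-step sampling scheme described above can be sketched in a few lines. This is a minimal non-differentiable illustration, not the authors' implementation: it uses a uniform random permutation for step (1), whereas DP-DAG uses differentiable relaxations for both the ordering and the edges. The function name `sample_dag` and the Bernoulli edge-probability matrix `edge_probs` are assumptions for illustration.

```python
import numpy as np

def sample_dag(edge_probs, rng):
    """Sample a DAG by (1) drawing a linear ordering of the nodes and
    (2) drawing edges, keeping only those consistent with the ordering."""
    d = edge_probs.shape[0]
    # (1) Sample a linear ordering (here: a uniform random permutation).
    order = rng.permutation(d)
    rank = np.empty(d, dtype=int)
    rank[order] = np.arange(d)
    # (2) Sample each candidate edge i -> j with its Bernoulli probability,
    # then keep only edges that point "forward" in the ordering
    # (rank[i] < rank[j]), which guarantees acyclicity by construction.
    edges = rng.random((d, d)) < edge_probs
    mask = rank[:, None] < rank[None, :]
    return edges & mask

rng = np.random.default_rng(0)
probs = np.full((4, 4), 0.5)   # toy edge-probability matrix
adj = sample_dag(probs, rng)   # boolean adjacency matrix of a valid DAG
```

Because every sampled edge goes from an earlier node to a later node in the ordering, the adjacency matrix is strictly upper-triangular after permuting rows and columns by the ordering, so no cycle can occur at any point during training.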


