On Discrete-Continuous Mixtures in Probabilistic Programming: Generalized Semantics and Inference Algorithms

06/06/2018
by Yi Wu, et al.

Despite the recent successes of probabilistic programming languages (PPLs) in AI applications, PPLs offer only limited support for random variables whose distributions are discrete-continuous mixtures. We develop the notion of measure-theoretic Bayesian networks (MTBNs) and use it to provide a more general semantics for PPLs with arbitrarily many random variables defined over arbitrary measure spaces. We introduce two new general sampling algorithms that are provably correct under the MTBN framework: lexicographic likelihood weighting (LLW) for general MTBNs, and lexicographic particle filter (LPF), a specialized algorithm for state-space models. We further integrate MTBNs into a widely used PPL system, BLOG, and verify the effectiveness of the new inference algorithms on representative examples.
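To make the core difficulty concrete, below is a minimal Python sketch of the lexicographic-weighting idea on a toy model of my own construction (not from the paper): a discrete switch decides whether an observed variable is a point mass or a Gaussian density. Naive likelihood weighting would add a probability mass and a density value, which are incommensurable; the sketch instead carries a tuple weight (k, w), where k counts density factors, and normalizes only within the lexicographically dominant bucket. This is one illustrative reading of LLW, not necessarily the paper's exact formulation.

```python
import math
import random
from collections import defaultdict

# Toy model (hypothetical, for illustration only):
#   Z ~ Bernoulli(0.5)
#   if Z == 1: Y = 0.0 exactly      (point mass at 0)
#   else:      Y ~ Normal(0, 1)     (continuous density)
# Evidence: Y == 0.0. A point mass contributes a probability,
# a density contributes an infinitesimal f(y)*dy, so samples whose
# evidence is explained by atoms should dominate those explained by
# densities. The weight (k, w) encodes this: smaller k wins outright.

def normal_pdf(y, mu=0.0, sigma=1.0):
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def llw_posterior_z(num_samples=10_000, y_obs=0.0):
    buckets = defaultdict(lambda: defaultdict(float))  # k -> {z: summed w}
    for _ in range(num_samples):
        z = 1 if random.random() < 0.5 else 0
        if z == 1:
            # Evidence hits an atom: mass factor, no density factor (k = 0).
            k, w = 0, 1.0 if y_obs == 0.0 else 0.0
        else:
            # Evidence explained by a density: one density factor (k = 1).
            k, w = 1, normal_pdf(y_obs)
        if w > 0.0:
            buckets[k][z] += w
    # Only the lexicographically dominant bucket (smallest k) survives.
    k_min = min(buckets)
    total = sum(buckets[k_min].values())
    return {z: w / total for z, w in buckets[k_min].items()}

print(llw_posterior_z())  # expected: {1: 1.0} -- the atom dominates the density
```

In this toy posterior, P(Z=1 | Y=0) = 1, since the point mass at 0 dominates any finite density value; naive likelihood weighting would instead report a spurious mixture of the two hypotheses.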
