Exploiting Qualitative Knowledge in the Learning of Conditional Probabilities of Bayesian Networks

01/16/2013
by Frank Wittig, et al.

Algorithms for learning the conditional probabilities of Bayesian networks with hidden variables typically operate within a high-dimensional search space and yield only locally optimal solutions. One way of limiting the search space and avoiding local optima is to impose qualitative constraints that are based on background knowledge concerning the domain. We present a method for integrating formal statements of qualitative constraints into two learning algorithms, APN and EM. In our experiments with synthetic data, this method yielded networks that satisfied the constraints almost perfectly. The accuracy of the learned networks was consistently superior to that of corresponding networks learned without constraints. The exploitation of qualitative constraints therefore appears to be a promising way to increase both the interpretability and the accuracy of learned Bayesian networks with known structure.
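The abstract does not specify how the constraints are encoded into APN and EM, but a common form of qualitative background knowledge is a positive influence: the probability of a child state should be non-decreasing in an ordered parent's states. As a minimal illustrative sketch (not the paper's actual mechanism), one way to enforce such a constraint after each M-step is to project the offending column of a conditional probability table onto the monotone region, e.g. with a pool-adjacent-violators repair. The function name and the equal-weight pooling are assumptions for illustration.

```python
def isotonic_increasing(values):
    """Hypothetical post-M-step repair: project P(child=1 | parent=i),
    i ordered, onto the non-decreasing region using equal-weight
    pool-adjacent-violators. Not the paper's method; an illustration
    of enforcing a positive qualitative influence."""
    # Start with one block per parent state.
    blocks = [[v] for v in values]
    i = 0
    while i < len(blocks) - 1:
        a, b = blocks[i], blocks[i + 1]
        # A violation: the earlier block's mean exceeds the later one's.
        if sum(a) / len(a) > sum(b) / len(b):
            blocks[i:i + 2] = [a + b]   # pool the two blocks
            i = max(i - 1, 0)           # re-check against the previous block
        else:
            i += 1
    # Expand each block back to per-state probabilities (block means).
    out = []
    for blk in blocks:
        m = sum(blk) / len(blk)
        out.extend([m] * len(blk))
    return out

# Example: the third entry violates monotonicity and is pooled
# with its neighbor.
print(isotonic_increasing([0.2, 0.5, 0.4, 0.9]))
```

After such a projection the learned network satisfies the stated qualitative influence by construction, which matches the abstract's observation that the learned networks satisfied the constraints almost perfectly.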


