Integrating Probabilistic Rules into Neural Networks: A Stochastic EM Learning Algorithm

03/20/2013
by   Gerhard Paaß, et al.

The EM algorithm is a general procedure for obtaining maximum likelihood estimates when some of the observations on the variables of a network are missing. In this paper a stochastic version of the algorithm is adapted to probabilistic neural networks describing the associative dependency of variables. These networks have a probability distribution that is a special case of the distribution generated by probabilistic inference networks. Hence both types of networks can be combined, making it possible to integrate probabilistic rules as well as unspecified associations in a sound way. The resulting network may have a number of interesting features, including cycles of probabilistic rules, hidden 'unobservable' variables, and uncertain and contradictory evidence.
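To illustrate the general idea behind stochastic EM (not the paper's probabilistic network model), the sketch below applies it to an assumed toy problem: a two-component Gaussian mixture in which the component labels play the role of the missing observations. In the E-step the missing values are sampled from their posterior rather than replaced by expectations, and in the M-step the parameters are refit by maximum likelihood on the completed data. All model and parameter choices here are illustrative assumptions, not taken from the paper.

```python
# Minimal stochastic EM sketch (assumed toy model: two-component Gaussian mixture).
# E-step: sample the missing component labels from their posterior.
# M-step: maximum-likelihood update of the parameters on the completed data.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from two Gaussians; the labels are the "missing" observations.
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 700)])

# Initial parameter guesses: mixing weight, means, standard deviations.
pi, mu, sigma = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])

def normal_pdf(v, m, s):
    return np.exp(-0.5 * ((v - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

for _ in range(200):
    # Stochastic E-step: draw z ~ p(z | x, theta) instead of using expectations.
    p1 = pi * normal_pdf(x, mu[0], sigma[0])
    p2 = (1 - pi) * normal_pdf(x, mu[1], sigma[1])
    z = rng.random(x.size) < p1 / (p1 + p2)   # True -> component 0

    # M-step: maximum-likelihood estimates on the completed (x, z) data.
    pi = z.mean()
    for k, mask in enumerate([z, ~z]):
        mu[k] = x[mask].mean()
        sigma[k] = x[mask].std() + 1e-6       # guard against degenerate clusters

print(f"pi={pi:.2f}, mu={mu.round(2)}, sigma={sigma.round(2)}")
```

Because the E-step is a random draw, the parameter sequence fluctuates around the maximum likelihood solution instead of converging deterministically; in practice the iterates are averaged or the run is stopped once the fluctuations stabilize.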
