Designing neural networks that process mean values of random variables

04/29/2010
by Michael J. Barber, et al.
AIT Austrian Institute of Technology GmbH

We introduce a class of neural networks derived from probabilistic models in the form of Bayesian networks. By imposing additional assumptions about the nature of the probabilistic models represented in the networks, we derive neural networks with standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the random variables, that can pool multiple sources of evidence, and that deal cleanly and consistently with inconsistent or contradictory evidence. The presented neural networks capture many properties of Bayesian networks, providing distributed versions of probabilistic models.
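To illustrate the core idea in miniature: for a chain of binary random variables X → Y, the mean of Y is an exact affine function of the mean of X, with the weight and bias read directly off the conditional probability table. A "neuron" with these weights therefore computes E[Y] with no training at all, which is the flavor of the construction the abstract describes. This is a hedged sketch with made-up example probabilities, not the paper's actual network dynamics:

```python
# Minimal sketch: mean-value propagation through a two-node Bayesian
# network X -> Y (both binary), viewed as a single linear "neuron".
# All probability values below are illustrative assumptions.

p_x = 0.3                        # P(X = 1), the mean of X
p_y_given_x = {0: 0.2, 1: 0.9}   # CPT: P(Y = 1 | X = x)

# Neural view: synaptic weight and bias come straight from the CPT,
# so no training is needed to compute the mean of Y.
w = p_y_given_x[1] - p_y_given_x[0]
b = p_y_given_x[0]
mean_y_neural = w * p_x + b

# Probabilistic view: marginalize X explicitly.
mean_y_exact = (1 - p_x) * p_y_given_x[0] + p_x * p_y_given_x[1]

assert abs(mean_y_neural - mean_y_exact) < 1e-12
print(mean_y_neural)  # -> 0.41
```

The equality holds identically for any CPT, since expectation is linear in the marginal of the parent; the paper's contribution lies in extending this kind of exact mean-value computation to multiply connected networks with evidence pooling.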


