Natural-Parameter Networks: A Class of Probabilistic Neural Networks

by Hao Wang, et al.
The Hong Kong University of Science and Technology

Neural networks (NN) have achieved state-of-the-art performance in various applications. Unfortunately, in applications where training data is insufficient, they are often prone to overfitting. One effective way to alleviate this problem is to adopt a Bayesian approach by using Bayesian neural networks (BNN). Another shortcoming of NN is the lack of flexibility to customize different distributions for the weights and neurons according to the data, as is often done in probabilistic graphical models. To address these problems, we propose a class of probabilistic neural networks, dubbed natural-parameter networks (NPN), as a novel and lightweight Bayesian treatment of NN. NPN allows the use of arbitrary exponential-family distributions to model the weights and neurons. Unlike traditional NN and BNN, NPN takes distributions as input and passes them through layers of transformation before producing output distributions to match the target distributions. As a Bayesian treatment, efficient backpropagation (BP) is performed to learn the natural parameters of the distributions over both the weights and neurons. The output distributions of each layer, as byproducts, may be used as second-order representations for associated tasks such as link prediction. Experiments on real-world datasets show that NPN can achieve state-of-the-art performance.
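As a rough illustration of the idea (a sketch, not the authors' implementation), consider the Gaussian special case: a linear layer that takes a Gaussian distribution over its inputs and, with Gaussian-distributed weights, outputs the exact mean and variance of the pre-activation. All function and variable names below are hypothetical, and the formulas assume the inputs and weights are mutually independent Gaussians.

```python
import numpy as np

def gaussian_npn_linear(a_m, a_s, W_m, W_s, b_m, b_s):
    """One linear layer of a Gaussian NPN (illustrative sketch).

    Inputs a, weights W, and biases b are independent Gaussians given by
    their means (a_m, W_m, b_m) and variances (a_s, W_s, b_s).  The layer
    returns the exact mean and variance of o = a @ W + b.
    """
    o_m = a_m @ W_m + b_m
    # For independent factors: Var(a_j W_jk) = (a_m^2 + a_s) W_s + a_s W_m^2
    o_s = (a_m ** 2 + a_s) @ W_s + a_s @ (W_m ** 2) + b_s
    return o_m, o_s

# Example: a batch of 2 input distributions through a 3 -> 4 layer.
rng = np.random.default_rng(0)
a_m, a_s = rng.normal(size=(2, 3)), rng.uniform(0.1, 1.0, size=(2, 3))
W_m, W_s = rng.normal(size=(3, 4)), rng.uniform(0.1, 1.0, size=(3, 4))
b_m, b_s = rng.normal(size=4), rng.uniform(0.1, 1.0, size=4)
o_m, o_s = gaussian_npn_linear(a_m, a_s, W_m, W_s, b_m, b_s)
assert o_s.min() > 0  # output variances remain positive
```

Because both the mean and the variance are differentiable functions of the natural parameters, ordinary backpropagation can train such a layer end to end; nonlinear activations would additionally require moment matching, which is omitted here.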


Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks

Large multilayer neural networks trained with backpropagation have recen...

Streaming Probabilistic Deep Tensor Factorization

Despite the success of existing tensor factorization methods, most of th...

Limitations of Lazy Training of Two-layers Neural Networks

We study the supervised learning problem under either of the following t...

General Backpropagation Algorithm for Training Second-order Neural Networks

The artificial neural network is a popular framework in machine learning...

Bidirectional Inference Networks: A Class of Deep Bayesian Networks for Health Profiling

We consider the problem of inferring the values of an arbitrary set of v...

Deep learning with t-exponential Bayesian kitchen sinks

Bayesian learning has been recently considered as an effective means of ...

Hardness of Learning Neural Networks with Natural Weights

Neural networks are nowadays highly successful despite strong hardness r...
