Classified as unknown: A novel Bayesian neural network

01/31/2023
by Tianbo Yang et al.

We establish estimates of the parameters of the output distribution for the softmax activation function using the probit function. As an application, we develop a new, efficient Bayesian learning algorithm for fully connected neural networks, in which training and prediction are performed in closed form within the Bayesian inference framework. This approach supports sequential learning and requires neither computationally expensive gradient calculations nor Monte Carlo sampling. Our work generalizes the Bayesian algorithm for a single perceptron for binary classification in <cit.> to multi-layer perceptrons for multi-class classification.
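The paper's closed-form estimates are not reproduced in this abstract, but the standard probit approximation it builds on can be sketched. The idea: the logistic sigmoid is well approximated by the Gaussian CDF, sigmoid(x) ≈ Φ(λx) with λ² = π/8, which makes the expected sigmoid of a Gaussian pre-activation available in closed form, with no sampling. The function name below is illustrative, not from the paper:

```python
import math

def expected_sigmoid(mu, var):
    """Approximate E[sigmoid(a)] for a ~ N(mu, var).

    Using the probit approximation sigmoid(x) ~ Phi(sqrt(pi/8) * x),
    the Gaussian integral collapses to a single moderated sigmoid:
        E[sigmoid(a)] ~ sigmoid(mu / sqrt(1 + pi * var / 8)).
    Larger input variance pulls the output toward 0.5, reflecting
    increased predictive uncertainty.
    """
    kappa = 1.0 / math.sqrt(1.0 + math.pi * var / 8.0)
    return 1.0 / (1.0 + math.exp(-kappa * mu))
```

For example, `expected_sigmoid(2.0, 0.0)` reduces to the plain sigmoid at 2.0, while `expected_sigmoid(2.0, 10.0)` is noticeably closer to 0.5. Extending this kind of moderation from the binary (sigmoid) case to the multi-class softmax is, per the abstract, the paper's contribution.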


