Nonlocal optimization of binary neural networks

04/05/2022
by Amir Khoshaman, et al.

We explore training Binary Neural Networks (BNNs) as a discrete-variable inference problem over a factor graph. We study the behaviour of this conversion in an under-parameterized BNN setting and propose stochastic versions of the Belief Propagation (BP) and Survey Propagation (SP) message-passing algorithms to overcome the intractability of their current formulation. Compared to traditional gradient-based methods for BNNs, our results indicate that both stochastic BP and SP find better configurations of the BNN parameters.
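To make the inference framing concrete, the sketch below runs sum-product belief propagation on a tiny tree-structured factor graph over two binary (±1) weights and checks the resulting marginal against brute-force enumeration. This is an illustrative toy, not the paper's stochastic BP/SP algorithm; the factor values are hypothetical numbers chosen only for the example.

```python
# Sum-product BP on a tiny tree factor graph over binary (+-1) weights.
# Illustrative sketch only -- NOT the authors' stochastic BP/SP method;
# all factor potentials below are made-up example values.
import itertools
import numpy as np

states = np.array([-1.0, 1.0])  # the two binary weight values

# Unary factors f1(w1), f2(w2) and a pairwise coupling factor g(w1, w2).
f1 = np.array([0.4, 0.6])                 # f1[i] = f1(states[i])
f2 = np.array([0.7, 0.3])                 # f2[j] = f2(states[j])
g = np.array([[1.2, 0.5], [0.5, 1.2]])    # g[i, j] = g(states[i], states[j])

# Sum-product messages on the tree w1 -- g -- w2 (BP is exact on trees):
# the factor-to-variable message to w1 marginalizes out w2, weighted by
# the message w2 sends toward g (here just its unary factor f2).
m_g_to_w1 = g @ f2               # sum_j g[i, j] * f2[j]
belief_w1 = f1 * m_g_to_w1       # combine incoming messages at w1
belief_w1 /= belief_w1.sum()     # normalize to a proper marginal

# Brute-force check: enumerate every joint assignment of (w1, w2).
marginal_w1 = np.zeros(2)
for i, j in itertools.product(range(2), range(2)):
    marginal_w1[i] += f1[i] * f2[j] * g[i, j]
marginal_w1 /= marginal_w1.sum()

print(np.allclose(belief_w1, marginal_w1))  # BP matches the exact marginal
```

On loopy graphs, such as those induced by multi-layer BNNs, these message updates are no longer exact and their cost grows quickly, which is the intractability the stochastic BP and SP variants are designed to address.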


Related research:

- Deep learning via message passing algorithms based on belief propagation (10/27/2021)
- A Max-Sum algorithm for training discrete neural networks (05/20/2015)
- Using Artificial Bee Colony Algorithm for MLP Training on Earthquake Time Series Data Prediction (12/20/2011)
- Perturbed Message Passing for Constraint Satisfaction Problems (01/26/2014)
- Modeling and Mitigating Errors in Belief Propagation for Distributed Detection (04/10/2020)
- Belief Propagation Neural Networks (07/01/2020)
- Structured Message Passing (09/26/2013)
