Training Binary Multilayer Neural Networks for Image Classification using Expectation Backpropagation

03/12/2015
by Zhiyong Cheng, et al.

Compared to multilayer neural networks (MNNs) with real-valued weights, Binary Multilayer Neural Networks (BMNNs) can be implemented more efficiently on dedicated hardware. BMNNs have been shown to be effective on binary classification tasks when trained with the Expectation Backpropagation (EBP) algorithm on high-dimensional text datasets. In this paper, we investigate the capability of BMNNs trained with the EBP algorithm on multiclass image classification tasks. The performance of binary neural networks with multiple hidden layers and different numbers of hidden units is examined on MNIST. We also explore the effectiveness of image spatial filters and the dropout technique in BMNNs. Experimental results on the MNIST dataset show that EBP can obtain a 2.12% test error with real weights, which is comparable to the results of the standard backpropagation algorithm on fully connected MNNs.
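The abstract does not spell out the EBP update rules, but the object it studies, a multilayer network whose weights are constrained to +1/-1, can be illustrated with a short forward-pass sketch. The sketch below is a minimal illustration only: the 784-200-10 layer sizes, Gaussian initialization, and deterministic sign binarization are assumptions for demonstration, not the paper's training procedure.

# Minimal sketch of a binary multilayer network forward pass (not the EBP algorithm):
# real-valued weights are binarized to {-1, +1} with the sign function, and hidden
# units use sign activations. Layer sizes (784-200-10) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def binarize(w):
    """Map real-valued weights to binary weights in {-1, +1}."""
    return np.where(w >= 0, 1.0, -1.0)

def forward(x, real_weights):
    """Run one input through the network using the binarized weights."""
    a = x
    for i, w in enumerate(real_weights):
        z = binarize(w) @ a
        # sign activations in hidden layers; the output layer is left linear here
        a = np.sign(z) if i < len(real_weights) - 1 else z
    return a

# Illustrative 784-200-10 network applied to a random "MNIST-like" input vector.
weights = [rng.standard_normal((200, 784)) * 0.05,
           rng.standard_normal((10, 200)) * 0.05]
x = rng.random(784)  # stand-in for a flattened 28x28 image
scores = forward(x, weights)
print("predicted class:", int(np.argmax(scores)))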


